Quality Assurance Guidance Document
Conducting Technical Systems Audits of Ambient Air Monitoring Programs

EPA-454/B-17-004
November 2017

U.S. Environmental Protection Agency
Office of Air Quality Planning and Standards
Air Quality Assessment Division
RTP, NC 27711

Disclaimer
The statements in this document, with the exception of referenced requirements, are intended solely as
guidance. This document is not intended, nor can it be relied upon, to create any rights enforceable by
any party in litigation with the United States. This guidance may be revised without public notice to
reflect changes in EPA's approach to implementing 40 CFR Parts 50, 53, and 58.
Mention of commercial products or trade names should not be interpreted as endorsement. Some types
of instruments currently in use may be described in text or in example figures or tables. Sometimes
these products are given as a typical and perhaps well-known example of the general class of
instruments. Other instruments in the class are available and may be fully acceptable.
Table of Contents
1.0 Introduction	2
1.1	Technical Systems Audit (TSA)	2
1.2	CFR TSA Requirements	2
1.3	Regulation and Guidance	3
1.4	CFR Monitoring Requirements	4
1.5	PQAO Quality Assurance Project Plans (QAPP)	4
1.6	Guidance Documents	5
2.0 Technical Systems Audit Objectives	7
2.1	Scope	7
2.2	Resources	8
2.2.1	Staffing	8
2.2.2	Funding	9
2.2.3	Time	9
2.2.4	Tools and Resources	9
2.2.4.1	Air Quality System (AQS)	10
2.2.4.2	Quarterly Data Reviews	10
2.2.4.3	Field Equipment	11
3.0 Roles and Responsibilities	12
3.1	EPA Regions	12
3.2	Office of Air Quality Planning and Standards (OAQPS)	12
3.3	Contractors	13
4.0 Pre-Audit Activities	15
4.1	EPA Regional Office TSA Scheduling	16
4.1.1 Assigning the Audit Team	16
4.2	TSA Planning	18
4.2.1	Basic Information Gathering (Pre-Planning)	19
4.2.2	Initial Contact with the Monitoring Organization	21
4.2.3	Refining the Audit Strategy	23
4.2.4	Selecting Specific Field Sites to Inspect	25
4.2.5	Finalized Audit Itinerary	28
4.3	Quality System Document Review	30
4.3.1	Document Status	30
4.3.2	Document Content	32
4.3.3	Important Note about QAPP Approvals	34
4.4	Data Review	35
4.4.1	AQS Reports	36
4.4.2	Review Criteria	38
4.4.3	Additional QC Data Requests and Review	41
4.5	Questionnaire Response	42
4.6	Preliminary Findings Summary	42
4.7	Audit Team Communications and Planning	43
4.7.1	Team Meetings Prior to the On-Site TSA	43
4.7.2	Daily Briefings During the On-Site TSA	44
5.0 On-Site Audit Activities	46
5.1	Entrance Briefing	48
5.2	TSA Questionnaire Review	49
5.3	Auditing the Monitoring Organization	50
5.3.1	Monitoring Headquarters	51
5.3.2	Maintenance and Repair Shop	52
5.3.3	Monitoring Data Review and Validation Group	53
5.3.3.1	Tracking Data History	54
5.3.3.2	Data Review Process	55
5.3.4	Quality Assurance Group	56
5.3.5	Analytical Laboratories	58
5.3.5.1	PM2.5/Low-Volume PM10 Gravimetric Laboratory	61
5.3.5.2	Hi-Volume PM10 Gravimetric Laboratory	62
5.3.5.3	Lead (Pb) Analysis Laboratory	64
5.3.5.4	PM2.5 Chemical Speciation Laboratories	65
5.3.5.5	Air Toxics Laboratories	66
5.3.6	AQS Submittal Group	66
5.4	Standards Certification Records Review	67
5.5	Monitoring Site Inspections	68
5.5.1	Inside the Monitoring Station	70
5.5.2	Outside the Monitoring Station	71
5.6 Concluding the Audit	71
5.6.1 Exit Briefing	72
5.6.1.1	Exit Briefing Preparation	72
5.6.1.2	Conducting the Exit Briefing	73
6.0 Report Development and Corrective Action Process	75
6.1	Unresolved Issues and Additional Analyses	75
6.2	TSA Report	77
6.2.1	Report Structure & Elements	78
6.2.1.1	Report Elements	80
6.2.1.2	Ranking TSA Findings	83
6.2.2	Draft Findings Summary (Report)	84
6.2.2.1	EPA Regional Office	85
6.2.2.2	Issuance of Draft Findings Summary	87
6.2.3	Final TSA Report	87
6.3	Audited Organization Corrective Action Plans	89
6.3.1	Auditee Response	89
6.3.2	Auditor Review	89
6.4	Corrective Action Follow-Up	90
6.4.1	Tracking	90
6.4.2	Follow-up Site Visit	92
6.5	TSA Close-Out	92
6.5.1 TSA Project File	93
6.6	TSA Information in AQS	94
7.0 TSA Training	96
8.0 EPA Audit Communication	97
8.1	Regional Management Elevation Points	97
8.2	Inter-Regional Elevation Points	98
8.3	OAQPS Elevation Points	98
Appendices
Appendix A
Technical Systems Audit Questionnaire	100
Appendix B
Example Field Audit Logbook	101
Appendix C
Low-Volume Weighing Laboratory Audit Checklist	102
Appendix D
Laboratory Systems Review: Lead in Air
Audit Checklist	103
Appendix E
Example Technical Systems Audit
Report Template	104
Appendix F
Example Technical Systems Audit Close-Out Letter	105
Figures
Figure 4.1 TSA Pre-Audit Activities Flow Chart	15
Figure 5.1 TSA On-Site Activities Flow Chart	47
Figure 6.1 TSA Report Development and Corrective Action Flow Chart	75
Figure 6.2 Example Corrective Action Tracking Mechanism	95
Tables
Table 5.1 Pollutants and Corresponding Reference Methods and Guidance Documents	59
Table 6.1 Example Findings Categorization	83
Acronyms and Abbreviations
APTI	Air Pollution Training Institute
AMNP	Annual Monitoring Network Plan
AMTIC	Ambient Monitoring Technical Information Center
AQI	Air Quality Index
AQS	Air Quality System
ASTM	American Society for Testing and Materials
CFR	Code of Federal Regulations
CL	confidence limit
CBSA	core-based statistical area
COC	chain of custody
CSN	PM2.5 Chemical Speciation Network
CV	coefficient of variation
CY	calendar year
DAS	data acquisition system
DASC	Data Assessment Statistical Calculator
DQA	data quality assessment
DQI	data quality indicators
DQOs	data quality objectives
EDO	environmental data operation
EPA	Environmental Protection Agency
FEM	federal equivalent method
FR	flow rate
FRM	federal reference method
FY	fiscal year
GC/MS	gas chromatography mass spectrometry
HPLC	high performance liquid chromatography
ICP	inductively coupled plasma
IMPROVE	Interagency Monitoring of Protected Visual Environments
ISA	internal systems audit
IT	information technology
LDL	lower detectable limit
LIMS	laboratory information management systems
MDL	method detection limit
MFC	mass flow control
MQOs	measurement quality objectives
MSA	Metropolitan Statistical Area
NAAQS	National Ambient Air Quality Standards
NATTS	National Air Toxics Trends Sites
NCore	National Core Network
NIST	National Institute of Standards and Technology
NPAP	National Performance Audit Program
NPEP	National Performance Evaluation Program
OAQPS	Office of Air Quality Planning and Standards
ORD	Office of Research and Development
PAMS	Photochemical Assessment Monitoring Stations
Pb	lead
PE	performance evaluation
PEP	Performance Evaluation Program
ppb	parts per billion
ppm	parts per million
PQAO	primary quality assurance organization
PT	proficiency test
QA	quality assurance
QA/QC	quality assurance/quality control
QAGD	Quality Assurance Guidance Document
QAM	quality assurance manager
QAO	quality assurance officer
QAPP	quality assurance project plan
QMP	quality management plan
RO	Regional Office
SLAMS	state and local monitoring stations
SOP	standard operating procedure
SPMS	special purpose monitoring stations
SRP	standard reference photometer
TAD	technical assistance document
TEOM	tapered element oscillating microbalance
TSA	technical systems audit
TSP	total suspended particulate
VOC	volatile organic compound

Acknowledgements
This document is the result of the collaboration of the Technical Systems Audit Workgroup. The
Technical Systems Audit Workgroup is a group of quality assurance and air monitoring technical experts
from across all EPA Regions and EPA Headquarters who are involved in Technical Systems Audits for
ambient air monitoring organizations that monitor for comparison to the National Ambient Air Quality
Standards. The goal of the workgroup is to provide guidance, technical expertise, and tools to aid in the
technical systems audit process. The members of the workgroup at the time of development of this
document are as follows:
Gregory Noah, US EPA OAQPS
Michael Papp, US EPA OAQPS
Robert Judge, US EPA Region 1
Alysha Thompson, US EPA Region 1
Catherine Taylor, US EPA Region 1
Chris St. Germain, US EPA Region 1
Mustafa Mustafa, US EPA Region 2
Elizabeth Gaige, US EPA Region 3
Kia Hence, US EPA Region 3
Loretta Hyden, US EPA Region 3
Verena Joerger, US EPA Region 3
Laura Ackerman, US EPA Region 4
Stephanie McCarthy, US EPA Region 4
Richard Guillot, US EPA Region 4
Keith Harris, US EPA Region 4
Anthony Bedel, US EPA Region 4
Adam Zachary, US EPA Region 4
Scott Hamilton, US EPA Region 5
Bilal Qazzaz, US EPA Region 5
Chad McEvoy, US EPA Region 5
Kara Allen, US EPA Region 6
Trisha Curran, US EPA Region 6
James Regehr, US EPA Region 7
Leland Grooms, US EPA Region 7
Richard Payton, US EPA Region 8
Albion Carlson, US EPA Region 8
Joshua Rickard, US EPA Region 8
Ethan Brown, US EPA Region 8
Gwen Yoshimura, US EPA Region 9
Mathew Plate, US EPA Region 9
Randall Chang, US EPA Region 9
Michael Flagg, US EPA Region 9
Jennifer Williams, US EPA Region 9
Chris Hall, US EPA Region 10
Doug Jager, US EPA Region 10
Names that appear in bold are the primary contacts for the TSA workgroup in each EPA Region.
Preface
Intent of Document
The intent of this document is to provide guidance to assist auditors in implementing the Technical
Systems Audit (TSA) requirements and to provide guidance and tools to aid in conducting TSAs of
ambient air monitoring programs. While this document is geared primarily for federal auditors
conducting TSAs of air monitoring programs required by CFR, the principles and tools provided in the
document could be a framework for an auditor performing a TSA of any ambient air monitoring
network. This document is intended to present "best practices" that, if implemented and followed,
would result in an accurate and complete assessment of a monitoring organization's ambient air
monitoring program.
This document makes use of internet links that provide the user with access to more detailed
information on a particular subject. Due to the limitations of the Adobe PDF format, full URL addresses
must be provided for the links to function. Web links (URLs) to references are included as footnotes for the
reader to follow for additional information.
Document Review and Distribution
The information in this guidance document was developed by the members of the TSA Workgroup,
representing EPA Headquarters and the ten EPA Regional Offices, and has been subjected to a peer
review by the workgroup. This TSA Quality Assurance Guidance Document (QAGD) has been signed and
distributed by OAQPS Quality Assurance (QA) staff to promote consistency across EPA in understanding
and conducting TSAs. This guidance document may be viewed on the Internet and downloaded from the
EPA AMTIC Homepage1.
Recommendations for improvement are welcome, and comments should be directed to the Regional
TSA Workgroup contact identified in bold in the Acknowledgements section. The TSA QAGD will be
reviewed at least every 5 years by the TSA Workgroup and revised as needed. The TSA QAGD may
require more frequent revision following significant rule changes and/or to keep pace with technological
advances in monitoring methodology. Appendices that contain checklists or other audit tools may also
require more frequent updates.
1 https://www.epa.gov/amtic/amtic-quality-assurance

1.0 Introduction
1.1	Technical Systems Audit (TSA)
A TSA is an on-site review and inspection of a monitoring organization's ambient air monitoring program
to assess its compliance with established regulations and guidance governing the collection, analysis,
validation, and reporting of ambient air quality data. A TSA is also an opportunity to highlight areas
where a monitoring organization has shown innovation and improvement, to identify areas where
programs can be strengthened, and to provide feedback to the organization.
An effective TSA includes many components, such as document reviews, data reviews, interviews,
observations, and site evaluations. Performance audits and other activities may also be included in a
TSA, but the aforementioned components comprise the core of the TSA. Selecting audit staff with
pertinent expertise, clearly defining roles and responsibilities, and providing sufficient resources,
including both time and funding, are elements of the TSA process that promote the goal of completing
high quality TSAs. This guidance document provides best practices to aid in conducting TSAs to assess
the suitability and effectiveness of an ambient air monitoring quality system, to assess the quality of the
data collected, and to identify strengths and areas for improvement within each program.
1.2	CFR TSA Requirements
TSAs are required under 40 Code of Federal Regulations (CFR) Part 58, Appendix A, § 2.5. Also, EPA
Order CIO 2105.02 establishes policy and program requirements for an Agency-wide Quality System that
include assessments to determine if data are "of sufficient quantity and adequate quality for their
intended use."
More specifically, 40 CFR Part 58, Appendix A, § 2.5 states the following:
Technical systems audits of each PQAO shall be conducted at least every 3 years by the
appropriate EPA Regional Office and reported to the AQS. If a PQAO is made up of more than
one monitoring organization, all monitoring organizations in the PQAO should be audited within
6 years (two TSA cycles of the PQAO). As an example, if a state has five local monitoring
organizations that are consolidated under one PQAO, all five local monitoring organizations
should receive a technical systems audit within a 6-year period. Systems audit programs are
described in reference 10 of this appendix.
Reference 10 of the quote is the QA Handbook for Air Pollution Measurement Systems, Volume II
Ambient Air Quality Monitoring Program (EPA-454/B-17-001, January 2017)3, commonly known as the
QA Handbook or "Redbook".
2	https://www.epa.gov/sites/production/files/2013-10/documents/21050.pdf
3	https://www3.epa.gov/ttn/amtic/files/ambient/pm25/qa/Final%20Handbook%20Document%201 17.pdf
To summarize 40 CFR Part 58, Appendix A, § 2.5, each EPA Regional Office must conduct a TSA of each
PQAO within its region every three years, and each monitoring organization within each PQAO should
be audited every six years. While the regulation does not require audits of the monitoring organizations
within the PQAOs, it is strongly recommended that EPA conduct a TSA of each monitoring organization
within a PQAO every six years.
Note that there is no language limiting an EPA Regional Office from auditing more frequently. An EPA
Regional Office may perform a TSA of any federally-funded air monitoring organization at any time.
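Because the audit cycles are fixed, tracking them is simple arithmetic. The following Python sketch illustrates how a Regional Office might flag audits coming due under the 3-year and 6-year cycles; the organization names, dates, and function names are hypothetical examples, not part of any EPA system.

```python
from datetime import date

# Hypothetical records of the most recent completed TSA for each PQAO and for
# each monitoring organization operating under it.
pqao_last_tsa = {"State PQAO A": date(2014, 6, 1)}
org_last_tsa = {
    "State PQAO A": {
        "Local Agency 1": date(2011, 5, 1),
        "Local Agency 2": date(2016, 9, 1),
    }
}

def years_since(then, today):
    """Approximate elapsed years between two dates."""
    return (today - then).days / 365.25

def tsas_due(today):
    """List PQAOs past the required 3-year cycle and monitoring organizations
    past the recommended 6-year cycle (40 CFR Part 58, Appendix A, section 2.5)."""
    due = []
    for pqao, last in pqao_last_tsa.items():
        if years_since(last, today) >= 3:
            due.append(f"{pqao}: PQAO TSA required")
        for org, org_last in org_last_tsa.get(pqao, {}).items():
            if years_since(org_last, today) >= 6:
                due.append(f"{pqao} / {org}: monitoring organization TSA recommended")
    return due

print("\n".join(tsas_due(date(2017, 11, 1))))
```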
1.3 Regulation and Guidance
When conducting a TSA, the auditors must have a firm understanding of ambient air monitoring, quality
systems, and what direction is regulatory and what is considered guidance. Regulation may be defined
as language specifically written in CFR, whereas guidance is language written in guidance documents or
technical directives. Regulation may contain several terms which define how a regulation is to be
followed. To aid in this distinction, the following terms are used consistently throughout CFR and the
quality assurance guidance documents:
shall, must - when the element is a requirement in 40 CFR or the Clean Air Act.

should - when the element is recommended. This term is used when extensive experience in
monitoring provides a recommended procedure that would help establish or improve the quality of
data or a procedure. The process that includes the term is not required, but if not followed, an
alternate procedure should be developed that meets the intent of the guidance.

may - when the element is optional or discretionary. The term also indicates that what is suggested
may improve data quality and is important to consider, but it is not as important as those elements
suggested using the term "should".
In summary, CFR language that contains "shall" or "must" is a requirement that must be followed. CFR
language that uses "should" or "may" conveys recommendations that are encouraged to help ensure the
quality of the data, but they are not requirements. CFR citations using the term "should" are not
requirements, but the organization is responsible for implementing a procedure that meets the intent of
the citation. By definition, language in guidance documents is guidance and not a requirement.
However, guidance documents are companion documents to the CFR and provide the best practices for
implementation of a program.
Examples of CFR language and interpretation:
40 CFR Part 50, Appendix L, § 10.14 - "The exposed filter containing the PM2.5 sample shall be re-
conditioned in the conditioning environment in accordance with the requirements specified in
section 8.2 of this appendix."
This citation from CFR uses the term "shall" and is an example of a requirement that must be
followed.
40 CFR Part 58, Appendix E, § 3 - "Particulate matter sites should not be located in an unpaved
area unless there is vegetative ground cover year round, so that the impact of windblown dusts
will be kept to a minimum."

This citation from CFR uses the term "should" which emphasizes that it is recommended to
ensure data quality; however, it is not a requirement.
1.4 CFR Monitoring Requirements
CFR requirements provide the backbone for all ambient air monitoring networks collecting data for
comparison to the NAAQS. The specific requirement for conducting TSAs is located in 40 CFR Part 58,
Appendix A, § 2.5. 40 CFR Parts 50, 53, and 58 provide the formal regulatory requirements for
implementing and maintaining ambient air monitoring programs and the related quality assurance
requirements that shall be followed.
A summary of the content in 40 CFR Parts 50, 53, and 58 is below.
40 CFR Part 50 - Contains the primary and secondary National Ambient Air Quality Standards
(NAAQS), Reference Methods for the NAAQS pollutants, and interpretations of the NAAQS.
40 CFR Part 53 - Contains general provisions, testing procedures, comparability determinations,
performance testing, and physical design characteristics for reference and equivalent ambient
air monitoring methods. Much of Part 53 contains manufacturing specifications for vendors;
however, configuration specifications for ambient monitoring are also included. Monitoring
organizations must ensure that the configuration for each analyzer meets the requirements of
Part 53.
40 CFR Part 58 - Contains general provisions, monitoring network requirements, special
purpose provisions, NAAQS comparability, data reporting, federal monitoring, and quality
assurance for ambient air quality monitoring.
Direction in 40 CFR Parts 50, 53, and 58 must be followed by the monitoring organizations. All audit staff
members should be well versed in these CFR requirements before conducting a TSA. The CFR references
listed above encompass the majority of the requirements for ambient air monitoring; however, other
relevant CFR references may be applicable. The CFR requirements can be viewed and downloaded from
the Internet4.
4	www.ecfr.gov
1.5 PQAO Quality Assurance Project Plans (QAPP)
Each PQAO conducting ambient air monitoring for NAAQS compliance must develop a QAPP for each of
its ambient air monitoring programs. The QAPP is an essential and required component of the quality
system and is defined in 40 CFR Part 58, Appendix A, § 2.1.2. The section states:
The QAPP is a formal document describing, in sufficient detail, the quality system that must be
implemented to ensure that the results of work performed will satisfy the stated objectives.
PQAOs must develop QAPPs that describe how the organization intends to control measurement
uncertainty to an appropriate level in order to achieve the data quality objectives for the EDO
(Environmental Data Operations). The quality assurance policy of the EPA requires every EDO to
have a written and approved QAPP prior to the start of the EDO. It is the responsibility of the
PQAO/monitoring organization to adhere to this policy. The QAPP must be suitably documented
in accordance with EPA requirements (reference 3 of this appendix) and include standard
operating procedures for all EDOs either within the document or by appropriate reference. The
QAPP must identify each PQAO operating monitors under the QAPP as well as generally identify
the sites and monitors to which it is applicable either within the document or by appropriate
reference. The QAPP submission and approval dates must be reported to AQS either by the
monitoring organization or the EPA Region.
The QAPPs must illustrate how a monitoring organization will meet all regulatory requirements
described in 40 CFR Parts 50, 53 and 58. In addition, the monitoring organization should attempt to
conform to the recommendations in the QA Handbook unless an alternative is proposed that produces
data of acceptable quality (as described in the regulation and the QA Handbook). A QAPP may include
requirements that are more stringent than regulatory requirements, but it cannot be less stringent. Each
PQAO is responsible for developing the QAPPs and submitting them to an approving authority, which is
typically the EPA RO. QAPPs should also be updated annually, but minimally every five years, to reflect
regulatory and/or organizational changes within the monitoring organization, and approval dates must
be reported to AQS.
The QAPP is a written commitment between the approving authority and the monitoring organization
that the plan will be implemented and followed in the monitoring organization. As such, the EPA QA
staff must review the QAPPs during the TSA to determine if the monitoring organization has adhered to
these commitments.
1.6 Guidance Documents
Guidance documents present best practices for methods, interpretation of regulatory language, data
quality and defensibility recommendations, or other information for implementing or clarifying CFR. It is
important to note that guidance does not overrule CFR requirements. Guidance documents should
accurately reflect CFR requirements; however, it is good practice to verify guidance against CFR. If
discrepancies are found between the two, the auditor must always follow CFR direction. Guidance
documents may be written in the following forms including:
•	Quality Assurance Guidance Documents
•	Technical Assistance Documents
•	Technical Memoranda
•	Newsletters
The QA Handbook for Air Pollution Measurement Systems, Volume II is a guidance document designed to
ensure that the Ambient Air Quality Surveillance Program: (1) provides data of sufficient quality to meet
the program's objectives, and (2) is implemented consistently across the nation. The QA Handbook is a
one-stop source of quality assurance/control, network, sampling, and data handling information. The QA
Handbook also contains a list of other key guidance documents located within its Appendix B (Ambient

Air Monitoring Quality Assurance Information and Web Addresses). These guidance documents provide
additional detail and summarize CFR requirements. TSA staff should be familiar with these documents
before conducting a TSA.
The CFR is routinely revised over time as regulations are changed or sections are updated. In some
cases, the CFR may change but related guidance documents may be delayed or not updated at all.
Auditors must exercise caution when reviewing CFR or guidance documents to ensure the correct
revisions are being referenced.

2.0 Technical Systems Audit Objectives
The primary objective of a TSA is to assess the suitability and effectiveness of the quality system being
implemented on behalf of EPA. For data collected for comparison to a NAAQS, the assessment of the
quality system includes investigating whether such data meet the established regulations to be used for
decision-making purposes. Decision makers require data that is of adequate quality, is defensible, and
that can stand up to public and legal scrutiny. TSAs are also an opportunity for the monitoring
organization and EPA to collaborate and address findings and vulnerabilities in the organization's quality
system and data. Cooperation between the monitoring organizations and EPA is an essential component
in conducting a successful audit. This document provides guidance on how to conduct the TSA and
includes tools for an auditor to use to assess various areas of the ambient air monitoring program.
Because of limited time and resources, it is not reasonable to expect that a TSA will uncover and address
every data quality issue, quality system weakness, or programmatic issue; however, a properly
conducted TSA will provide a thorough overview of the data quality of the network and a gauge of the
effectiveness of the quality system. If a TSA uncovers a weakness that results in the flagging or
invalidation of significant amounts of data, the quality system is compromised. For example, suppose a
large amount of PM2.5 sampling data is invalidated because method criteria were not followed. The quality
system should document a data review process that ensures method criteria are met as well as
provisions for training staff. If staff do not understand the method, and data review does not identify
the problem, then there are significant weaknesses in the quality system. The TSA should identify these
weaknesses and trigger a corrective action process to strengthen the organization's quality system,
thereby bolstering the quality and defensibility of the data.
TSAs are required to be completed every three years for PQAOs and are recommended every six years
for monitoring organizations within the PQAO; however, these audits are only one component of a
much larger quality assurance system. Monitoring organizations, EPA Regional Offices, and EPA
Headquarters (OAQPS) should be assessing the monitoring data against the regulations and methods on
a routine basis to ensure data of sufficient quality are collected for decision-making purposes. A TSA by itself
every three or six years is not enough to ensure data quality.
2.1 Scope
A TSA should focus on the broad picture of the monitoring organization as a whole. While there may be
many elements of a TSA, there should be a balance in assessing the quality of the monitoring data
collected and reported by an organization. Monitoring organizations may have the following common
functional areas that should be investigated during a TSA:
•	Monitoring Headquarters
•	Maintenance and Repair Shop
•	Monitoring Data Review and Validation Group
•	Quality Assurance Group

•	Analytical Laboratories
•	AQS Submittal Group
•	Ambient Air Monitoring Sites
The areas listed above comprise the core of an ambient air monitoring network that should be included
in a TSA. If too much time or resources are devoted to one pollutant or area of the organization, other
important areas in the monitoring program may not be assessed properly, giving an incomplete picture
of the quality of an organization's data. An incomplete TSA has the potential to leave an organization's
monitoring data vulnerable. There may be situations where more emphasis in some areas may be
needed to address current priorities; however, this should not be used as a rationale to omit other parts
of the monitoring organization. A TSA is not solely a data review, document review, or a series of
performance evaluations; rather, it is a conglomerate of activities that should verify that an organization
has met its overall quality system requirements, as dictated by CFR and the organization's QA
documents (QMP, QAPP, SOPs), and has produced, and is producing, quality data.
2.2 Resources
TSAs are resource-intensive. Monitoring organizations differ in size and complexity, and each presents
unique challenges for auditors. Various types of resources are required to conduct an effective TSA
regardless of complexity. A small local agency may require only a few months to prepare, audit, report
and follow up on corrective action, while a large organization may require much longer to complete.
An important point when discussing resources and TSAs is that the audit is not complete when a final
report is written. Follow-up and corrective action for an organization can take significantly more
resources to complete than the preparatory work, on-site activities, and report generation. If resources
are omitted for the follow-up, it is realistic to assume that corrective action within the organization may
show minimal progress and the same findings will appear in the next audit with no significant progress.
For this reason, the follow-up could arguably be the most important component of the audit process,
and it should not be overlooked or understated.
The TSA may be the only time the auditors visit the physical location of the monitoring
organization and have the opportunity to observe the actual practices that occur. For this reason, it is
important to ensure that the auditors have every resource available on hand. Items listed and described
in the following sections are required when conducting a TSA. The appendices of this document also
contain example checklists and templates that may assist the audit team.
2.2.1 Staffing
Trained staff is a necessity to conduct an effective TSA. Staff participating in TSAs should have applicable
program knowledge and technical knowledge of ambient air monitoring programs. For each TSA, it is
highly recommended to assemble a team of auditors that, as a group, has the knowledge and skills
needed to address the different areas of the organization and the monitoring network. Too few staff, or
staff without the appropriate expertise, can result in an audit where data quality issues are overlooked.
More information regarding staffing assignments and requirements for an audit is detailed in Section 4.2
of this QAGD.

2.2.2	Funding
Funding is a key resource in conducting complete and effective TSAs. NAAQS monitoring programs
within a PQAO that are operated for compliance purposes are required to be audited by 40 CFR Part 58
Appendix A. While TSAs primarily focus on these NAAQS programs, PQAOs generally have other non-
regulatory monitoring programs, such as air toxics or PM2.5 Chemical Speciation, that should be included
in the TSA. The EPA Regional Office should provide funding to cover all NAAQS programs, at a minimum,
and all non-regulatory programs should be included in the TSA if possible. Regional Offices should
consider the size and scope of the organization when allocating funding for TSAs. Inadequate funding
and resources can result in an audit that is conducted at a level where issues and trends within a PQAO
are not identified, leading to potentially large issues at the organization in the future. It is important to
have an overview of the PQAOs and the scope of the audit to be conducted prior to allocating funds for
the TSAs.
Very simply, if adequate funding is not available to provide equipment and transportation and to keep the
auditors on site for as long as needed to audit the monitoring organization, the quality of the TSA
will suffer and important components of the audit may be rushed or omitted, resulting in a poor or
incomplete audit. Costs to consider may include:
•	Hotel reservations
•	Per diem
•	Airline flights
•	Rental cars
•	Fuel
•	Cameras
•	Computers
•	Field equipment (range finders, inclinometers, measuring tapes, etc.)
2.2.3	Time
Time is a critical resource that must be considered for each audit independently. No two monitoring
organizations are alike; therefore, each organization will require a different amount of time to complete
an audit. When planning, it is essential to consider the size and scope of the organization that is being
audited in order to determine what is required.
It is also very important to understand that a TSA encompasses much more than the on-site audit itself.
Weeks, up to months, may be devoted to the review of an organization and its monitoring data prior to
arriving on-site. By the same token, upon completion of the on-site visit, more time will be required to
review the audit notes, make decisions, and write a report. Finally, follow-up on corrective action can be
difficult to project and can range from a couple months to over a year depending on the audit findings. A
TSA is a much lengthier process than the week of travel on a calendar might suggest.
2.2.4	Tools and Resources
A variety of tools and resources are required to complete a TSA. AQS monitoring data, quarterly data
reviews, and field equipment should be available to the auditors to assist in preparing for and

conducting the TSAs. Each audit will require all or many of these tools to complete the various
components of the audit. Without these tools and resources, the auditors will not be able to accurately
assess all components of the monitoring organization. Some of the most common tools and resources
that should be available to the audit team are described below, but depending on the monitoring
organization, other resources may be required.
2.2.4.1	Air Quality System (AQS)
AQS is the electronic database maintained by OAQPS as a repository for all ambient air monitoring
network metadata and monitoring data submitted by monitoring organizations. The data housed in AQS
includes:
•	Monitoring data
•	Site metadata
•	Network metadata
•	QA/QC data
•	Data submission histories
AQS is designed to receive uploaded transactions, organize data, and store information for various
internal and external data assessments. AQS provides standard reports for extracting a variety of
information, from raw data to quality assurance summaries. These reports can help the auditor organize
and sort data efficiently in preparation for an audit. Several pertinent AQS standard reports that are
helpful in preparing for a TSA are listed in Section 4.4.1 of this QAGD.
Raw data extracted from AQS can also be used to generate valuable assessments of data
for external analysis beyond the standard AQS reports. Data handling programs, such as those
available on AMTIC5, can be used to create reports to compile and filter data that AQS currently cannot
report. The AQS reports and assessments of raw data should help highlight areas such as data
completeness, timeliness of data submission, and precision and accuracy acceptability when preparing
for TSAs. Furthermore, routine data assessment and analysis is essential to limit large amounts of data
loss from quality issues.
5	https://www3.epa.gov/ttn/amtic/qareport.html
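As a concrete illustration of such an external analysis, the Python sketch below computes quarterly data completeness from a raw-data extract. The file name and column layout are assumptions made for this example and do not reflect an actual AQS export format; verify the completeness criterion for each pollutant.

```python
import csv
from collections import defaultdict

def quarterly_completeness(path):
    """Tally the percentage of reported hours with valid values, by monitor
    and calendar quarter. Assumed columns: 'monitor_id', 'date' (YYYY-MM-DD),
    and 'value' (blank when the hour is missing or invalidated)."""
    valid = defaultdict(int)
    total = defaultdict(int)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            quarter = (int(row["date"][5:7]) - 1) // 3 + 1
            key = (row["monitor_id"], row["date"][:4], quarter)
            total[key] += 1
            if row["value"].strip():
                valid[key] += 1
    return {k: 100.0 * valid[k] / total[k] for k in total}

# Flag quarters below 75 percent, a common completeness benchmark for
# several NAAQS pollutants (confirm the criterion in the applicable method).
for (mon, year, qtr), pct in sorted(quarterly_completeness("raw_data.csv").items()):
    note = "" if pct >= 75.0 else "  <-- review completeness"
    print(f"{mon} {year} Q{qtr}: {pct:.1f}%{note}")
```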
2.2.4.2	Quarterly Data Reviews
According to Section 2.4.3 of the FY16-17 OAR National Program Manager Guidance6, EPA Regional
Office staff should already be reviewing AQS data quarterly. Auditors should have these reviews
available as a resource for the TSA. These reports are a snapshot of the organization's monitoring
network and can identify problems which direct the focus of the TSA. Quarterly reviews can also limit
the amount of data that could be impacted by quality issues and can help initiate corrective action
before a TSA. As a warning, quarterly reviews will only include extracts and reports from data that
resides in AQS, so many checks and/or datasets that are not required to be submitted to AQS will not be
accessible for review. As a practice, it is unacceptable for data review to only occur every three years as
a part of the TSA. Data review should be completed on a much more frequent basis.
2.2.4.3 Field Equipment
Auditing ambient air monitoring sites is a major component of any TSA, and it is important to have field
equipment on-hand and properly maintained to support these activities. Assessing monitoring sites may
require specialized measuring devices such as laser range finders and inclinometers for measuring
distances and heights of obstructions. These devices, and others like them, require routine service to
function properly and should be maintained so that they are ready for use at all times. Although not a
core activity of the TSA, performance audits may be conducted in conjunction with the TSA. The
equipment for these audits is more expensive, requires a high level of expertise and training, and must
be operated according to set procedures and regulatory requirements.
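For illustration, the sketch below shows the trigonometry an auditor might apply to a range-finder distance and an inclinometer angle, together with a check against the general 2:1 distance-to-protrusion siting criterion in 40 CFR Part 58, Appendix E. The function and example values are hypothetical; always confirm the applicable criteria in the regulatory text for each pollutant.

```python
import math

def protrusion_above_probe(distance_m, angle_deg, instrument_h_m, probe_h_m):
    """Estimate how far an obstacle's top protrudes above the probe inlet,
    from a range-finder distance and an inclinometer angle sighted to the
    obstacle's top (all heights in meters above ground level)."""
    obstacle_top = instrument_h_m + distance_m * math.tan(math.radians(angle_deg))
    return obstacle_top - probe_h_m

# Example: obstacle 30 m away, sighted at 20 degrees from an instrument held
# 1.5 m above ground, with the probe inlet 4 m above ground.
p = protrusion_above_probe(30.0, 20.0, 1.5, 4.0)   # about 8.4 m
# Appendix E generally calls for the distance to an obstacle to be at least
# twice the height the obstacle protrudes above the probe.
print(f"Protrusion above probe: {p:.1f} m")
print("Meets 2:1 distance-to-protrusion criterion:", p <= 0 or 30.0 >= 2.0 * p)
```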

3.0 Roles and Responsibilities
3.1	EPA Regions
The EPA Regions bear the majority of the TSA responsibilities and must have trained staff, resources, and
expertise to conduct TSAs effectively and on schedule. Furthermore, the EPA Regional Offices are
responsible for the oversight of all TSA corrective action activities and should have the resources to
follow up with the monitoring organizations to complete these actions. In the event that significant
findings arise in a TSA, the EPA Regions are also responsible for communicating these findings to
OAQPS to determine what effect the issues will have on designation decisions. More information about
elevating audit findings can be found in Section 8 of this QAGD.
As a whole, EPA Regional Office responsibilities for TSAs generally will include:
•	Identifying PQAOs and monitoring organizations within the region
•	Setting a TSA schedule for the PQAOs and monitoring organizations within a PQAO
•	Conducting the TSAs
•	Completing and reviewing the TSA reports
•	Following up on corrective action
•	Submitting TSA audit and close-out dates to AQS
•	Informing OAQPS when audit findings could affect a significant amount of data
Ideally, and as staffing allows, the audit staff who conduct TSAs will be independent from the
program/monitoring staff who are responsible for the NAAQS regulatory determinations. The EPA QA
staff should have extensive expertise in both quality assurance and ambient air monitoring. To augment
the audit team, the QA staff may request that other groups within EPA participate and play a role in
the audit, such as database experts, programmers, or other technical experts. These groups have the
responsibility to answer questions and support the QA staff as requested.
The EPA QA staff will also have the primary role of ensuring that the corrective actions required to
resolve issues found in a TSA are on track and are sufficient to address the audit findings. In
coordination with the QA staff, program and grants staff from the Regional Office may also have a role
in ensuring that corrective action is completed. Corrective action may require a concerted effort in some
cases to effect change within an organization, and these different groups together may be more effective
towards that end.
3.2	Office of Air Quality Planning and Standards (OAQPS)
OAQPS's role in the TSA process is to support the EPA Quality System by developing quality assurance
policy in the CFR, promoting consistency by developing guidance, aiding in implementing the quality
assurance policy, and acting as an arbiter for the EPA Regions in resolving issues related to data quality.
OAQPS provides the backstop for the EPA Regions in supporting decisions made regionally, but OAQPS
also ensures that a message and/or decision is consistently communicated to all regions. The EPA
Regions have the responsibility to communicate any issues discovered in TSAs that may have an impact

on NAAQS design values to OAQPS, and OAQPS has the responsibility to communicate these issues to
the QA and policy staff to determine the impact on regulatory decision-making.
Consistency is an important priority in OAQPS's role in the TSA process. It is OAQPS's goal that all
monitoring organizations are treated equally and fairly during a TSA. OAQPS provides direction in the
form of guidance and interpretation of CFR to help ensure that regional audit staff are using the same
guidance to make decisions. Consistency in invalidating data, flagging data, or recognizing improvements
or achievements in an organization is an important and essential component in conducting an effective
and fair audit program.
In accordance with 40 CFR Part 58, Appendix A, § 1.2.3, EPA reserves the authority to "use or
not use" monitoring data when making regulatory decisions based on the assessment of data quality.
Therefore, issues that are uncovered during TSAs that have the potential to impact NAAQS attainment
or other important decisions should be discussed between OAQPS and the EPA Regions so that decisions
are consistent across the nation.
3.3 Contractors
Contractors may be used in conducting TSAs according to the EPA Quality Manual for Environmental
Programs (CIO 2105-P-01.0). Section 2.2.2.7 of the Quality Manual discusses discretionary functions
that may be performed by non-EPA personnel. The citation is below:
Discretionary functions that may be performed by either EPA personnel or non-EPA personnel
include:
Performing technical assessments of environmental data producing activities, both intramural
and extramural (on-site and off-site) according to a specific plan approved by the QAM.
Preparations for such assessments may include the acquisition or development of audit materials
and standards. Results (findings) are summarized, substantiated, and presented to the QAM or
authorized EPA representative.
A determination of whether an authorized Agency representative should accompany a
contractor's personnel should be made on a case-by-case basis only after coordination between
the responsible organization and contracting officer. Such coordination should include
consideration of the purpose of the accompaniment and clear definition of the Agency
representative's role and responsibility during the contractor's performance of the audit or
technical assessment to avoid the appearance of a personal services relationship.
First and foremost, if a contractor is used for a TSA, there must be a specific plan that is approved by the
Quality Assurance Manager. Next, the manual indicates that "A determination of whether an authorized
Agency representative should accompany a contractor's personnel should be made on a case-by-case
basis only after coordination between the responsible organization and contracting officer." In all cases
where contractors are used to assist in completing TSAs for NAAQS compliance, the contractor must be
under the guidance of the EPA audit lead.

EPA Regions are expected to have the expertise and resources in-house to conduct TSAs without the
need for a contractor. However, contractors may be utilized to add a level of expertise that is missing
from an EPA audit team, but they should not be used to lead the audit. If contractors are added to an
audit team, the region retaining the contractors must ensure, prior to the audit, that there are no
conflicts of interest. When a region uses a contractor to supply expertise that is missing within the EPA
audit team, EPA staff should shadow the contractor during the audit to gain that expertise so that they
may be able to perform the function independently in the future.

4.0 Pre-Audit Activities
EPA Regional Office staff will begin preparing for the on-site TSA three or more months in advance of
the on-site visit. Preparations include such activities as scheduling the TSA, assigning staff members to
participate in the TSA, and reaching out to the monitoring organization to confirm the TSA schedule and
audit plans. Beyond these logistics, the audit team will spend many hours reviewing the
monitoring organization's network configuration, programs, and quality system documentation, in
addition to reviewing and analyzing the organization's data in AQS. This section describes in detail the
pre-audit activities of the TSA process. The following flow chart (Figure 4.1) provides a visual summary of
these pre-audit steps and the timelines that the audit team should follow, which will be discussed in
more detail throughout this section.
[Figure 4.1 TSA Pre-Audit Activities Flow Chart. Steps, beginning approximately 90 days prior to the
audit: Assign Audit Team; Determine Time Required for Scope of the Audit; Initial Contact with the
Monitoring Organization; Determine Which Programs and Facilities will be Audited; Develop Audit
Strategy; Determine Which Field Sites to Inspect; Distribute Audit Itinerary; Quality System Document
Review; Data Review; TSA Questionnaire Receipt and Review; Prepare Preliminary Findings (1 month
prior to audit).]
Figure 4.1 TSA Pre-Audit Activities Flow Chart

4.1 EPA Regional Office TSA Scheduling
As previously discussed in Section 1.2 of this QAGD, 40 CFR Part 58, Appendix A, § 2.5 requires TSAs of
each PQAO to be conducted by the appropriate EPA Regional Office every three years. The
monitoring organizations operating under the oversight of a larger PQAO should be audited every 6
years. Each EPA Regional Office has multiple PQAOs within its jurisdiction. However, the number of
PQAOs, and subsequently, TSAs, varies by EPA Region; larger EPA Regions typically have more TSAs to
conduct over the course of the 3-year or 6-year period. With that in mind, each EPA Regional Office will
have a unique process regarding how it plans, prioritizes, schedules, and tracks the multiple TSAs it is
required to perform. Each EPA Regional Office is encouraged to develop an SOP that describes and
formalizes this process.
For planning purposes, it is suggested that each EPA Regional Office prepare a schedule at the beginning
of each fiscal year (FY), identifying those monitoring organizations requiring a TSA during the FY, and
establishing the tentative dates/time frames as to when these audits may occur. For example, the
schedule may specify the month of each audit, but leave the specific dates open for discussion and
coordination between the Regional Office and the monitoring organization. Another approach may be
that a specific week is tentatively selected for each TSA. Regardless of the scheduling method, it is
recommended that the EPA Regional Office share the tentative FY schedule with the monitoring
organizations as soon as it is formulated, so that all PQAOs have been given notice of their upcoming
TSAs.
4.1.1 Assigning the Audit Team
When the yearly EPA Regional Office TSA schedule is formulated, it is recommended that one
QA staff member be assigned the role of lead auditor (project lead) for each TSA. The TSA lead auditor
will be responsible for leading and managing the specific TSA (project) throughout its duration, from the
initial, pre-audit planning stages through close-out of the TSA after all corrective action measures have
been completed. As a best practice, the EPA Regional Office should rotate the assignments of audit lead,
so that the same monitoring organization does not have the same lead auditor multiple TSAs in a row.
The rotation of lead is recommended because it eliminates biases, as well as provides the opportunity
for a "new set of eyes" to review the organization's quality system. Similarly, the practice of lead
rotation is a way of building the skill set and experience of the EPA RO QA staff - because each team
member, over time, will be afforded the opportunity to audit each PQAO - and each PQAO has its own
unique circumstances from which the auditor will learn and grow.
Another important element in TSA planning includes assembling an audit team to support the lead
auditor and assist with conducting the TSA. As stated previously in Section 2.2.1 of this QAGD, the best
practice for conducting a TSA is to assemble a team of QA staff members, ideally a group with expertise
in various facets of the monitoring program. Although TSAs are resource-intensive, the benefits of
assigning an audit team to conduct the TSAs should outweigh the associated costs. Benefits of an audit
team include the following:

•	Assembling a team of auditors will provide a larger knowledge base during the on-site TSA,
which should result in a more thorough and accurate audit.
•	Allows for non-conformances to be witnessed, assessed, and documented by more than one
EPA auditor. (Note: A non-conformance, in general terms, is a failure to comply with a required
standard, such as a procedure formalized in an SOP.)
•	A team can cover more aspects of the monitoring program during the TSA in less time, which
promotes efficiency. For example, if the audit team consists of four staff members, two members
could audit field sites while the other two staff members simultaneously audit records in the
central office.
•	Provides the opportunity for immediate collaboration on site, which results in more efficient
and accurate assessments of non-conformances.
o The inefficiencies associated with having to call and/or email another EPA staff member
during a TSA, and potentially wait for a response, are minimized.
o Similarly, with other staff members available for consultation, the lead auditor will
spend less time researching subject areas that are not his/her strength, in order to
determine if the non-conformances identified during the on-site audit are significant.
•	Safety measure. Auditors are not traveling long distances or visiting remote field locations alone.
•	The TSA itself becomes more manageable for the lead auditor because certain responsibilities
can be delegated to other teammates. This, in turn, promotes a more focused and successful
audit.
•	Allows the opportunity for a team member(s) to be designated as a scribe.
o Provides the lead auditor more time to focus efforts on conducting the TSA and
interviewing monitoring organization staff, as opposed to taking notes,
o Usually results in more thorough documentation at the end of the TSA, which should
help with report writing.
•	Allows for multiple EPA auditors to develop a knowledge base about the monitoring
organization and its quality system, which can carry forward to the next audit (as well as
potentially assist EPA staff with other job responsibilities). If only one auditor is familiar with an
organization and that auditor leaves the Agency, then the knowledge gained during the previous
TSA(s) is lost.
•	Provides for a better quality audit overall. For example, when one auditor attempts to cover all
aspects of a TSA in a short amount of time, findings will be missed. This could have huge
implications for both the audited organization and EPA, especially given the 3 to 6-year time
period between audits. Therefore, having a second person present during the audit, at a
minimum, will decrease the chance of missing significant findings.
At a minimum, each TSA should be assigned 2 team members. For TSAs of large PQAOs - especially
those organizations which serve as PQAO for multiple smaller organizations - audit teams of 3 or more
QA staff members may be necessary. The ideal situation would be for audit teams to be even-
numbered, so that all auditors could work in pairs. It is acceptable for an audit team of 2 members to
separate during the TSA and perform certain inspections individually, in order to save time while on site.
However, the best practice is to work in pairs as much as possible during the TSA. Each auditor will see

things differently, and having two individuals available to observe procedures and review records will
provide a more thorough assessment, as well as decrease the likelihood of missing significant findings.
Working in pairs also provides the opportunity for one team member to lead and the other to scribe
during each phase of the audit.
Each Regional Office is encouraged to tentatively assign team members for each TSA when formulating
the FY audit schedule; although, assigning team members to accompany the lead auditor can occur any
time prior to the TSA. The complexity of the organization scheduled for the audit should be considered
when assigning the audit team. For example, if the organization has an analytical laboratory for criteria
lead analysis or NATTS, or if the organization operates a PM10 or PM2.5 gravimetric laboratory, an auditor
with expertise in those laboratories should be assigned to the team, if possible. During scheduling, team
members should be appointed with the understanding that assignments are subject to change.
However, for planning purposes, the ideal situation would be for all staff assigned to a TSA - whether as
audit lead or as a team member - to know those assignments in advance, so they can immediately mark
their calendars and begin to plan accordingly.
Thoroughly conducted TSAs can provide long-term benefits to both the EPA Regional Office and the
monitoring organization. Although each TSA is different, as a general rule each lead auditor should be
allowed at least 1 business week to conduct the TSA. Trade-offs between the number of staff
conducting the TSA versus the amount of time needed to conduct the TSA may need to be considered
when allocating resources. If the EPA Regional Office can assign only 2 staff members to conduct the
TSA, those two staff members should be allowed a longer period of time to complete the necessary on-
site inspections.
4,2 TSA Planning
The TSA is a multi-phase project. The first phase of the project involves conducting the TSA, which is
accomplished through audit planning in the office, followed by the on-site investigation, and culminates
with the generation of a TSA report. This phase of the project is completed solely by the EPA Regional
Office audit team. The second phase of the project involves corrective actions, which will be completed
by the monitoring organization (i.e., auditee). During the second phase of the project, it is the EPA
Regional Office's responsibility to remain engaged with the auditee and communicate regularly, in order
to oversee and track the progress of the corrective actions. This communication and oversight should
continue until the issues identified in the TSA report are successfully resolved, at which time the TSA
itself is "closed out". With that in mind, the TSA is a major undertaking that will take months to
complete.
The EPA staff member assigned to lead the TSA is charged with ensuring that the audit is conducted as
efficiently and effectively as possible. Towards that end, the lead auditor must dedicate a significant
amount of time to pre-audit planning activities, in order to formulate the specific audit itinerary, as well
as determine who on the audit team will conduct specific tasks. The lead auditor is also the primary
decision maker and point-of-contact for the overall project. To complete these responsibilities, the lead

auditor manages the TSA, but also leads the audit team, providing guidance and delegating
responsibilities as appropriate.
Preparation for an individual TSA should begin at least three months prior to the on-site visit of the air
monitoring organization. During the initial planning stages, the lead auditor should determine the
approximate length of the on-site audit (i.e., number of days), which laboratories and/or facilities will be
audited, as well as which specific monitoring stations will be inspected in the field. The size of the team
and the number of days allotted are critical factors that must be taken into consideration when planning
the individual TSA; previous audit findings should also be considered because on-site follow-up on
some issues may be warranted. The following will describe the general process that occurs "behind the
scenes" to formulate an efficient audit plan. Activities include:
•	Basic information gathering (pre-planning)
•	Contact and coordination efforts with the auditee
•	Formulating the specific audit plan
•	Audit plan submittal to auditee
4.2.1 Basic Information Gathering (Pre-Planning)
To begin planning the audit, the TSA lead auditor should first clarify the scope of the specific TSA by
asking three basic questions:
•	What aspects of the monitoring program will be covered during the on-site visit?
•	What time period is under review?
•	What issues were identified during the last TSA?
To answer these questions, the lead auditor should become familiar with the size and composition of
the monitoring organization's air monitoring network and facilities. The monitoring organization's last
EPA TSA report should also be reviewed.
The time period under review for the TSA can be defined quickly. The previous EPA TSA report should
identify the 3-year time period reviewed (e.g., calendar year (CY) 2011-2013). From that, the lead
auditor can project forward to the next 3-year block of time (e.g., CY 2014-2016). If the previous TSA
report did not clearly state the time period audited, the lead auditor can review data and records
maintained in the TSA's project file in order to determine the years covered. (Note: The time period
under review should be clearly stated within the TSA report; see Section 6 of this QAGD for more
information. Section 6 also contains information about project files.) Ideally, the TSA should cover the
previous 3 calendar years in their entirety, such that the auditor can review 3 years of certified data (for
criteria pollutants). However, depending on scheduling, if the TSA falls before May 1, then the
monitoring organization may not have completed their certification process for the previous year's data.
TSAs can be conducted any time, so audits conducted early in the calendar year are acceptable -
however, in these instances, the lead auditor must remain cognizant during the TSA that the monitoring
organization's final data validation and reporting processes may not be complete. If the audit is a follow-
up audit, the time period under review may be limited to only 1-2 years.
Note: Follow-up audits are typically conducted when a TSA reveals major issues within an organization
that result in the need for increased EPA oversight during the corrective action phase of the project.
Planning for a follow-up audit adheres to the same principles as described in this section of the QAGD,
except that the time period under review is reduced, and the focus areas of the audit are driven primarily
by the findings from the recent TSA.
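For the standard case, the review-period arithmetic and the certification-timing caveat can be illustrated in a few lines. The following is a minimal sketch (in Python), not an EPA tool; the dates are hypothetical, and May 1 reflects the annual data certification deadline referenced above.

    from datetime import date

    prev_period_end = 2013                  # last CY covered by the previous TSA
    audit_date = date(2017, 3, 15)          # hypothetical on-site TSA date

    # Project forward to the next 3-year block (e.g., CY 2014-2016).
    review_years = list(range(prev_period_end + 1, prev_period_end + 4))
    print("Period under review: CY", review_years)

    # An audit conducted before May 1 may encounter uncertified prior-year data.
    if audit_date < date(audit_date.year, 5, 1):
        print(f"Caution: CY {audit_date.year - 1} data may not yet be certified.")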
The monitoring organization's Annual Monitoring Network Plan (AMNP) is a good document to
reference regarding the composition of the monitoring network, and may contain much of the
information needed for strategy planning. The organization's website may contain charts or maps of the
air monitoring network's sites/locations. The website may also contain an organizational chart, which
may highlight any district or regional offices that may need to be visited during the TSA.
Focusing on the field component of the monitoring program, the auditor should identify the number and
types of monitoring stations the organization operates, the specific pollutants monitored, as well as any
special programs it may participate in. The auditor should be aware of which monitors are considered
SLAMS or SPMS, as well as which monitors are designated as regulatory or non-regulatory. Examples of
special programs to be cognizant of include NCORE, Near-Road, NATTS, UATMP, CSN/IMPROVE, and
PAMS. However, the auditor should also be aware of the monitoring program's centralized facilities.
With that in mind, the auditor should determine whether or not the monitoring organization operates
any of the following laboratories, and consider the physical location (i.e., address) of each when
planning the audit.
•	PM2.5 and/or low-volume PM10 gravimetric laboratory
•	Hi-volume PM10 gravimetric laboratory
•	Lead laboratory
•	CSN laboratory
•	Toxics laboratory
•	PAMS laboratory
•	Instrument maintenance and repair facility ("shop")
•	Testing/Standards/Certification facility ("QA lab/shop")
The lead auditor should also investigate whether or not the monitoring organization outsources any of
its monitoring activities, such as laboratory services, when planning the TSA. Although the monitoring
organization is asked questions about outsourcing in the TSA Questionnaire (see Appendix A of this
QAGD), it is best to know the answers in advance for planning purposes. For example, if the monitoring
organization's laboratory services are fulfilled by the EPA's National Contract Laboratory, and that
laboratory is audited by EPA Headquarters (or its contractors), then the Regional Office auditors may not
need to perform this audit directly. Instead, the lead auditor may be able to accompany the headquarters
auditors/contractors and serve as a witness to their audit. At a minimum, the lead auditor should obtain
copies of and read any previously-issued EPA audit reports of the contract lab, in order to become
familiar with their operations and be aware of the documented areas of concern. Also, knowing in
advance that the organization contracts its PM2.5 analyses to a laboratory in another state, for example,
helps the auditor focus the audit plan: the gravimetric analysis of filters will not be directly assessed
during the TSA, but the auditor knows time will have to be allotted to review the organization's
shipping/receiving activities, as well as its data validation procedures. It may also be prudent, in this
instance, for the lead auditor to request copies of the contract laboratory's QAPP and SOPs from the
monitoring organization - in order to ensure that the monitoring organization has these documents in
their possession and is well versed in their content.
The previous TSA report is also a good document to review to provide insight into the organization's
structure and facilities, such as its laboratories, off-site shops/facilities, regional/district offices, and/or
other local monitoring organizations operating within the PQAO. The previous TSA report can provide
insight into former problem areas that the lead auditor may want to follow-up on while on site, in order
to verify the success of the corrective action measures implemented. The previous TSA report can also
identify areas that were not reviewed during the last audit, which may result in the need for the auditor
to focus efforts on those specific areas during the current TSA. For example, if the previous TSA report
indicated that the only field inspection conducted was that of a proposed location for a new monitoring
site, then the lead auditor may decide that several air monitoring site inspections should be conducted
during the current TSA. Another example: If the previous TSA report indicated that the air toxics
network was not reviewed due to resource limitations, the lead auditor may decide to put more
resources towards air toxics in the current TSA, so that that component of the monitoring organization's
network is not overlooked two TSAs in a row. Furthermore, the prior TSA report and corrective action
follow-up will identify the findings that were not resolved and direct focus to these areas for the current
TSA.
After completing this basic research, the lead auditor should reach out to the monitoring organization to
officially begin coordination efforts.
4.2.2 Initial Contact with the Monitoring Organization
In order to best facilitate coordination efforts for the on-site TSA, the lead auditor should establish
communications with the monitoring organization (i.e., auditee) early in the planning process. The lead
auditor should complete the basic information gathering described above first, in order to have a "rough
draft" of the plan available to discuss with the organization's contact. It will also give the lead auditor a
rudimentary knowledge of the monitoring organization, which may help the auditor ask more informed
questions during the initial discussion.
The lead auditor should contact the appropriate individual at the monitoring organization (typically the
monitoring network's program manager or Quality Assurance Officer (QAO)) for this introductory call
approximately 3 months (~90 calendar days) prior to the scheduled visit. It is recommended that the
initial contact be made by telephone in order to more readily exchange information. During this
conversation, the auditor should introduce himself or herself and explain the TSA process. The lead auditor
should also use this opportunity to confirm that the dates initially selected to conduct the on-site TSA
are agreeable to the auditee. If there are conflicts, the lead auditor should discuss these concerns and
work out an alternate schedule. (The auditor will need to follow up with EPA Regional Office
management about this scheduling change after the call.) If the tentatively scheduled dates had not
been previously shared with the monitoring organization, then this telephone call is needed to establish
the specific dates for the on-site TSA.
During the call, the lead auditor can learn whether there have been significant changes within the
monitoring organization since the last TSA, such as key personnel changes, changes in physical addresses
of specific facilities, or the outsourcing of any field or laboratory operations, among others. The lead
auditor should inquire at this time about the appropriate points of contact for arranging air monitoring
station inspections and laboratory audits. For example, the monitoring organization may prefer the
individual site operators be contacted directly to schedule air monitoring station inspections. Similarly,
laboratories may be located off-site and operated under a different management structure.
The lead auditor should also request specific information or documents needed to complete audit
preparation, if such information and documents are not readily available at the EPA Regional Office.
Requests may include items such as:
•	A listing/map of field monitor locations (sites)
•	Organizational chart(s)
•	The organization's QMP
•	All QAPPs relevant to the time period under review
•	All SOPs relevant to the time period under review
•	QAPPs and SOPs utilized by contractors who provide services for the organization
•	Copies of any waivers granted to the monitoring organization that should be taken into
consideration
Note: Some of this requested information may be available on the monitoring organization's website -
web-addresses or hyperlinks should be obtained, if the lead auditor doesn't already have access.
The conversation should also include a discussion about the TSA questionnaire that will be provided
to the auditee following the phone call. The lead auditor should explain its purpose to the auditee, and
a deadline for the return of the completed questionnaire should be established. At a minimum, the
questionnaire should be returned to the lead auditor one month prior to the scheduled on-site visit.
This is necessary in order to allow the lead auditor (and audit team) a sufficient amount of time to
review the monitoring organization's responses.
Finally, the lead auditor should explain to the auditee that all of the information provided will be further
reviewed in order to formulate a more specific audit itinerary. The refined audit plan will be provided
to the auditee approximately one month prior to the on-site TSA. By sharing the specific audit plan in
advance, the monitoring organization can review the itinerary and alert the auditor to any known
conflicts. In this manner, the monitoring organization can also make arrangements to ensure necessary
personnel are available on specific days to better facilitate the audit.
Following this conversation, the lead auditor should summarize the discussion in an email (or similar),
recapping the scheduled date of the on-site audit and the date agreed upon for the return of the
completed questionnaire. The lead auditor should document the request for specific documents (QAPPs,
SOPs, etc.) in this email as well, requesting that the monitoring organization submit them as soon as possible.
The lead auditor should also attach the EPA Technical Systems Audit Questionnaire (see Appendix A of
this QAGD) to the email. It is suggested that the lead auditor copy the audit team members on this
correspondence, to facilitate information sharing, as well as keep the audit team informed of audit plan
development.
4.2.3 Refining the Audit Strategy
After the lead auditor has completed basic information gathering and talked to the monitoring
organization's designated TSA contact to ask clarifying questions, the complexity of the monitoring
network should be clear. At this point, the lead auditor can make decisions regarding which aspects of
the program will be focused on during the TSA, based on the available resources. Generally speaking,
the regulatory SLAMS monitors should be given highest priority when formulating the audit plan;
however, the audit should cover as much of the monitoring network as reasonably possible. As stated
previously, a minimum of 1 week should be allotted for each TSA, which may necessitate weekend
travel in order to allow the audit team 5 full business days on-site.
If management has not already assigned the audit team, the lead auditor should make a
recommendation to management at this time regarding the assignment of team members to assist with
the audit. It is also at this time that the lead auditor may request from management an increase
in resources (more time or people) because the basic information gathering has shown that the PQAO is
more expansive or complex than originally thought at the time when the FY audit schedule was
formulated. The lead auditor should identify to management which program area(s) or facilities cannot
be audited without additional resources. It is especially important to alert management of any specific
programs/areas that have not been reviewed for a number of TSAs. The additional resources requested
may not be available, but the lead auditor should still ensure that management is informed.
In order to maximize TSA coverage and efficiency during the time-frame that has been allotted to
conduct the audit, the lead auditor may decide to divide the audit team into smaller groups. Consider
the following scenarios as examples to illustrate this aspect of audit planning.
•	The lead auditor determines that the monitoring organization is a state PQAO with 16 field
stations that monitor all criteria pollutants (except for lead) and air toxics. The network includes
NATTS, NCORE, Near-Road, and PAMS stations. Sites are spread out across the state, such that
drive times between sites will be formidable. However, all laboratory services are outsourced,
so there are no analytical or gravimetric laboratories to inspect. The NATTS site is subject to an
upcoming audit by EPA Headquarters (or its contractors). If the auditor is allowed 5 business
days to conduct the TSA and the assigned audit team is only 2 members (i.e., the lead auditor
and a partner), the lead auditor may determine that the air toxics, NATTS, and PAMS sites will
not be reviewed during the TSA. However, if the lead auditor is allowed 5 business days to
conduct this same audit and provided an audit team of 4 members, then the lead auditor can
broaden the scope of the audit by formulating a different strategy. For example, the lead
auditor can divide the audit team into 2 groups of two, with one team auditing the monitoring
organization's central office (headquarters) and the other team auditing field sites, including
SLAMS and non-NATTS air toxic sites.
•	The lead auditor determines that the monitoring organization is a large state PQAO with 5 local
organizations operating under its jurisdiction. Across this large PQAO network, all 6 criteria
pollutants are monitored, as well as air toxics, and the organizations participate in NATTS,
NCORE, and Near-Road programs. All laboratory services for particulates, including lead, are
conducted in-house, but air toxics analytical services are outsourced. For this audit, the goal
would be to assess the monitoring network directly maintained by the PQAO, but also to
evaluate the monitoring organization's effectiveness as lead PQAO for the multiple
organizations. If the auditor is allowed 5 business days to conduct the TSA and the assigned
audit team is only 2 members (i.e., the lead auditor and a partner), the lead auditor may
determine that the monitoring network is generally too big to audit, given the allotted
resources. The lead auditor may need to request two weeks to conduct the audit, which may
still be inadequate. However, if the lead auditor is allowed 5 business days to conduct this same
audit and provided an audit team of 6 members, then the lead auditor can formulate a different
strategy that allows for multi-tasking. For example, the lead auditor can divide the audit team
into 3 groups of two, and the smaller teams can then audit different facets of the program
simultaneously throughout the business week. One team can audit field sites across the PQAO
throughout the week, whereas the other two teams can audit the central office and the
particulate laboratories. As a result, both SLAMS and toxics programs can be reviewed to some
extent, and the lead auditor can assess the overall effectiveness of the organization as PQAO.
•	The lead auditor determines that the monitoring organization is a small local program which
serves as its own PQAO. The monitoring network is comprised of 5 SLAMS monitoring stations
which collect ozone and PM2.5 data. The PQAO participates in no special programs, and it
outsources its PM2.5 gravimetric analyses to a contract laboratory in another state. In this
scenario, the lead auditor clearly knows the TSA will cover only NAAQS pollutants. If the auditor
is allowed 5 business days to conduct the TSA and the assigned audit team is 2 members (i.e.,
the lead auditor and a partner), the auditor can formulate a strategy that reasonably covers this
entire monitoring network. In fact, after assessing travel times between field sites, the lead
auditor may determine that only 3 or 4 business days are needed to complete the on-site TSA.
After prioritizing which programs and facilities will be assessed during the on-site TSA, the lead auditor
must then decide when and how the following activities will be accomplished, and by whom:
•	Entrance Briefing
•	Questionnaire Discussion with Auditee
•	Monitoring Station (Field) Site Inspections
•	Laboratory Inspections
•	Other Facility Inspections (such as the QA/Standards Lab and Maintenance/Repair shop)
•	Review of Standards' Certification Records
•	Data Review
•	Exit Briefing
When planning which days/times these activities will occur, the lead auditor will need to consider
whether the listed activities will be conducted jointly by the entire audit team, or if certain activities can
be performed separately by individual member(s) of the team. Towards that end, the lead auditor may
delegate some activities/inspections to audit team members, ideally based upon their knowledge and
expertise. For example, team members with expertise in laboratory operations should be assigned to
audit the analytical and/or gravimetric laboratories. Similarly, team members with expertise in
instrument operations may be assigned field site inspections, in order to ensure monitoring sites are
appropriately plumbed and instruments operated in the correct manner.
4.2.4 Selecting Specific Field Sites to Inspect
If resources allow, the lead auditor's goal should be to visit all monitoring stations in the monitoring
organization's network. However, if this is not possible, the lead auditor will need to select a percentage
of sites to inspect that reasonably reflects the composition of the monitoring organization's network.
The number of sites to visit will ultimately be determined by the number of days allotted for the audit,
the size of the audit team, and the physical distances between sites (i.e., travel time). Google Maps™ (or
similar) can be used to estimate distances and drive times.
When narrowing down site selection, the lead auditor (or delegated team member) should consider the
following questions/factors, and prepare a list of targeted sites. The lead auditor may want to consult
with the audit team when selecting these sites. Generally speaking, the highest priority sites are those
used for NAAQS regulatory decision-making purposes. The following bullets are listed in the suggested
order of priority.
•	Select sites such that field inspections will include at least one of each type of pollutant
monitored by the monitoring organization (particularly the NAAQS pollutants).
•	Select SLAMS sites over SPMS.
•	Consider capturing a variety of geographic locations across the network. Avoid centralizing the
field inspections to one primary location, such as those stations located in closest proximity to
the organization's headquarters.
•	Consider inspecting sites operated by different site operators. This is especially important when
the monitoring organization is structured such that site operators are located in different
district/regional offices which operate under a different management structure.
•	Review the previous TSA report to determine which sites were inspected during the last TSA.
Select different sites from the last audit, unless the previous audit report identified significant
issues at the site(s) that warrant a repeat visit(s).
•	Are there any potential planning actions or other decisions (e.g., designations, exceptional
events) in the next three years that could rely on the data from a specific monitor? If so, it may
be worth the lead auditor's time to ensure there are no significant issues with the operations of
the particular monitor/site.
•	Review the monitoring organization's data in AQS. Do the reports generated raise any concerns
about the data quality or operations at a specific site? If so, the lead auditor may want to
consider visiting the specific site to investigate these issues more closely. (Note: The selection of
site(s) based on AQS data review may not evolve until later in the planning process and, as a result,
may necessitate subtle changes to the travel itinerary.)
•	Consider inspecting newly established sites, in order to verify they are in good working order at
"start up" and meet 40 CFR Part 58, Appendix E requirements.
•	Review the most recent site photographs available, which may be found in the organization's
AMNP or possibly from photographs taken during NPAP audits (or similar). Do any sites appear
to have potential violations of Appendix E siting criteria? If so, these sites may need to be visited
on the ground in order to take measurements and confirm whether or not the siting criteria are met.
o The AMNP may contain precise measurements obtained by the monitoring organization
during in-house Appendix E evaluations.
o Google Earth™ (or similar) can also be used to view sites aerially and estimate
measurements from trees and roads, etc.; a minimal example of this screening arithmetic
is sketched after this list.
•	Have any sites been granted an EPA waiver (e.g., for siting/Appendix E requirements)? If so, the
lead auditor may want to visit the site(s) in order to determine if the exception (waiver) is still
warranted.
•	Consider inspecting any site scheduled for shutdown, if possible, in order to evaluate and
document the site's final operating conditions.
•	Consider visiting the NCore, Near-Road, PAMS, and NATTS monitoring stations, if possible.
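Where the aerial-imagery review above yields rough heights and distances, the obstruction screening reduces to simple arithmetic. The following minimal sketch (in Python) assumes the general Appendix E rule of thumb that an obstacle should be located at a distance of at least twice the height it protrudes above the probe; the function name and values are hypothetical, and 40 CFR Part 58, Appendix E remains the authoritative reference.

    def obstruction_screen(probe_height_m, obstacle_height_m, distance_m):
        # Screen one obstacle against the assumed "2x protrusion" rule of thumb.
        # Obstacles that do not rise above the probe are not flagged here.
        protrusion = obstacle_height_m - probe_height_m
        if protrusion <= 0:
            return True
        return distance_m >= 2.0 * protrusion

    # Hypothetical values estimated from aerial imagery:
    print(obstruction_screen(4.0, 10.0, 15.0))   # True: 15 m >= 2 * 6 m
    print(obstruction_screen(4.0, 12.0, 10.0))   # False: 10 m < 2 * 8 m

A result of False only means the site should be prioritized for on-the-ground measurement, not that a violation necessarily exists.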
After considering the factors above, the lead auditor should have compiled a list of multiple sites
targeted for potential inspection. At this point, the lead auditor should review the site locations on a
map and determine the most logical, efficient travel route, as well as estimate travel times between
sites. The auditor must also factor in the time needed to complete the field inspections, once on site.
The field inspection should include the monitoring station's interior and exterior conditions (see Section
5 of this QAGD for more information). For planning purposes, the lead auditor should allot 1 hour for
each site inspection when planning the travel itinerary - at a minimum. More time is recommended per
site, if possible. With this information in hand, the total amount of time needed to audit the targeted
site list can be estimated. From this, the lead auditor may determine it is possible to visit all of the sites
selected. However, the lead auditor may determine that it is not feasible to audit all of the targeted
sites, given the resources allocated for the audit. In the latter case, the site list must be further
prioritized and reduced.
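To make this estimate concrete, the tally can be scripted. The following is a minimal sketch (in Python) using hypothetical drive-time estimates between stops and the 1-hour minimum inspection time recommended above.

    # Rough itinerary budget: drive times (hours) between consecutive stops,
    # plus a minimum 1-hour inspection per site (more is recommended).
    drive_hours = [2.5, 1.0, 3.0, 1.5]   # hypothetical mapping-service estimates
    num_sites = 4
    hours_per_inspection = 1.0           # minimum recommended per site

    total_hours = sum(drive_hours) + num_sites * hours_per_inspection
    days_needed = total_hours / 8.0      # assuming an 8-hour business day

    print(f"Estimated field time: {total_hours:.1f} h (~{days_needed:.1f} business days)")

If the estimate exceeds the days allotted for field work, the targeted site list must be trimmed accordingly.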
The size of the audit team is also a critical factor when finalizing the number of sites to inspect. When
considering team member safety, it is best for auditors to work in pairs and visit the field sites together.
With that in mind, if the audit team size is limited to 2 people, then the lead auditor has two options to
consider for conducting the field inspections (and, therefore, finalizing the number of sites to audit).
First, the lead auditor and his/her teammate can travel to the monitoring stations together and conduct
the site inspections as a team. Under this scenario (and considering the minimum ~1 hour allotted for
each site), it is suggested that the lead auditor divide the field inspection activities in order to maximize
coverage in the limited amount of time. For example, one team member can assess the shelter exterior
and measure probe heights/distances to verify Appendix E siting criteria, while the other team member
inspects the interior shelter conditions and equipment. The second option would be for the lead auditor
to stay at the central office while the other team member conducts the site visits (or vice versa),
provided that the individual auditor inspecting the field sites is accompanied by personnel from the
monitoring organization. However, under this scenario, the lone auditor should be afforded more time
at each individual site, so that the site inspections can be completed without being rushed. The lead auditor
must consider these two scenarios when determining the number of sites that can be reasonably
audited, given the amount of time available. Using either approach, the number of sites selected for
field inspection will be kept to a minimum, because these same individuals remain responsible for
completing all other facets of the TSA.
If the audit team consists of more than 2 individuals, however, it may be possible to assign a smaller
team to separate from the main team and focus all efforts on monitoring station inspections. In this
manner, the "field team" can travel from site to site and inspect monitoring stations during each day of
the on-site TSA. Under this scenario, a larger number of sites can be selected for inspection, which will
provide the audit team a more well-rounded assessment of the organization's monitoring network and
field operations. If a field team is assembled that will travel separately throughout the duration of the
on-site TSA, the lead auditor should appoint one member as "field lead," responsible for ensuring
communications with the TSA lead auditor. The lead auditor can delegate the planning of the separate
travel itinerary to the field lead, if needed. When field inspections will be conducted separately from the
main audit, additional efforts must be made in advance to coordinate times and availability of site
operators to ensure site access/entry throughout the course of the week.
After the selection of field sites has been finalized, the lead auditor (or delegate) should prepare an
audit logbook to document the field inspections. Although any ledger-style logbook can be used to
record the auditor's observations in the field, a logbook prepared specifically for air monitoring site
inspections, prompting the necessary checks and measurements (both interior and exterior), will ensure
consistency during the inspections from site to site (and ultimately, from TSA to TSA). Appendix B of this
QAGD contains an example template that could be used as the basis for a field inspection logbook. The
auditor can print the template provided in Appendix B for each site selected for inspection, or create a
similar template of his or her own. It is recommended that the
auditor print AQS Monitor Description Reports (AMP 390) for each site selected for inspection, so that
the auditor can verify the accuracy of GPS coordinates, measurements, and other details entered in AQS
while on site. After printing these templates, it is recommended that the printouts be bound and page-
numbered to form a logbook.
4.2.5 Finalized Audit Itinerary
Once the lead auditor has completed initial planning and developed an itinerary for the TSA, the auditor
should share that itinerary with the monitoring organization in advance of the on-site audit. It is
recommended that the finalized audit itinerary be provided to the organization approximately one month prior to the
on-site TSA. Having an agreed upon itinerary with the monitoring organization provides notice and also
allows the agency to allocate proper time, personnel, and resources towards the upcoming audit. It is
also recommended that the lead auditor submit the audit plan to the monitoring organization in writing
- via email or in a more formal format. The following tables give examples of audit plans for two
different monitoring organizations, for illustration purposes.
Example 1: Local PQAO, Audit team of 2

Date | Activity | Time
Day 1 | Entrance Briefing; Questionnaire Discussion; General Tour of Facilities; PM10 Gravimetric Laboratory | 9 am (estimate lab audit in the PM)
Day 2 | Monitoring Site Visits: Sites ID #s A & B | 8:30 am - 5:00 pm
Day 3 | Certification Records Review; Data Review | 8:30 am - 5:00 pm
Day 4 | Exit Briefing | 9 am
Example 2: State PQAO, Audit team of 6, with 2 teams traveling separately. Two audit
itineraries developed to reflect the activities of the two teams.

CENTRAL OFFICE ITINERARY - 4 staff members
(Note: Team will divide into 2 smaller teams and separate to audit the PM2.5 Laboratory)

Date | Activity | Time
Day 1 | Entrance Briefing; Questionnaire Discussion; General Tour of Facilities/Grounds | 8:30 am - 5 pm
Day 2 | Standards (QA) Laboratory; Maintenance/Repair Lab; PM2.5 Gravimetric Laboratory; Begin Certification Records Review | 8:30 am - 5 pm (estimate labs in the AM and records in the PM)
Day 3 | Complete PM2.5 Gravimetric Lab; Complete Certification Records Review; Begin Data Review | 8:30 am - 5 pm (estimate labs/certs in the AM and data in the PM)
Day 4 | Data Review | 8:30 am - 5 pm
Day 5 | Exit Briefing | 9 am
FIELD SITE INSPECTION ITINERARY - 2 staff members

Date/Time | Activity | Location

DAY 1
6:00 AM - 12:00 PM | Drive | Work Station to Site 1
12:00 - 1:00 PM | Lunch
1:00 - 2:00 PM | Site Evaluation | Site 1
2:00 - 4:00 PM | Drive | To vicinity of next site
4:00 PM | Hotel

DAY 2
8:00 - 9:00 AM | Site Evaluation | Site 2
9:00 - 11:00 AM | Drive
11:00 AM - 12:00 PM | Site Evaluation | Site 3
12:00 - 1:00 PM | Lunch
1:00 - 3:00 PM | Drive
3:00 - 4:00 PM | Site Evaluation | Site 4
4:00 - 5:30 PM | Drive | To vicinity of next site
5:30 PM | Hotel

DAY 3
7:00 - 8:00 AM | Site Evaluation | Site 5
8:00 - 10:00 AM | Drive
10:00 - 11:00 AM | Site Evaluation | Site 6
11:00 AM - 12:00 PM | Lunch
12:00 - 3:00 PM | Drive
3:00 - 4:00 PM | Site Evaluation | Site 7
4:00 PM | Hotel

DAY 4
7:00 - 9:30 AM | Drive | To Site 8
9:30 - 10:30 AM | Site Evaluation | Site 8
10:30 - 11:00 AM | Drive
11:00 AM - 12:00 PM | Site Evaluation | Site 9
12:00 - 1:00 PM | Lunch
1:00 - 4:00 PM | Drive | Return to work station
The audit plan depicted in Example 1 shows a full day allotted for monitoring station inspections, but it
is not as detailed as the field itinerary depicted in Example 2. In the first example, with only two sites
selected for inspection, the precise travel times may not need to be specified prior to the on-site TSA.
Additionally, the monitoring organization staff may have suggestions regarding the most efficient travel
routes, being more familiar with the area's traffic patterns and geography. The travel route can be
discussed with the monitoring organization prior to the on-site TSA, or it can be fine-tuned during the
entrance briefing when audit logistics receive a final review.
4.3 Quality System Document Review
A quality system is the means by which an organization manages the quality of the monitoring
information it produces in a systematic, organized manner. It provides a framework for planning,
implementing, assessing, and reporting work performed by an organization, including carrying out
required QA/QC activities, and improving work processes via the corrective action process. Pursuant to
40 CFR Part 58, Appendix A, §2.1, all PQAOs must develop a quality system that is described and
approved in quality management plans (QMPs) and quality assurance project plans (QAPPs) to ensure
that the monitoring results provide data of adequate quality for the intended monitoring objectives. As
stated in Section 2 of this QAGD, a primary objective of a TSA is to assess the suitability and
effectiveness of the quality system being implemented by the monitoring organization on behalf of EPA.
Although the CFR and the consensus-built data validation templates in Appendix D of the QA Handbook
can be used as basic guides to assist the auditor, in order to truly assess the monitoring program's
quality system, the auditor must evaluate the organization against the specific policies and practices
established within its quality documents. With that in mind, the auditor must be familiar with the
contents of the documents in order to conduct a successful audit.
4.3.1 Document Status
During initial contact with the monitoring organization, the lead auditor should request copies of the
monitoring organization's active quality system documents (if these documents are not already available
at the EPA Regional Office). If the monitoring organization maintains its quality documents on its
website, the lead auditor should request a hyperlink(s) and/or a listing of the active SOPs, so that the
auditor downloads the correct files. Upon receipt of the quality system documents, the lead auditor
should check their status in order to determine whether they are current and approved.
Monitoring organizations should review their quality system documents on an annual basis and
document and track the reviews. QMPs and QAPPs should be revised every 5 years, minimally, or
revised as soon as possible after significant changes occur (if before the expiration of the 5-year period).
40 CFR Part 58, Appendix A, §2.1.2 states that QAPPs must include SOPs, either within the document or
by appropriate reference. Similarly, the EPA document Requirements for Quality Assurance Project Plans
(EPA QA/R-5) indicates that SOPs are part of the QAPP and states, "Current versions of all referenced
documents must be attached to the QAPP itself or be placed on file with the appropriate EPA office and
available for routine referencing." As such, SOPs should, ideally, be revised every 5 years along with the
QAPP. However, minimally, SOPs should be reviewed annually and revised when procedures change.
The specific revision schedule for these quality system documents may be included in some EPA
Regions' grant commitments. Therefore, the auditor should review the grant commitments in order to
confirm the specific time frames for each document type. The revision of the documents must be
completed in accordance with the grant requirements.
The lead auditor should observe the revision dates stated on the quality documents in order to estimate
whether or not they have been revised within the expected time frames. The QMP, QAPP, and SOPs
should contain both a revision number and revision date, either on a cover page or within the document
header. If documents are observed that appear to be more than 5 years old, the lead auditor should
earmark them for further discussion with the monitoring organization.
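When many documents are in play, a quick scripted age check can help organize this review. The following is a minimal sketch (in Python); the document names and revision dates are hypothetical entries transcribed from cover pages, and the 5-year threshold mirrors the revision expectation described above.

    from datetime import date

    # Hypothetical inventory transcribed from the documents' cover pages.
    documents = {
        "QMP Rev 3": date(2013, 6, 1),
        "Ozone QAPP Rev 2": date(2016, 2, 15),
        "PM2.5 SOP Rev 4": date(2010, 9, 30),
    }

    audit_date = date(2017, 11, 1)
    threshold_days = 5 * 365          # revision expected at least every 5 years

    for name, revised in documents.items():
        if (audit_date - revised).days > threshold_days:
            print(f"{name}: revised {revised}; >5 years old - earmark for discussion")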
Similarly, the lead auditor should determine the approval status of each document. As a best practice,
each document should include an approval page, which contains the signatures of the individuals within
the monitoring organization's chain-of-command, and/or the EPA's designated approving official, who
have reviewed and approved the document. If the document does not contain a signature/approval
page, the lead auditor should determine whether or not a separate approval letter was issued by the
monitoring organization or EPA Regional Office to document that the QMP/QAPP/SOP had been
reviewed and approved. If this information is not readily available, the lead auditor can request copies
of approval letters for any documents in question from the monitoring organization during the on-site
TSA. If any document appears to be "unapproved", the lead auditor should earmark it for discussion
with the monitoring organization. Similarly, any QAPP that is "conditionally approved" should also be
earmarked for closer review and additional discussion with the monitoring organization. The QA policy
of the EPA requires every environmental data operation (e.g., air monitoring project) to have a
written and approved QAPP prior to the start of the project (see 40 CFR Part 58, Appendix A, §2.1.2).
As such, unapproved or partially approved QAPPs are non-conformances that should be identified in
the written TSA report, should the on-site investigation confirm the unapproved or conditionally-
approved document status.
A final step during the document status review, specifically in regards to QAPPs, is to verify the
status/approval dates of the documents supplied by the monitoring organization with the
status/approval dates found in AQS. Any discrepancies between AQS and the documents should be
noted for discussion and subsequent reconciliation.
4.3.2 Document Content
The monitoring organization's QMP should illustrate the quality system infrastructure that will be
described in the monitoring organization's QAPP. As such, the lead auditor should complete a cursory
review of the QMP. However, the documents that will be most valuable to the audit team during the
TSA are the monitoring organization's QAPPs and SOPs - because those documents contain the specific,
technical procedures that drive the air monitoring program's daily operations. The monitoring
organization's QAPPs are like a contract: the organization is held accountable to the policies and
procedures established therein. During the TSA, the lead auditor is charged with assessing whether or
not the monitoring organization is implementing its QAPP, and its referenced SOPs, as written. With that
in mind, the lead auditor's goal should be to become well versed in those procedures prior to the on-site
TSA, in order to effectively assess how those procedures are implemented on the ground. As a result of
that goal, an important audit preparatory activity is prioritizing the review of the monitoring
organization's numerous quality documents, and then completing the content review - which may
involve delegating some responsibilities to audit team members. The following describes this process.
Ideally, all of the monitoring organization's quality documents should be reviewed at some level prior to
the on-site TSA. However, depending on the size and complexity of the organization's network, the
number of documents to review may be too great, given the time and resources allotted for the audit.
At a minimum, the lead auditor should attempt a cursory review of all documents, while completing a
more in-depth review on the higher priority documents. When prioritizing the document review, the
lead auditor should consider the monitoring organization's QAPPs as the highest priority documents.
SOP review should occur after QAPP review.
QAPPs should be reviewed closely by all audit team members, so that everyone on the audit team is well
versed in the monitoring organization's quality system requirements. It is important to review the
QAPPs that cover the 3-year time period specific to the TSA. For example, if the time period covered by the
TSA is 2014-2016, then the lead auditor should review the QAPP(s) that was "in effect" during that
specific time period. When reviewing the content of the QAPPs, recognizing the regulations and EPA
guidance that were in effect during the specific 3-year time period is also of primary importance. If the
auditor observes what appears to be a discrepancy between the organization's procedures in
comparison to regulatory requirements, the auditor should review the regulations that were in effect
during the time period specific to the TSA in order to verify the discrepancy. For the 2014-2016 time
period, for example, the auditor would need to be cognizant of two different sets of regulatory
requirements, since 40 CFR Part 58 was revised in March 2016 (effective April 2016). As such, the lead
auditor would expect to see revisions to the monitoring organization's QA/QC requirements in mid-
2016, with implementation anticipated most likely by the 4th quarter of that year.
It is important to note that monitoring organizations often submit the most recent versions of their
quality documents to the lead auditor prior to the TSA; at times, the submission includes documents the
organization has recently revised while preparing for the TSA. If the QAPP submitted to the auditor is a
new version and not truly applicable to the time period of the TSA, the lead auditor should contact the
monitoring organization and request they submit the previous (older) QAPP for review as well. If time
allows, the auditor can review the newer QAPP, in order to ensure the revised procedures are
acceptable and appropriately implement current regulatory requirements.
After the QAPPs, the order of priority for the remaining quality documents (SOPs) depends on the
composition and complexity of the monitoring organization's network. As with the QAPPs, monitoring
organizations often revise SOPs immediately prior to a TSA, so it is important for the auditor to be aware
of the effective dates of the documents. The goal should be to review all of the SOPs for the criteria
pollutant monitors, at a minimum. The lead auditor may delegate the review of SOPs such that this goal
is accomplished. However, if the number of documents is too great, given the size of the audit team,
then the SOP review should be prioritized. The following is the recommended approach for prioritizing
the review of the monitoring organization's SOPs.
•	Ozone SOPs should be reviewed for all audits.
•	PM2.5 SOPs should be reviewed for all audits, including SOPs for the gravimetric laboratory (if
operated in-house by the monitoring organization).
•	The data handling/validation SOP should be reviewed for all audits. If the monitoring
organization doesn't have a data handling/validation SOP, ask why during the on-site TSA.
•	Will the audit include an inspection of any laboratories (beyond PM2.5)? If so, the SOP(s) for
those specific analytical (Pb or toxics) or gravimetric (PM10) procedures will need to be reviewed.
•	Will any specific data set be under scrutiny due to upcoming designations or similar actions? If
so, the lead auditor may want to review the SOP(s) for that particular pollutant.
•	For the remaining documents, prioritize them by age (revision date). Older, active SOPs may
warrant closer attention, because there is a greater chance that their procedures are outdated or
in need of improvement.
•	Review SOPs for new instrumentation or programs. For example, if the monitoring organization
switched from intermittent to continuous PM2.5 samplers during the time period under review, it
may be prudent to ensure the monitoring organization has started out on the right foot with the
different technology.
•	Review the SOP for any monitor/methodology in which the EPA Regional Office is aware that the
monitoring organization is struggling. The auditor can read these documents in order to offer
assistance to the organization during the on-site TSA.
After determining which documents need to be reviewed, the audit team members should begin
studying the documents. The sooner the audit team starts this process, the more time will be available to
read and examine them prior to the start of the on-site TSA. It is suggested that the audit team read the
documents prior to reviewing the monitoring organization's data in AQS; the SOPs should shed light on
the coding that will be observed in the data sets, as well as the frequency of QC checks, calibrations, and
maintenance activities. As a goal, each document should be closely read, in order for the audit team to
become well versed in the monitoring organization's procedures. At a minimum, though, each
document should be given a cursory review.
When reviewing the monitoring organization's QAPPs and SOPs, the auditor should look for the
following:
•	Are the QAPPs/SOPs current, and do they properly reflect the organization's network? (For example, are
the SOPs written for monitors actually deployed, as opposed to models that are no longer in
use?)
•	Are there any QAPPs or SOPs missing, given the monitoring network composition and expected
instrumentation?
•	Do the QAPPs and SOPs contain specific, technical details (i.e., more direction than simply "See
the User Manual")?
Note: Although the majority of information found in vendor-developed user manuals
may be appropriate to reference, many times the manuals provide more than one option
for method implementation and are not specific to the organization implementing the
method. Therefore, monitoring organizations are encouraged to utilize these manuals, but to
edit the referenced procedures to make them specific to the organization.
•	Do QAPPs and SOPs correctly implement the monitoring regulations?
Note: This is especially important for QAPPs and SOPs that are impacted by recent
regulatory changes. Note the date of the SOP revision in comparison to the regulatory
modification(s). Has the monitoring organization made necessary and appropriate
modifications to its procedures and quality documents, in accordance with the
new/revised regulatory requirements?
•	Do any stated SOP procedures contradict the QAPP, regulatory, or method requirements?
•	Do any procedures go against EPA guidance - if so, is justification provided within the
document?
•	Are SOPs readable, so that site operators can easily understand the process?
The auditors should earmark any concerns noted during the document review for follow-up with the
monitoring organization during the on-site TSA.
4.3.3 Important Note about QAPP Approvals
As stated previously in Section 4.3.1, the QA policy of the EPA requires every environmental data
operation to have a written and approved QAPP prior to the start of the monitoring project. Generally
speaking, the submittal and approval of the QAPP - and the SOPs it contains - constitutes an agreement
between EPA and the monitoring organization.
For most monitoring organizations, the QAPP will be submitted directly to the EPA Regional Office for
review and approval prior to the start of the monitoring project. With that in mind, during pre-audit
activities, the EPA staff assigned to the audit team may have previously reviewed the document when it
was initially submitted for approval. If that submittal/approval was recent, the audit team may only
need to conduct a cursory review of the document as a refresher.
However, some monitoring organizations are delegated the authority to self-approve their QAPPs. In
these cases, the EPA Regional Office must receive a courtesy copy of the QAPP, but a staff member may
not be assigned by Regional Office management to review and approve its content. However, in order to
assess the effectiveness of the quality system being implemented on behalf of EPA, the lead auditor
must assess the monitoring organization against the policies and procedures formalized in its QAPPs and
SOPs. This is accomplished in two parts: first, through review of the monitoring organization's quality
system documentation during pre-audit activities; and second, through observation of the procedures
and practices implemented by the organization during the on-site TSA. For the monitoring organizations
that self-approve their QAPPs, the pre-audit activities for the EPA Regional Office auditor must include
an in-depth review of the QAPP to ensure it meets EPA specifications. 40 CFR Part 58, Appendix A, §2.1.1
states, "Where a PQAO or monitoring organization has been delegated authority to review and approve
their QAPP...The QAPP will be reviewed by the EPA during systems audits or circumstances related to
data quality." With that in mind, the auditor should ensure that the QAPP, at a minimum, conforms to:
1)	all regulations;
2)	the validation templates (critical criteria, at a minimum); and,
3)	generally agrees with the QA Handbook and/or applicable methods (such as the QAGD 2.12) or
provides an acceptable alternative.
If the auditor determines that the QAPP is inadequate, this issue must be addressed with the
monitoring organization during the on-site TSA and identified in the TSA report. In this situation, it is
likely that other findings will be identified during the TSA that stem from the issues noted within the
monitoring organization's QAPP.
4.4 Data Review
During the TSA, the lead auditor is charged with performing an audit of data quality. An audit of data
quality includes reviewing supporting documentation and records, maintained by the auditee and not
available in AQS, in order to ensure the data reported to EPA is accurate, traceable, and defensible.
While on site, the lead auditor will have limited time to complete the audit of data quality, which is
typically a lengthy process. Therefore, in order to maximize efficiency during the on-site TSA, it is
imperative for the lead auditor to evaluate the organization's data in AQS, to the greatest extent possible,
prior to arriving on site. Over three years, a monitoring organization will generate a large quantity of
ambient concentration and QA/QC data; it will not be possible for the audit team to verify the accuracy
and integrity of all data points. With that in mind, during audit preparatory activities, the lead auditor is
tasked with evaluating the organization's data, and then selecting only a limited number of critical data
points to scrutinize during the on-site TSA. Through this process, the lead auditor will evaluate whether
or not the monitoring organization is validating its data in accordance with its QAPP/SOPs and EPA
requirements. These few data points, then, will be used to generally surmise the quality of the overall
data sets, through confirmation of the effectiveness of the organization's quality system and its
documentation and recordkeeping practices. When preparing for this phase of the TSA, the lead auditor
may delegate the review of specific pollutant data sets, or specific AQS reports, to audit team members.
Given the size of the monitoring organization's network and the quantity of data produced, the
delegation of data review activities may help make this particular TSA task more manageable and
focused.
Because time is a limiting factor while on-site, it is important for the lead auditor to select a reasonable
number of data points to represent the different pollutant data sets, and select those points for scrutiny
that will be most effective in challenging the monitoring organization's quality system. To begin this
process, the lead auditor (or delegate) should extract the organization's data from AQS using a variety of
standard reports that are available (see Section 4.4.1 below). Each report has a different purpose and
will highlight different facets of the organization's data. After the data has been queried, the auditor
should look for recurring or systemic issues within the data sets, and highlight any issues or anomalies
observed. The null value codes and AQS qualifier flags utilized by the monitoring organizations when
reporting concentration data to AQS will "tell a story", when used properly. By reading the null codes
and flags in the AQS data, the auditor can generally surmise whether or not SOPs appear to be followed
(which will be confirmed later during the on-site audit). Through the data coding, the auditor will be able
to see, for example, an analyzer malfunction, followed by maintenance/repair activities by the site
operator, followed by a recalibration of the analyzer before ambient data collection resumes. This would
be the anticipated sequence of events following a malfunction (and, thus, should be specified in the
organization's SOP). Similarly, the codes for calibrations and QC checks should be visible in the data and
spaced at the frequencies established in the SOPs. If the data coding illustrates unusual events, or
anticipated codes are missing (such as those for the QC checks), the auditor may decide that further
investigation into the associated data points is warranted.
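Once the data have been extracted from AQS, this kind of pattern check can be partially scripted. The following is a minimal, illustrative sketch (in Python) that assumes a simplified record layout already parsed from an extract; the null codes shown and the 14-day QC frequency are placeholders for whatever the organization's SOPs and the AQS code tables actually specify.

    from datetime import date, timedelta

    # Simplified, hypothetical view of an extracted data set: (sample date,
    # null code or None). Real AQS extracts carry many more fields.
    records = [
        (date(2015, 3, 1), None),    # valid sample
        (date(2015, 3, 2), "AN"),    # null code (e.g., machine malfunction)
        (date(2015, 3, 3), "BA"),    # null code (e.g., maintenance/repair)
        (date(2015, 3, 4), None),
    ]

    # Tally null codes; an unusual distribution may warrant on-site follow-up.
    counts = {}
    for _, code in records:
        if code:
            counts[code] = counts.get(code, 0) + 1
    print("Null code counts:", counts)

    # Check QC check spacing against the SOP frequency (placeholder: 14 days).
    qc_dates = [date(2015, 3, 1), date(2015, 3, 20)]   # dates of 1-point QC checks
    for earlier, later in zip(qc_dates, qc_dates[1:]):
        if later - earlier > timedelta(days=14):
            print(f"QC gap {earlier} -> {later} exceeds SOP frequency; investigate")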
During data review, it is extremely important for the auditor to remain cognizant of the 3-year time
period under review. Any issues uncovered in the data should be compared to the regulations and
guidance which were in effect during the specific time period under review. For example, if EPA
regulations recently changed (e.g., in 2016), but during the time period of the TSA (e.g., 2013-2015) the
organization was successfully implementing previous regulations, then it is inappropriate to cite the new
regulations as justification for a non-conformance (i.e., audit finding). Similarly, the auditor should be
cognizant of the requirements established in the monitoring organization's QAPPs and SOPs, and judge
the organization's performance based upon the specific quality documents that were effective during
the TSA review period.
4.4.1 AQS Reports
The most common data assessment tools accessible to all EPA auditors - and the monitoring
organizations - are the standard reports available for query in the AQS database. Of the numerous
standard reports available in AQS, several are particularly helpful when preparing for and conducting
TSAs. The auditor should select the appropriate AQS reports to pull, based upon the scope of the
individual TSA. For example, if the TSA will focus only on regulatory NAAQS pollutants, the lead auditor
may not need to pull reports for non-regulatory data, such as air toxics or non-FEM continuous
particulate samplers.
The following list suggests common AQS reports that may be helpful to the audit team, providing a brief
description of the report's purpose and potential use. It is important to note that this list is not all-
inclusive, nor does it suggest that each report listed should be queried during TSA preparation. Other
reports are available for review which may be beneficial to the audit team. Similarly, due to the
structure and organization of some EPA Regional Offices, the staff who conduct TSAs may or may not be
the same individuals who review and approve AMNPs or apply concurrence flags to data as a result of
the annual data certification process. Therefore, some reports listed below may or may not have been
previously reviewed by EPA staff members as part of other job responsibilities, and, therefore, may or
may not need to be reviewed during the TSA.
•	AMP251 QA Raw Assessment Report
This report lists the results (i.e., % difference) of each individual QC check performed for
the pollutant of interest. It includes the results of performance evaluations, NPAP/PEP
audits, lead audit strip analyses, and collocation assessments as well.
•	AMP256 QA Data Quality Indicator Report
This report calculates summary statistics at both the monitor and PQAO levels for the
QC checks performed by the organization. It is used to determine whether or not the
organization is meeting the established data quality objectives for the criteria
pollutants.
•	AMP350 Raw Data Report
This report shows hourly ambient concentrations for the continuous analyzers and
samplers in the monitoring network, as well as concentrations for the intermittent
particulate matter samplers (i.e., 24-hour samples) collected by the organization.
•	AMP350MX Raw Data Max Values Report
This report provides the highest concentration value for each day for the pollutant of
interest. This report is helpful for reviewing 5-minute SO2 data, particularly for those
organizations that report only the highest 5-minute average from each hour. (Compare
to AMP 501 below.)
•	AMP390 Monitor Description Report
This report lists descriptive information on the location and configuration of monitoring
sites and monitors. Information includes geographic descriptions, probe configuration
descriptions, and locations of other items that may have an impact on the data collected
by the site (such as nearby streets and obstructions).
•	AMP430 Data Completeness Report
This report calculates monthly and annual data completeness for each monitor in the
specific pollutant network.
•	AMP480 Design Value Report
This report calculates the design value for each site in the network for each criteria
pollutant. From that, the auditor can determine the overall highest concentration sites
in the network.

•	AMP501 Extract Raw Data
This extract produces a text file that contains the organization's ambient concentration
data in raw format. This extract may be helpful for reviewing SO2 5-minute data, particularly for those organizations that report twelve 5-minute blocks of data for each hour.
•	AMP503 Extract Sample Blank Data
This extract produces a text file that contains the organization's PM2.5 field or trip blank
data, if the monitoring organization operates a manual (FRM) PM2.5 network. Field blank
data is required to be reported to AQS in accordance with 40 CFR 58.16; however, the
submittal of trip blank data to AQS is optional. Review the text file as is, or import into
Excel, depending on its size, so that it can be more easily assessed.
•	AMP504 Extract QA Data
This extract produces a text file that contains the organization's QA/QC data in raw
format. This extract is helpful for reviewing data from organizations with large
monitoring networks. The text can be imported into Excel, in order to more easily sort
and review the data; or, the data can be imported into a similar data assessment
program that will sort the data, such as the 504 QA Data Assessment Tool (see Section
4.4.2 below). (A minimal scripted alternative is sketched following this list.)
•	AMP600 Certification Evaluation and Concurrence Report
This is the primary report generated by organizations during annual data certification.
The report shows whether or not data quality objectives have been met for the criteria
pollutants, provides dates for QAPP approvals, and illustrates other QA considerations.
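For auditors who prefer scripted review over the Excel workflow described above, the extracts can also be loaded programmatically. The following minimal sketch (Python with the pandas library, one possible tool) illustrates loading an AMP 504 extract and sorting QC checks by percent difference. It assumes the extract is pipe-delimited; the file name and column names are placeholders for illustration, and the actual field layout should be taken from the AQS documentation.

import pandas as pd

# Hypothetical subset of AMP 504 fields; consult the AQS extract layout
# for the real field order before adapting this sketch.
cols = ["state", "county", "site", "parameter", "poc",
        "assessment_date", "assessment_type",
        "monitor_value", "assessment_value"]

qa = pd.read_csv("amp504_extract.txt", sep="|", header=None,
                 names=cols, usecols=range(len(cols)),
                 dtype=str, engine="python")

# Compute percent difference for each check and list the worst first.
for col in ("monitor_value", "assessment_value"):
    qa[col] = pd.to_numeric(qa[col], errors="coerce")
qa["pct_diff"] = (100 * (qa["monitor_value"] - qa["assessment_value"])
                  / qa["assessment_value"])
print(qa.sort_values("pct_diff", key=abs, ascending=False).head(20))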
4.4.2 Review Criteria
When querying a large PQAO over a 3-year time period, many of the AQS reports listed above will
produce hundreds of pages of data results. After these reports have been obtained, or raw data
extracted, the lead auditor (or delegate) should examine these large data sets, with a goal of producing
a narrowed list of specific, individual data points that will be further evaluated during the on-site TSA.
During this data review process, the lead auditor can utilize additional data assessment tools to mine the
AQS data. For example, on the AMTIC website, a 504 QA Data Assessment Tool (Excel; available at https://www3.epa.gov/ttn/amtic/qareport.html) is readily
available to help EPA Regional Offices (and the monitoring organizations) evaluate the pollutant data
sets. The 504 QA Assessment Tool processes AMP 504 extracts, quickly evaluating the text files for
specific monitoring criteria. Other assessment tools readily available on AMTIC are box and whisker
plots, which may assist the auditor in seeing trends or issues within data sets.
Prior to beginning this phase of data review, the auditor should read the monitoring organization's
QAPPs and SOPs, particularly the data validation SOP(s) - as suggested in Section 4.3.2 of this QAGD.
The monitoring organization's QAPPs and SOPs should clarify the coding that will be observed in the AQS
reports. As stated previously, the data points selected for review during the on-site TSA will be used to
assess whether the data has been handled in accordance with the monitoring organization's quality
documents. These data points will also be used to verify whether or not the monitoring organization has
sufficient documentation available on site to support the decisions made regarding each data point's validity (i.e.,
whether the data point was reported, null coded, or flagged appropriately). With that in mind, it is
important for the auditor to be familiar with requirements detailed in the organization's SOPs,
particularly those related to data validation.
The following bullet list summarizes general criteria and/or issues the lead auditor should look for when
examining the organization's AQS data. It is important to note that this list is not all-inclusive. As a
general guide, the data validation templates found in the QA Handbook itemize key criteria against
which ambient air monitoring data should be validated; the auditor can refer to those templates for
more extensive, pollutant-specific criteria. The auditor should also look for any unique criteria specified
in the organization's QAPPs and SOPs that may not be listed below. (A scripted screening sketch follows the list.)
•	Determine if the 75% data completeness requirement has been satisfied.
•	Assess whether or not 1-point QC checks meet the critical % difference acceptance criterion
established per pollutant (e.g., ±7% difference for ozone).
•	Determine whether 1-point QC checks have been conducted at the appropriate frequency (i.e.,
every 14 days, minimum), and within the concentration range specified in CFR.
•	Determine if the appropriate number of PEs have been conducted, and whether those audits
have been conducted at the concentrations specified in CFR.
•	Assess whether or not flow rate verifications meet the critical % difference acceptance criteria
(e.g., ±4% of transfer standard, ±5% of design flow for PM2.5).
•	Determine whether or not the flow rate verifications have been conducted at the appropriate
frequency.
•	Determine whether semi-annual flow rate audits have been conducted at the appropriate
frequency and meet the acceptance criterion.
•	Determine the highest concentration site for each pollutant. (Note: These particular monitors
may need to be inspected on site during the TSA, especially if any regulatory actions are
pending.)
•	Identify any sites that have incomplete design values. (Note: These monitors may not have been
in operation throughout the entire 3-year time period under review; check start-up dates in AQS.)
•	Review 1-hour SO2 data against its associated 5-minute data. Verify that 5-minute SO2 data has
not been reported during time periods when the hourly value has been invalidated.
•	Examine PM2.5 blank data for exceedances. (Note: Some organizations will report both field
blank and trip blank concentration values to AQS, but the auditor may not be able to discern
from the AMP 503 extract which blank is which. The acceptance criterion for field blanks is < ±30 µg; the criterion for trip blanks is < ±15 µg.)
•	Examine ambient concentration data for adherence to SOPs, as well as for anomalies. Examples
of what to look for include the following:
a. Missing data (i.e., gaps in reported concentrations).

b.	Repeating concentration values, such as consistently negative concentrations over long
periods of time.
c.	Unusual maximum concentration values. (For example, maximum ozone
concentrations at midnight: could the data be the result of an automated span check
that was erroneously reported as ambient data?)
d.	Consistency in application of null value codes and qualifier flags.
e.	Do the null value codes, through the "story" they convey when read, indicate that SOPs
have been followed?
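Several of the checks above lend themselves to scripted screening before the on-site visit. The following minimal sketch assumes 1-point QC check results have already been parsed into a table (for example, from the AMP 504 extract as in Section 4.4.1) and flags ozone checks that exceed a ±7% criterion or that are spaced more than 14 days apart; thresholds for other pollutants and checks can be substituted following the same pattern. The column names and values are invented for illustration.

import pandas as pd

# Stand-in rows; in practice these come from the parsed AMP 504 extract.
qc = pd.DataFrame({
    "site":     ["0001", "0001", "0001", "0001"],
    "date":     pd.to_datetime(["2015-01-02", "2015-01-15",
                                "2015-02-10", "2015-02-20"]),
    "pct_diff": [3.1, -8.2, 2.4, 6.9],
})

# Criterion 1: 1-point QC checks exceeding the +/-7% ozone criterion.
failed = qc[qc["pct_diff"].abs() > 7.0]

# Criterion 2: checks spaced more than 14 days apart at the same site.
qc = qc.sort_values(["site", "date"])
gap_days = qc.groupby("site")["date"].diff().dt.days
late = qc[gap_days > 14]

print("Exceeds +/-7%:", failed.to_dict("records"))
print("More than 14 days since prior check:", late.to_dict("records"))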
The auditor should earmark data that does not meet these criteria, as well as any data that appears to
be anomalous, miscoded, or potentially deviates from the organization's SOPs. An important step to
determine whether or not data has been correctly coded is to reconcile the AMP 350 report against the
AMP 251 (or AMP 504). Using these reports, the auditor can surmise whether or not ambient data has
been invalidated when 1-point QC checks (or flow rate verifications) have exceeded acceptance criteria.
The AMP 350 will allow the auditor to observe whether or not the data appear to have been
"bracketed" appropriately - meaning data have been invalidated back to the last acceptable 1-point QC
check, as well as invalidated forward until corrective action has been successfully implemented. If valid
QC checks have been reported to AQS that exceed acceptance criteria and the associated ambient
data has not been invalidated, the auditor should earmark the example(s) for further review and
discussion with the monitoring organization.
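The bracketing reconciliation can also be pre-screened with a short script. The sketch below is a minimal illustration with assumed column names and invented timestamps: given the last passing QC check and the subsequent failed check (taken from the AMP 251 or 504), it lists hours in the AMP 350 data that were left valid inside that window and therefore merit discussion with the organization.

import pandas as pd

# Stand-in hourly data; in practice parsed from an AMP 350 report.
hourly = pd.DataFrame({
    "datetime":  pd.date_range("2015-03-01 00:00", periods=48, freq="h"),
    "value":     0.031,   # placeholder concentration
    "null_code": None,    # None here means the hour was reported as valid
})

last_passing_check = pd.Timestamp("2015-03-01 08:00")
failed_check       = pd.Timestamp("2015-03-02 08:00")

in_window = hourly["datetime"].between(last_passing_check, failed_check)
suspect = hourly[in_window & hourly["null_code"].isna()]
print(len(suspect), "hours were not invalidated between the two checks")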
Additionally, a few data points that do not appear to be anomalous or miscoded should be tagged for in-
depth review during the TSA (see Section 5.3.3.1 of this QAGD). For these specific data points that will
be used for data audit trail tracking, it is suggested the auditor select a few data points at random from
the pollutant data sets. Also, because part of the goal of the on-site data review is to ensure the
organization's documentation supports data validity, it is recommended that the lead auditor select at
least a few data points that appear to be valid high concentration values for the pollutants of interest.
For example, the maximum valid lead concentration reported at a monitoring site may be an ideal
sample for in-depth review; the laboratory data package would need to be obtained and evaluated, in
order to ensure calibration techniques (including dilutions) were performed in accordance with the
method; and the field sampling records and calculations would need to be reviewed and confirmed as
well. In this manner, the auditor can assess the effectiveness of the monitoring organization's quality
system, as well as ensure the reported high concentrations are valid. However, given the significant
amount of time needed to track the audit trail for a single data point, the number of points selected for
this particular activity during the TSA should be limited.
After completing this preliminary data review, the amount of data earmarked should be compared to
the amount of time allotted for data review activities on site. The lead auditor will then have to decide
whether or not there is sufficient time to cover all of the selected data points, or if the number of data
points selected will have to be further reduced due to limited resources (i.e., time). It is important to
note that, for the data review activities that occur during the on-site TSA, the size of the audit team is
not the critical factor. Instead, the factor that typically drives the speed of the data review activities on-

site is the number of staff available at the monitoring organization to assist the auditor(s) with reviewing
their in-house data, and whether or not the organization's monitoring database(s), such as AirVision™,
can be accessed in multiple locations within their facility (which would allow the audit team to break
into smaller groups to cover different pollutants simultaneously). With that in mind, time should be
considered the most important resource during this phase of the TSA, and the auditor should anticipate
at least 15 minutes to review each individual data point.
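For example, 40 earmarked data points at 15 minutes apiece represent roughly 10 hours of on-site review time.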
If the auditor determines that the quantity of data earmarked for on-site data review is greater than the
time available at the monitoring organization, it is recommended that the lead auditor create an
abridged list of data from those earmarked, in order to have a "top priority" list. From that, the lead
auditor can maximize the time available on site by reviewing the most critical data points first. To create
an abridged, "top priority" data review list, the lead auditor should re-review the discrepancies and
concerns first earmarked in the data sets. Of those highlighted, the auditor should determine whether
any issues appear to be recurring or systemic. Similarly, the lead auditor should assess which issues
appear to be of the greatest concern regarding data validity. For example, if the auditor notes any issues
that could result in significant data invalidation, those data points should be prioritized. Ultimately, the
auditor should select from the highlighted data only a limited number of points that best represent the
major concerns identified, and include a few select points that do not appear to be anomalous in any
manner, but will be used for data tracking purposes.
It is important that the lead auditor not discard those data points which were earmarked but not ultimately selected as "top priority." Instead, it is recommended that the auditor keep all of the earmarked data
points on hand during the on-site TSA. If data review activities progress at a faster speed than
anticipated, such that all top priority data points are reviewed and time still remains on site, the lead
auditor can then select from the other tagged data points to utilize the time that remains.
4.4.3 Additional QC Data Requests and Review
There may be additional data the TSA lead auditor wants to request in advance because the QAPP/SOP
review alludes to unique processes which warrant verification by the audit team. Moreover, some supplementary program data that would be valuable to review in advance may not be available in AQS. In order to obtain this data for review, it must be requested directly from the auditee. Requesting this
information up front allows the auditor to be better prepared and/or save time while on site. However,
if the auditor is unable to review this data in advance of the audit, the information can be requested
while on site and reviewed at that time.
Examples of supplementary information and data that may be requested in advance include the
following:
•	Copies of audit reports, such as the internal systems audits, if applicable, or internal
performance audits
•	Third-party audit reports, such as performance or systems audit reports by the PQAO (if the
monitoring organization is a local monitoring program) or a contractor

•	PM2.5 Lab Blank Data - Request if the monitoring organization operates an in-house gravimetric
lab. Review the data for excursions > ±15 µg.
•	Control charts maintained by the organization, particularly from the PM2.5 or PM10 weigh labs.
Common control charts generated by gravimetric laboratories include those used for tracking
check weight standards, lab blanks, duplicates, and laboratory temperature and relative
humidity conditions. (A minimal screening sketch follows this list.)
•	Shelter temperature data for some sites or monitors.
•	PT reports/results for air toxics.
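Where the organization provides these records electronically, a simple script can pre-screen them before the visit. The following minimal sketch, with invented values and an assumed tolerance, flags check-weight readings that fall outside stated limits; the actual limits should be taken from the organization's SOPs.

# Check-weight readings in milligrams (invented values for illustration).
readings = [100.002, 100.001, 99.999, 100.004, 100.012]
target_mg, tolerance_mg = 100.000, 0.006  # assumed SOP limits

for i, reading in enumerate(readings, start=1):
    if abs(reading - target_mg) > tolerance_mg:
        print(f"Reading {i} ({reading} mg) exceeds the +/-{tolerance_mg} mg limit")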
4.5	Questionnaire Response
The monitoring organization should submit its completed questionnaire to the lead auditor one
month prior to the on-site TSA. Upon receipt of the completed questionnaire, the lead auditor should
disseminate the document to the entire audit team for review. All audit team members should review
the monitoring organization's responses and earmark any areas of concern requiring additional
discussion and/or on-site investigation during the TSA.
Items to note in the questionnaire response include, but are not limited to, the following:
•	Review the responses in comparison to EPA regulatory requirements. Are discrepancies noted?
•	Review the responses in comparison to the monitoring organization's QAPP and SOP
requirements. Are discrepancies noted?
•	Observe if the response to any question is marked "No" when the anticipated answer is "Yes."
•	Mark any sections that are left completely blank, or sections in which the given response is ambiguous.
4.6	Preliminary Findings Summary
Activities described in Sections 4.3 - 4.5 of this QAGD may result in identification of preliminary findings.
The lead auditor should review all notes and earmarked documents/data, in order to compile a
summary list of the issues that will be investigated during the on-site TSA.
It is suggested that the lead auditor assemble the audit team prior to the on-site TSA to discuss their
notes and earmarked documents/data as well. Such discussion can illuminate recurring themes or
systemic issues within the organization that may not be apparent when focusing on one auditor's notes
alone. The team discussion, in turn, will help the lead auditor prioritize the most significant areas of
concern identified through these pre-audit activities. The audit team should also discuss their notes and
concerns regarding the completed questionnaire responses. The lead auditor should take the team's
concerns into consideration when prioritizing which responses will be discussed in person during the
formal questionnaire discussion, and which responses may be left to verify through facility inspections
and/or records/data review activities.

4.7 Audit Team Communications and Planning
Effective communication and planning among the audit team is paramount; it should be frequent during the planning stages of the TSA, and ongoing throughout the on-site TSA. The lead
auditor is responsible for coordinating and planning the communication protocol between audit
members, audit teams (if working separately), and other EPA regional stakeholders. Some EPA Regional
Offices are organized such that staff who conduct TSAs are located in different divisions from other Air
program staff. In these situations, the lead auditor should contact their Regional Office partners when
preparing for the upcoming TSA. This outreach can provide notice of the upcoming audit, as well as
serve as a forum to exchange information about the monitoring organization, discussing any specific
areas of concern that should be investigated during the TSA. For some TSAs, the monitoring organization
audited may share CBSAs/MSAs with monitoring organizations located in different states or EPA
Regions. Under some circumstances, it may be appropriate to reach out to the other EPA Region(s) to
inform them of the upcoming audit.
Audit team meetings are useful to enhance communication and promote good audit planning during
pre-audit activities, as well as during the on-site TSA. The best practice is to implement at least two
planning meetings during TSA preparation: one at the start of the project, to discuss team assignments
and logistics; and the other immediately prior to the start of the on-site TSA, to finalize logistics and
discuss the team's preliminary findings (as described in Section 4.6 above). Once on-site, briefings
among the audit team members and with the monitoring organization are essential. Briefings with the
monitoring organization should include both an entrance and exit briefing, which are discussed in detail
in Section 5 of this QAGD. Briefings amongst the audit team members, however, should occur daily, and
are described below.
4.7.1 Team Meetings Prior to the On-Site TSA
As stated previously, the lead auditor is responsible for coordinating and planning the communication
protocol between audit members. At the start of the TSA planning process, it is recommended that the
lead auditor assemble the audit team in order to discuss the upcoming TSA. The lead auditor should
ensure all team members have the correct date(s) scheduled for the on-site TSA and have received
copies of the documents needed for review. It is suggested that the team create a medium to share
documents, records, and data pertaining to the TSA; a shared folder on a local area network drive can serve this purpose, as can an approved, secured, web-based file-sharing program. During this
preliminary meeting, the audit team can discuss the monitoring organization and formulate a tentative
plan on how to most efficiently complete the audit. At this time, the lead auditor should delegate
responsibilities to specific team members. For example, the lead auditor should designate a team
member(s) as a scribe, who will be responsible for extensive note-taking during the on-site audit,
especially during staff interviews when the lead auditor may be involved in lengthy and complex
discussions. If the lead auditor intends to divide the audit team into smaller teams in order to make the
on-site TSA more manageable, then he/she should appoint one individual in each team as the primary
contact (or "lead"). For example, the "field team" described in Section 4.2.4 of this QAGD should have a
designated lead; similarly, audits of laboratories conducted separately should have a lead as well. By

making these assignments in advance, all members of the audit team have a better understanding of
their roles and responsibilities while preparing for the upcoming audit. It is suggested that the lead
auditor also use this time to tentatively schedule follow-up meetings with team members. As the
specific audit itinerary is finalized, those schedule(s) will need to be shared not only with the monitoring
organization (as discussed in Section 4.2.5), but also with the audit team members. Travel authorizations
will need to be completed, as well as reservations for hotels and transportation.
It is suggested that approximately one week prior to the on-site TSA, the lead auditor reassemble the
audit team for a final planning meeting. During this time, the lead auditor should discuss results of the
pre-audit document and data review activities (as described in Section 4.6), and go over the final audit
logistics. The lead auditor should also ensure that all team members have gathered the necessary audit
tools needed for the TSA, such as laser range finders, compasses, GPS units, cameras, and logbooks. As a
best practice, all team members should carry individual logbooks (bound and page-numbered), created
specifically for the audit and uniquely labeled to reference the audit. These logbooks act as supporting
documentation for the audit and can be given to the TSA lead auditor upon completion of the field
investigation to assist with the post-site visit activities described in Section 6 of this QAGD. The logbooks
and documentation collected during the audit will play an integral part in the development of the TSA
audit report. When teams are traveling separately, it is also important to designate someone as a scribe
within the smaller team to ensure that all relevant observations and discussions are captured regarding
the field/laboratory investigations. Logbooks, audit notes, and other supporting documentation should
be retained for inclusion or reference in writing the audit report. These records should also be included
in the final project file, after the TSA is closed-out (see Section 6.5.1 of this QAGD).
4.7.2 Daily Briefings During the On-Site TSA
Effective and frequent communication between audit team members is vital during the on-site TSA. As
stated previously, depending on the scope and extent of the audit, audit team members may divide the
activities among the team to ensure all aspects of the audit are sufficiently covered. If the team is
separated, there must be a communication plan between the lead auditor and the primary contact(s) of
the smaller team(s) to ensure that findings and developments are communicated and discussed. This
communication may be in the form of meetings, telephone calls, texts, or emails. Ideally, when the on-
site TSA is in progress, the lead auditor should be updated daily by all team members, as well as by the
smaller team leads. It is especially important to share significant findings to allow all team members
time to discuss the implications and consequences of the identified findings, and decide each day
whether further investigation into specific findings is needed.
At the completion of each work day, the lead auditor should assemble team members for a daily
briefing. If possible, team members in remote locations should call in to the daily briefing. During this
team meeting, all findings from the day should be summarized, discussed, and categorized according to
severity (see Section 6 of this QAGD). The daily briefing is very important because the recollection of the
auditors will be the best immediately following the day's events. The daily team meeting can also be
used to plan the next day's activities, especially if any priorities have changed due to the nature of the
day's findings. The lead auditor can also use these daily briefings to build a list of major findings, which

can later provide topics for the exit briefing. Compiling the significant findings daily will save time at the
conclusion of the on-site audit, and help prepare the audit team for a smooth, well organized exit
briefing.
It is essential that there is good communication between audit team members so that a consistent
message is delivered to the monitoring organization staff at all times during the on-site TSA. Auditors
should never provide conflicting information to the monitoring organization, which could send mixed
messages and cause confusion. The audit team must speak with one voice.

5.0 On-Site Audit Activities
Although the audit team has already spent many hours reviewing documents and data prior to traveling
to the monitoring organization, generally speaking, the TSA "begins" when the audit team arrives on-
site for the face-to-face interaction with the monitoring organization. The audit team should be well
organized through the preparatory process and each team member should have a clearly defined role in
the audit. From the onset, it is essential that each member understand his or her responsibility on-site
to efficiently move through the audit process since time is limited at the monitoring organization. The
lead auditor is responsible for keeping the audit on schedule and routinely collecting the observations
from the group.
The on-site component of the TSA is a multi-day event that investigates the ambient air monitoring
process including: sampling and monitoring, data collection, quality system functionality, quality
document (QAPPs, SOPs) implementation, data review, and data submission to AQS. An important
component to the success of the TSA is communication and planning between audit team members
and monitoring organization staff during the audit itself. Section 2.1 of this QAGD defines the scope of
the functional areas within a monitoring organization that should be examined during a TSA. Monitoring
organizations may have all or several of these areas, and the audit team should already have a planned
approach to inspect these areas prior to arriving on-site. The on-site audit should begin at the
monitoring headquarters where many of these areas - such as the monitoring data review group,
quality assurance team, and AQS submittal group - reside. Audit team members will visit other facilities,
such as the analytical laboratories and ambient air monitoring stations in the field, over the course of
the audit. The team's schedule on-site should be flexible to allow for additional investigation if needed.
During the audit, several meetings should occur to keep the audit team members and the monitoring
staff informed about the audit progress. Routine meetings and communication will keep the audit on
schedule and will assist in presenting clear audit findings to the monitoring organization.
Monitoring organizations can be relatively complex. However, several important activities should occur
during each TSA to consistently and efficiently cover the same material in each organization from audit
to audit. No monitoring organization should be treated differently than another, and no bias should be
reflected in the audit process. The audit activities described in this section are essential to the TSA process; they cover the functional areas of the monitoring organization defined in the scope and provide the communication necessary for the audit participants. The functional areas are reiterated
below.
•	Monitoring Headquarters
•	Maintenance and Repair Shop
•	Monitoring Data Review and Validation Group
•	Quality Assurance Team
•	Analytical Laboratories
•	AQS Submittal Group
•	Ambient Air Monitoring Sites

These functional areas, at a minimum, should be inspected during the on-site TSA. Monitoring
organizations may have other areas that warrant inspection as well, and these additional areas should
be identified in the TSA planning process. At the conclusion of each day, the audit team members should
meet to discuss their progress with regard to completion of these necessary inspections; see Section 4.7
of this QAGD for more information regarding the audit team's daily briefings. This section will describe
these core activities in detail and provide best practices for the audit team to follow. Figure 5.1 below
shows a diagram of a typical TSA which includes the aforementioned activities.
[Flow chart: Audit Day 1 - Entrance Briefing; Field Site Inspections; TSA Questionnaire Review; Monitoring Data Review; Daily Briefing. Audit Day 2 - Field Site Inspections; Laboratory Inspections; Monitoring Data Review; Daily Briefing. Audit Day 3 - Facilities Inspections; Field Site Inspections; Quality Assurance Group; Daily Briefing. Audit Day 4 - Exit Briefing. Activities are marked as either audit-team-only activities or joint audit team and monitoring organization activities.]
Figure 5.1 TSA On-Site Activities Flow Chart
Figure 5.1 illustrates the basic flow of TSA activities that may occur at a small to mid-sized monitoring
organization. The audit begins with a briefing to kick-off the TSA and ends in the same manner to
present the audit findings. All of the functional areas of this example organization are covered in the
TSA, and the order and timing have been determined during pre-audit planning. Note that during this
example TSA, the audit team splits into two teams to efficiently use time and expertise to investigate

these areas. No two audits will be the same, but using the example above as a guide will help promote
consistency across EPA Regional Offices and auditors.
As discussed in Section 4.4 of this QAGD, the auditor must always remain cognizant of the requirements
established in the monitoring organization's QAPPs and SOPs, and judge the organization's performance
based upon the specific quality documents that were in effect during the TSA review period.
5.1 Entrance Briefing
The on-site audit should begin with an entrance briefing to introduce the audit team, and inform the
monitoring organization staff about what to expect during the on-site audit process. This briefing is an
opportunity for audit staff to meet monitoring organization staff and set the stage for the activities that
follow. The entrance briefing is best conducted face-to-face at the beginning of the audit. A face-to-face
meeting promotes participation, engages the participants, and encourages efficiency. The entrance
briefing may be conducted via teleconference prior to arrival on-site if necessary; however, this is not
recommended. The briefing should be short, typically no longer than an hour, and should focus on
overall objectives of the audit. Ideally, the entrance briefing should be the first order of business after
arriving on-site at the organization's central office. The entrance briefing should be conducted with the
organization's air monitoring director or other senior staff designee, monitoring managers, supervisors,
QA staff, and any other monitoring staff who will be engaged with the auditors during the audit. All staff
present should introduce themselves and provide a brief description of their particular role(s) and
responsibilities.
The entrance briefing should set the tone for the audit and serve as an open forum where the audit
team and monitoring organization staff can comfortably ask questions and exchange information. It is
important to discuss the audit itinerary and logistics to ensure that all plans are still viable and that
pertinent staff are available for the field, laboratory, and facilities visits. Contact information should be
shared/confirmed during the entrance briefing as well. If any unforeseen conflicts or problems regarding
the audit schedule have arisen, modifications to the itinerary should be made during the entrance briefing. If the pre-audit activities identified areas where the audit team may want to focus additional
attention, it is important to ensure the appropriate contacts who can discuss those issues will be
available during the audit. The projected time and location for the exit briefing should also be discussed
during this meeting; it is important to schedule the exit briefing so that senior management will have
opportunity to attend. This may not be possible in all cases, but it is a priority in closing out the on-site
phase of the audit. The audit team should make arrangements for a conference call line for the exit
briefing, if necessary.
The lead auditor should discuss the overall TSA process with the organization, especially its purpose,
regulatory basis, and goals. The auditor should remind the monitoring organization staff that a TSA is a
combination of activities intended to assess whether the organization has operated the network in
compliance with regulatory requirements, has a functioning quality system, and as a result, has
produced quality monitoring data that is sufficient for decision-making purposes. The TSA is also an
opportunity for the auditor to offer suggestions for improvements, and potentially assist the

organization in discovering ways to implement needed enhancements. Explaining these objectives at the
beginning of the TSA is especially important for newer monitoring organization staff, for whom the TSA may be their first. The auditor should also explain the different phases and timelines of the TSA process,
including audit preparation, TSA report generation, and post-audit follow-up activities.
5.2 TSA Questionnaire Review
The TSA Questionnaire is an important tool that helps EPA audit staff walk through important checks
during the on-site TSA. The audit team should have received the completed TSA Questionnaire from the
monitoring organization prior to arriving on-site and completed a comprehensive review of the
documented responses. Specific areas of interest, based upon the monitoring organization's responses
(see Section 4.5), should be identified, and the audit team should be prepared to address those
preliminary concerns during the TSA.
The TSA Questionnaire meeting can occur at any time during the audit. Some auditors may choose to
discuss the TSA Questionnaire immediately following the entrance briefing, so that all responses are
clarified at the beginning. The auditor can then use the days that follow to verify those responses. On
the other hand, the auditor may choose to discuss the questionnaire at a later time, after he/she has
had an opportunity to visit some/all of the organization's facilities and/or review records. Regardless of
when this discussion takes place, it is important for the auditor to use the time on-site to verify the
organization's responses. Ideally, this meeting should occur on-site with the organization's monitoring
staff, but a conference call or webinar could be acceptable if resources are not available. Conducting the
meeting at the organization is the best practice because the organization can easily provide
documentation to support their answers on the questionnaire. Face-to-face meetings also facilitate
better interaction between the two parties and tend to be more productive. The meeting may be
conducted with the same personnel who attended the entrance briefing, but should include monitoring
staff who contributed to the responses on the TSA Questionnaire. Conducting this meeting with only the
monitoring manager and/or QA Officer present should be discouraged. While the monitoring manager
and/or QA Officer may know what should be occurring within the program, it is the auditor's role to
verify that the TSA Questionnaire reflects the actual practices in the organization. To develop a more
accurate understanding of the activities in the organization, it is strongly recommended that staff who
perform the specific activities documented in the TSA Questionnaire participate in these discussions.
The best practice would be to work through the TSA Questionnaire by section and meet with only the
monitoring staff who work in those particular areas.
Confirmation of the TSA Questionnaire responses may be determined during the TSA Questionnaire
meeting or during the other parts of the audit such as interviews of staff, observations of procedures, or
visits to the various organizational departments or sections. Given the length of the TSA Questionnaire
and the amount of time needed for the monitoring organization staff to document their responses,
some organizations may check the "yes" or "no" columns without providing detail, or leave sections
completely blank. Similarly, given the amount of information to review during the TSA, the auditor may
inadvertently overlook questions where the organization responded "yes". The auditor should confirm
the responses of the organization during the on-site TSA. The auditor should ask monitoring

organization staff directly about any questions that do not contain supporting detail or have been left
blank. If the organization answered "no" to any questions, the auditor should discuss the responses if
supporting documentation does not justify the "no". For any response on the questionnaire that is not
addressed and/or verified during the TSA, the auditor should clearly document in the audit notes that
he/she did not verify the organization's response(s) for that particular item. The TSA Questionnaire is a
record generated during the audit and may be included with the final TSA report. As such, it should be
accurate in its content.
5.3 Auditing the Monitoring Organization
Each ambient air monitoring organization is unique and will contain different functional areas within the
organization. A functional area may be defined as a group of staff tasked with a specific objective or
responsibility within the monitoring organization. Most of these objectives and responsibilities are
common to all ambient air monitoring organizations; therefore, the areas should be similar even if the
groups use different names. The common groups and their general activities and responsibilities that an
audit team may encounter within an ambient air monitoring organization are:
•	Monitoring Headquarters - Organization management and site operators conduct day-to-day
oversight and operation of the network.
•	Maintenance and Repair Shop - Maintenance and support staff perform routine monitor or site
maintenance activities.
•	Monitoring Data Review and Validation Group - Data reviewers assess monitoring data from
the network.
•	Quality Assurance Team - Auditors and support staff conduct QA activities and manage the
quality system.
•	Analytical Laboratories - Laboratory analysts perform analyses of ambient air samples.
•	AQS Submittal Group - Data handling staff create and upload data transactions to the AQS
database.
Some functions may also be performed within the same group. For example, the Monitoring Data Group
may review monitoring data and also handle AQS submittals. It is important to note that some
organizations house these groups in one centralized headquarters, while other organizations spread
these groups out across different locations, separate from the organization's headquarters. It is
imperative that the audit team have a clear understanding of the organizational structure before
arriving on-site. Accordingly, the audit plan prepared by the lead auditor should have identified the
groups and their locations prior to the on-site visit, so that audit team members know who will go where
and when.

Inspecting the monitoring organization's facilities provides the auditor with a better understanding of
how the organization functions on a day-to-day basis. Monitoring organization staff typically escort
auditors to each area or facility, and while doing so should be encouraged to talk about that particular
area or group and the processes that occur there. Typically, an audit tour includes visits to the
organization's primary office spaces (where key monitoring personnel work), as well as to records
rooms, storage areas (e.g., long-term particulate filter storage), warehouses, sample shipping and
receiving areas, technical repair/maintenance facilities ("shops"), quality assurance facilities
("QA/Standards labs"), and analytical laboratories. When visiting each facility, the auditor should:
•	Determine if there is adequate space to conduct the work
•	Investigate if there is adequate storage to support the operations
•	Identify what equipment is stored in each facility
•	Determine what records are stored on site and ensure the records are tracked and secured.
Wholesale changes of the facility itself may not result from the audit because of cost and competing
priorities within the monitoring organization, but it is important to note any observed deficiencies to
assist in future planning.
Additional monitoring staff may be present during these facility inspections and should be engaged in
the audit discussions, when applicable. The auditor should review each area to ensure that it is
maintained in a manner consistent with its function, as described in the organization's QAPP and SOPs.
While visiting each facility, the auditor should check to see if staff members have ready-access to the
organization's quality documents. Technical discussions with the monitoring staff at each facility can
alert the auditor to potential quality system issues. If the auditor observes an inconsistency in practice
within a particular facility, it could indicate staff are not following their QAPP/SOPs or have not been
properly trained. Similarly, if an auditor asks questions that staff cannot answer, it could also signal a
weakness in the organization's quality system.
The following subsections provide summaries about these general groups that are common among
monitoring organizations. The subsections highlight some important areas of interest for the audit team
within these groups, but do not represent the full scope of what should be audited.
5.3.1 Monitoring Headquarters
The monitoring headquarters is typically the "main office" of the monitoring organization. The
monitoring staff and often the senior management reside at this location. The monitoring headquarters
may contain the offices for the monitoring staff, records rooms, storage areas, computer network
locations, and shipping and receiving areas, among others.
Interviews of the monitoring staff should be conducted at the monitoring headquarters. Staffing levels
are an important component of any monitoring organization and play an important role in the success of
the organization and in the quality of the data. The auditor should examine schedules to see how many
sites are being operated per staff member, which sites are run by whom, and other logistical concerns to

understand how the network is operated. If an organization is stretched too thin, the quality of data will
suffer. The audit team should identify these instances and document them during the audit. Interviews
with the monitoring staff will also give an auditor insight on how well the staff knows SOPs, QAPPs, and
other components of the quality system. The monitoring organization is bound to follow all applicable
CFR regulations. In addition, the QAPP documents the framework and requirements for the quality
system of a monitoring organization, which is contractual in nature between the monitoring
organization and EPA. If, through the interview process, a monitoring staff member is unaware of CFR
regulations or correct procedures or requirements in the QAPP, this illustrates a potential weakness in
the quality system and a potential for non-conformance to these requirements.
A training program and training documentation are commonly located at the monitoring headquarters
and are an important indicator of quality system implementation and organization sustainability. As
regulations and quality documents change, a monitoring organization must train its staff accordingly. A
robust training program is the foundation of a good monitoring organization and this should be
described in the QAPP. The audit team should examine the training curriculum that is required for new
hires and for existing staff to see if it meets the QAPP description and is adequate. There should be an
established curriculum for new hires and a continuous training program for existing staff. There should
also be information that documents the training for all staff members. This is an important aspect of the
monitoring group as a whole and the audit should allow time to focus on this requirement.
The majority of data acquisition activity usually occurs at the monitoring organization's headquarters. The data
acquisition systems employed by a monitoring organization may differ dramatically from organization to
organization. Configurations of these systems are complex and include many details. The configuration
should allow the system to collect data that is required for AQS submittals, as well as allow the QA and
data review groups to assess the raw and QC data. Instrument performance (diagnostic) data may be
included in the configuration to aid in troubleshooting remotely as well. In many cases, data channels
are configured to make automated calculations that should be verified during the audit. Data security is
an important issue and the system must be protected against unauthorized access or manipulation.
Data acquisition systems should always be configured in such a way that there is redundancy in data
capture in case of system failure. Moreover, the entire system should always be archived off-site, to
prevent data loss in the event of a catastrophe. The monitoring data is the culmination of all of the monitoring
work, and its integrity should be protected. An audit team member should have technical knowledge of
data acquisition systems to ensure that the data acquisition system is functioning properly and is
capturing and processing the required data accurately.
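As one concrete way to verify an automated channel calculation, an auditor can recompute an hourly average from the logger's minute data and compare it with the value the system reported. A minimal sketch follows; the file name, column names, and comparison values are assumptions for illustration.

import pandas as pd

# Minute data exported from the data acquisition system (assumed layout).
minutes = pd.read_csv("minute_export.csv", parse_dates=["timestamp"])

# Recompute hourly averages from the minute values.
recomputed = (minutes.set_index("timestamp")["concentration"]
                     .resample("h").mean())

# Compare one recomputed hour against the value reported by the system.
reported_value = 0.042  # as shown by the system, for comparison
hour = pd.Timestamp("2015-06-01 14:00")
print("recomputed:", round(recomputed.loc[hour], 3),
      "reported:", reported_value)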
5.3.2 Maintenance and Repair Shop
The maintenance and repair shop contains the staff and resources to maintain, repair, and store the
monitoring equipment, parts, and standards required to keep the monitoring equipment in good
working order. This shop typically contains workbenches and storage areas that are utilized with relative
frequency to maintain the monitoring equipment. The shop should be well-organized and arranged in a
manner to promote efficiency and quality workmanship. The maintenance and repair shop is an
important part of the organization as a whole since good maintenance and quick repairs minimize

monitoring down-time and promote a higher level of data completeness and data quality. The audit
team should ensure that the space, equipment, and staff are adequate for maintaining the equipment
necessary for supporting the monitoring program.
Documentation is a priority in the maintenance and repair shop. All activities that occur in the shop
must be documented so that there is a record, present and historic, for all equipment involved in the
collection of monitoring data. This includes calibration standards, gas standards, field analyzers,
samplers, and testing equipment. If the shop uses a piece of equipment that produces a measurement,
there should be documentation on file that describes its traceability to a standard of higher authority.
Documentation for these standards is typically a certificate or a test summary that shows the equipment
has been tested and has passed the acceptance criteria. A certification may only be valid for a
prescribed time period. The audit team should investigate equipment records and may choose to copy
documentation of interest as part of the audit file.
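Because certifications lapse, one quick record check is to compare each standard's certification expiration date against the dates on which it was used. The sketch below is a minimal illustration with an invented inventory; real identifiers and dates would come from the shop's records.

from datetime import date

# Invented inventory records for illustration.
standards = [
    {"id": "FTS-01",     "cert_expires": date(2015, 6, 30)},
    {"id": "O3-XFER-02", "cert_expires": date(2016, 1, 15)},
]

use_date = date(2015, 8, 1)  # date the standard was used in the field
expired = [s["id"] for s in standards if s["cert_expires"] < use_date]
print("Standards expired as of", use_date, ":", expired or "none")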
Each piece of field equipment should also have a uniquely identifiable logbook or record that
accompanies it in the shop and into the field. The maintenance, repair, and field staff should make
entries into this logbook to record the maintenance or service that is performed on a monitor. This
logbook serves to inform the site operator and the shop technician of the status and history of the unit
and can also aid in troubleshooting. The logbook should contain records involving general maintenance
and repairs, but could also include calibrations, QC checks, and field observations. The audit team
should review the logbooks to ensure that a proper accounting of the history of the monitor is
documented.
The maintenance and repair shop typically contains spare monitors and storage space for spare
equipment and replacement parts. The audit team should inventory the spare equipment and parts to
determine if there is an adequate stock to support the monitoring network. If the stock is not adequate,
the organization risks lengthy downtimes and poor data completeness. The audit team should also take
note of the age of the equipment and the FRM/FEM status of the equipment. A monitoring organization
should be phasing out older technology and investing in newer technology that is appropriate for the
ambient concentrations observed in the network. For example, if concentrations have fallen to levels
that are at or near the detection levels of older generation monitors, the monitoring organization should
be investing in equipment with higher sensitivity. It is also important to note that older models of
analyzers and samplers may not be supported with service or spare parts by the manufacturer after a
period of time or after a newer model is released. If a monitoring organization operates unsupported
equipment in the network this may result in lost data and possible completeness issues. It is important
for the audit team to promote the use of reliable and sustainable monitoring equipment.
5.3.3 Monitoring Data Review and Validation Group
Review and validation of the monitoring data is a central focus of the ambient air monitoring
organization. Therefore, the audit of data quality must be a central focus of the TSA. A monitoring
organization may have volumes of documentation that should illustrate the quality of the monitoring

data and support that it meets the monitoring requirements. The QA Handbook defines data validation as follows:
"Data validation is a routine process designed to ensure that reported values meet the quality goals of
the environmental data operations. Data validation is further defined as examination and provision of
objective evidence that the particular requirements for a specific intended use are fulfilled."
The Monitoring Data Review and Validation Group, or data review group, is responsible for reviewing
data in accordance with the monitoring organization's QAPP. The group should also have a data review SOP that
describes, in detail, the data review process that is to be followed. This group should contain staff who
are well versed in the QAPP requirements, the data review SOP, and the regulations. They should also
be proficient with software and tools to perform various data assessments. Some data review
responsibilities may be shared by the Quality Assurance Group depending on the organizational
structure or the distribution of responsibilities.
Each monitoring program retains an archive of monitoring data records from its monitoring network.
This data includes raw data, QC data, calibration data, diagnostic data, and other supporting data. The
data collected should be sufficient to assess program MQOs and to document the history of the data
point, thereby demonstrating its quality. The audit team should devote significant resources during the
TSA to monitoring data review and determining its quality and defensibility.
Monitoring records may be electronic or hand-written. Electronic records from data acquisition systems
are typically designed to contain all pertinent records for the user. However, hand-written records, such
as logbooks, that are generated and maintained by the monitoring organization tend to be more
informal. The audit team should review these types of records to ensure that they have the following
characteristics. The records should be:
•	Maintained - Consistent with the organization's QAPPs and SOPs
•	Current - Reflects the current operations of the organization
•	Legible - Clear and readable entries
•	Complete - Contains all pertinent information and meets the minimum requirements with
no data gaps
•	Defensible - Entries made in indelible ink; no white-out or pencil
•	Signed - Contains proper signatures/initials, dates
•	Accessible - Easily retrieved for reference or review
5.3.3.1 Tracking Data History
Tracking a monitoring organization's data is an important component of the audit because it can reveal
systematic issues in the data collection process and quality system, and can identify weaknesses in the data
review process. Because of time limitations, the audit team cannot trace the history of each data point
collected by the monitoring organization; however, the pre-audit activities described in Section 4.4
should have identified a list of data points of interest for investigation. The audit team should

methodically trace the history of these data points to confirm that all steps in the collection process pass
QC checks, calibration requirements, traceability requirements, and collection requirements. For
example, the general flow of tracking the history of an SO2 data point may be:
1)	Monitor is an approved FRM/FEM
2)	Sample residence time has been met
3)	SO2 calibration gases are in certification
4)	The monitor has been calibrated
5)	Calibrator MFC has been calibrated/certified
6)	Electronic data records and strip charts confirm correct calibration
7)	The calibration was documented correctly
8)	Particulate filters in the sample train were changed according to the maintenance schedule
9)	Sample line and probe were cleaned according to the maintenance schedule
10)	QC checks, before and after the collection date, passed acceptance criteria
11)	Electronic data records and strip charts confirm correct QC checks
12)	Logbook shows normal analyzer operation
13)	Raw data from the organization matches the measurement in AQS (were adjustments made
to the data following collection?)
14)	Relevant data codes or flags are represented in the data, if applicable
Depending on the organization, the above steps may be different; however, most of these steps should
be included in a monitoring organization's data collection process. The audit team should be able to
follow the data collection path and examine documentation for each step in the process. The auditor
should document the findings and note the areas where the data history could not be followed or
confirmed.
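One way to keep this tracing consistent across data points is to record each step as an explicit checklist entry, so that unconfirmed steps are documented rather than silently skipped. The sketch below is a minimal illustration; the step names paraphrase the list above and the statuses are invented.

# Audit trail for a single data point; None marks a step that could not
# be confirmed from the available documentation.
trail = {
    "FRM/FEM designation confirmed":          True,
    "Residence time requirement met":         True,
    "Calibration gases within certification": True,
    "Calibration performed and documented":   True,
    "Bracketing QC checks passed":            None,
    "Raw value matches the value in AQS":     True,
}

labels = {True: "confirmed", False: "FAILED", None: "not confirmed"}
for step, status in trail.items():
    print(f"{step}: {labels[status]}")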
5.3.3.2 Data Review Process
The audit team should have already reviewed monitoring data retrieved from AQS and should be
prepared to discuss relevant findings with the data review group. The data review group should have an
approved data review procedure in place and a prescribed timeframe specified to routinely review data.
Different techniques or tools may be used by the monitoring organization to assess the monitoring data
to investigate anomalies, outliers, or other data that may be suspect. Tools such as data acquisition
features, control charts, or spreadsheets may be employed in the organization's review process.
Through the audit planning phase, the auditors should have a grasp of the methodology used by the
data review group.
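As an illustration of the control-chart technique mentioned above, the following sketch derives 3-sigma control limits from historical one-point QC check percent differences and flags a new result outside those limits; the baseline values are invented for demonstration.

```python
# A minimal control-chart sketch, assuming QC check results are expressed
# as percent differences; the baseline values below are invented.
import statistics

def control_limits(baseline, k=3.0):
    """Return (mean, lower, upper) k-sigma control limits from baseline data."""
    mean = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return mean, mean - k * sd, mean + k * sd

baseline_pct_diff = [0.4, -0.2, 0.1, 0.6, -0.5, 0.3, -0.1, 0.2, 0.0, -0.3]
mean, lower, upper = control_limits(baseline_pct_diff)

new_check = 7.3  # latest one-point QC check result (percent difference)
if not lower <= new_check <= upper:
    print(f"QC check {new_check}% is outside control limits "
          f"({lower:.2f}%, {upper:.2f}%) -- investigate and document")
```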
The audit team should look for evidence on-site that the organization has reviewed and validated data
in accordance with its QAPPs and SOPs. Common evidence to look for during an audit are hardcopy data
review packages with signatures and dates, electronic spreadsheets with electronic signatures and
dates, or functionality within a data acquisition system that can be signed electronically. Data validation
will always require supporting documentation to detail and justify a change or to support the data as it
was recorded. Common supporting documentation that may be reviewed include:

•	Minute data (electronic or paper strip chart records)
-	Helps illustrate data validity and graphically shows issues that hourly averages will mask
-	Shows the stability of QA/QC checks
-	Can confirm concentrations
•	Calibration records or spreadsheets
•	Logbook entries
•	Maintenance records or spreadsheets
•	QC check records or spreadsheets
•	Performance audit reports
•	Corrective action reports
The audit team should review the supporting documentation to determine if the data has been handled appropriately. Is there enough detail in the records to easily recreate events and justify null value codes or data qualifiers? If the supporting documentation does not accurately detail the timeline of data collection, or if it does not exist, the defensibility or validity of the data may be compromised.
Other data may also be reviewed as discussed in Section 4.4.3 of this QAGD, including PM2.5 QC data
(e.g. lab blank data, room temperature and humidity data, balance checks), analytical laboratory data
(e.g. CCV, calibration curves, media blanks), and monitoring shelter temperature data. These data may
be reviewed by the data review group or a different group within the organization. Often, laboratory
data is reviewed by a separate group with specialized processes and expertise. It is important for the auditor to determine who is responsible for reviewing all the data collected in a monitoring organization and to include that review in the audit.
5.3.4 Quality Assurance Group
Each monitoring organization should have an independent group that is responsible for ensuring that
the quality system defined in the monitoring organization's QAPP is implemented and is functioning
correctly. A core concept in any quality system is to continually assess a program or operation to
improve the process and ensure that requirements are consistently being met. This group assesses the
program through a variety of quality assurance and quality control activities, including quality document
development (QMP, QAPPs, SOPs), internal systems audits, performance audits, corrective action
assessment, and QC data review. The audit team must ensure that the QA Group is functioning
effectively and is enforcing the prescribed requirements in the monitoring organization's QAPP.
One of the most important aspects of the QA group that the audit team must investigate is its
independence. 40 CFR Part 58, Appendix A, § 2.2 discusses the role of the QA group and its
independence in the following way:
"The quality assurance management function must have sufficient technical expertise and management
authority to conduct independent oversight and assure the implementation of the organization's quality
system relative to the ambient air quality monitoring program and should be organizationally
independent of environmental data generation activities."
The quality assurance group should be independent so that it may oversee the quality system and make
decisions without bias. The audit team should ensure that the quality assurance staff, or those who
conduct quality assurance activities, are not the same staff who are engaged in the day-to-day
monitoring or sampling activities.
One of the QA group's responsibilities is the development of the quality documents including the QMP,
QAPP, and SOPs. All of these documents typically originate in the QA group and should be maintained by
this group as well. Prior to the audit, the audit team should have reviewed these documents in
preparation for the on-site audit. The approved documents should be on file with the QA group. During
the audit, the auditors should determine if the quality documents are up-to-date with the current
equipment and procedures. These documents should also be controlled, meaning that the documents
distributed throughout the network are the current versions and not editable. Outdated versions of the
quality documents that are found in the organization can result in data that does not meet the current
QC requirements and can cause confusion or inconsistency in management processes or equipment
operation. For example, a regulation change results in a QC check for a pollutant being changed from 10
ppb to 5 ppb, and the organization's QAPP and related SOPs are revised to reflect the change. If the old
QAPPs and SOPs are not removed and replaced with the new documents, there is a risk that operators
in the field will continue using the old check requirements, which could put data at risk of invalidation or
flagging. If the quality documents are out of date, not being followed, or do not exist, the audit team
must investigate why. If the quality documents are neglected, the auditors should investigate further to
determine whether adequate resources have been given to the QA group or if the necessary expertise is
missing.
A primary role of the QA group is to perform audits of the monitoring organization. Annual performance
audits and internal systems audits (ISA) are the two most common types of audits that are typically
conducted by the QA group. Direction for performing the annual performance audits is provided in 40
CFR Part 58, Appendix A, § 3.1.2. ISAs are not specifically required by the CFR, but they should be included in
the QAPP as part of a robust quality system. In assessing the annual performance audits, the audit team
would have observed the number and results of the audits in AQS during the planning phase of the TSA.
While on-site, the audit team should review the annual performance audit documentation and the
procedures that were followed. The documentation should match what was found in AQS, and the
methodology for the audit must be consistent with the CFR direction and the monitoring organization's
QAPP. If ISAs are conducted by the QA group, they should be reviewed during the audit. These audits
vary in scope, but as a best practice, they should cover the majority of the ambient air monitoring
program of the monitoring organization. The ISAs are useful in providing the audit team with an overview of how the monitoring organization and its quality system function and how well the organization implements corrective action. The audit team should investigate the ISA procedures and
reports and identify shortcomings and recommend improvements.

Another important responsibility of the QA group is corrective action oversight. An ineffective corrective
action process is an indicator of a weak quality system and may result in poor data quality. Findings in
audits, failed QC checks, or lost data, for example, should trigger corrective action. The QA group is
responsible for developing a corrective action program and ensuring that corrective action has been
implemented and has been successful in correcting problems that were identified. The audit team
should investigate first-hand how this process functions on-site. Documentation should exist that
describes the problem, shows action taken, and provides evidence that the action has been successful.
The auditors should spend enough time with the corrective action process to ensure that the process is
functioning effectively. If there is no, or limited, corrective action, the same deficiencies will continue
long-term and affect data quality.
The QA group may also have the responsibility of reviewing data from the monitoring network. Data
review may be completed by several groups in different ways, but the QA group is responsible for
monitoring QC checks and assessing their impact on the routine monitoring data. These reviews may be
completed using spreadsheets, control charts, or other means. Minimally, the QC data should be
reviewed monthly, but weekly or even daily is recommended to reduce the amount of data that may be
affected by a quality issue. The audit team should ensure that the QC review is occurring in a timely
manner. Also, the audit team should ensure that the correct data qualification or data invalidation
decisions are being made.
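As one illustration of a timeliness check, the sketch below flags gaps between one-point QC checks that exceed the two-week frequency required for most gaseous monitors by 40 CFR Part 58, Appendix A; the dates are hypothetical.

```python
# Hedged sketch for verifying QC check timeliness; 40 CFR Part 58,
# Appendix A requires one-point QC checks at least once every two weeks
# for most gaseous monitors. The dates below are hypothetical.
from datetime import date

qc_check_dates = [date(2017, 1, 3), date(2017, 1, 16), date(2017, 2, 14)]

for earlier, later in zip(qc_check_dates, qc_check_dates[1:]):
    gap_days = (later - earlier).days
    if gap_days > 14:
        print(f"{gap_days}-day gap between {earlier} and {later} "
              f"exceeds the two-week QC check frequency")
```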
5.3.5 Analytical Laboratories
An analytical laboratory audit can be a complex and in-depth process. There may be numerous analytical
methods employed by a laboratory to provide analytical support for the different air pollutants. This
section offers general guidance for the NAAQS and other monitoring programs that may be encountered
during the TSA.
During the pre-audit phase, the audit team identified the laboratories that exist in an organization and
developed a plan to audit them. There are a variety of analytical methods and analytical equipment that
may be used in a laboratory. Because of the complexity of laboratory audits, it is essential that the audit
team contain at least one member with technical knowledge and expertise in the specific laboratory
operation/method under review. When auditing each of the methods, the audit team should engage
with the analyst as they perform the analytical method, observing and questioning the analyst
throughout the process. It is also important to listen to the analyst as he/she describes the process to
determine if what is being done in the laboratory corresponds with the method and the SOP. If the audit
team lacks expertise in analytical laboratories, it may consider hiring a contractor to conduct the
laboratory portion of the audit. If the use of a contractor is necessary, the audit team should observe
the contractor during the audit with the intent of gaining the expertise necessary to conduct the
laboratory audit in the future.
It is critical that the auditor ensure that the laboratory is, and has been, following the applicable CFR
federal reference and/or equivalent methods required for the analyses. However, some air analytical
methods, such as those used to analyze air toxics pollutants, have no CFR requirements and are guided
by a Technical Assistance Document (TAD). In either case, the monitoring organization must commit to
following an applicable method of analysis in their QAPP for each analyte. The language in the QAPP
binds the monitoring organization to the referenced method. The monitoring organization may also
have specific language in their grant requirements, and those stipulations must be referenced and
adhered to as well. The auditor must be aware of these commitments and verify they are being met
during the TSA.
EPA typically develops guidance documents to assist the monitoring organization in the implementation
and quality assurance of a specific method. It is strongly recommended and is considered best practice
to follow the guidance. Regardless of guidance, the CFR federal reference methods must always be
followed and are not negotiable.
Laboratories within an ambient air monitoring organization that may be audited during the TSA include:
•	PM2.5 and/or Low-Volume PM10 Gravimetric Laboratories
•	Hi-Volume PM10 Gravimetric Laboratory
•	Lead (Pb) Laboratory
•	PM2.5 Chemical Speciation Laboratories
•	Air Toxics Laboratories
Table 5.1, Pollutants and Corresponding Reference Methods and Guidance Documents, lists the
pollutant/analyte and reference information regarding the specified method and EPA guidance that has
been developed to support the reference method. It is strongly recommended that the audit team
member charged with investigating the laboratory(ies) be familiar with these methods and guidance and
have some experience in a laboratory environment.
Table 5.1 Pollutants and Corresponding Reference Methods and Guidance Documents
Pollutant | CFR Reference Method1 | Guidance Document | Guidance Document Web Link
PM2.5 mass | 40 CFR Part 50, Appendix L | QAGD 2.12 | https://www3.epa.gov/ttn/amtic/files/ambient/pm25/qa/m212.pdf
PM10 (low-volume) | 40 CFR Part 50, Appendix L | QAGD 2.12 | https://www3.epa.gov/ttn/amtic/files/ambient/pm25/qa/m212.pdf
PM10 (high-volume) | 40 CFR Part 50, Appendix J | QAGD 2.11 | https://www3.epa.gov/ttn/amtic/files/ambient/qaqc/m2.11.pdf
Pb (low-volume) | 40 CFR Part 50, Appendix Q | Generic ICP-MS FEM SOP, Pb in PM10 | https://www3.epa.gov/ttn/amtic/files/ambient/pb/EQL-0512-202.pdf
Pb (high-volume) | 40 CFR Part 50, Appendix G | Generic ICP-MS FEM SOP, Pb in TSP | https://www3.epa.gov/ttn/amtic/files/ambient/pb/EQL-0512-201.pdf
PM2.5 Chemical Speciation | None | Laboratory Standard Operating Procedures (SOPs) | https://www3.epa.gov/ttn/amtic/specsop.html
PAMS | None | Under development | Under development
Air Toxics | None | Technical Assistance Document for the National Air Toxics Trends Stations Program | https://www3.epa.gov/ttn/amtic/files/ambient/airtox/nattsTADRevision2_508Compliant.pdf
VOC | None | TO-15, Determination of VOCs in Air Collected in Specially-Prepared Canisters and Analyzed by Gas Chromatography/Mass Spectrometry (GC/MS) | https://www3.epa.gov/ttn/amtic/files/ambient/airtox/to-15r.pdf
Carbonyls | None | TO-11A, Determination of Formaldehyde in Ambient Air Using Adsorbent Cartridge Followed by High Performance Liquid Chromatography (HPLC) [Active Sampling Methodology] | https://www3.epa.gov/ttn/amtic/files/ambient/airtox/to-11ar.pdf
Metals | None | IO-3.5, Determination of Metals in Ambient Particulate Matter Using Inductively Coupled Plasma/Mass Spectrometry (ICP/MS) | https://www3.epa.gov/ttn/amtic/files/ambient/inorganic/mthd-3-5.pdf
PAH | None | TO-13A, Determination of Polycyclic Aromatic Hydrocarbons (PAHs) in Ambient Air Using Gas Chromatography/Mass Spectrometry (GC/MS) | https://www3.epa.gov/ttn/amtic/files/ambient/airtox/to-13arr.pdf
1 - CFR methods may be found at www.ecfr.gov
The following sections summarize, in limited detail, each of these laboratories. The intent of the
summaries is to provide the auditor with a general overview of a laboratory's function, the types of
equipment that may be observed, and important points of interest in the laboratory.
Every sample that is collected and transported requires a chain-of-custody. The chain-of-custody provides the legal documentation of the life of the sample, establishing a trail that tracks its custody from collection through analysis. For example, a sample may be collected in the field by an operator, brought
to a shipper, received at a laboratory, and finally analyzed at a laboratory. The chain-of-custody tracks
and documents this process. Typical information required on a chain-of-custody includes:

•	Sample collection location
•	Sample collection date and time
•	Sample matrix/media
•	Sampler/Site operator
•	Sampler/Site operator contact information
•	Sample relinquished to contact, date and time
•	Laboratory name
•	Laboratory contact information
•	Sample analysis requested
•	Sample condition
•	Additional instructions
•	Sample returned to laboratory contact, date and time
For each laboratory, the audit team should pull several chain-of-custodies at random to determine if:
•	they are completed in accordance with the organization's QAPPs and SOPs;
•	they are complete (e.g. signatures, times);
•	they adequately track sample custody;
•	they are legible;
•	they are completed in indelible ink;
•	strikethroughs are initialed and dated with the correct entry; and,
•	they are easily accessible.
Chain-of-custodies that do not meet the criteria above, at a minimum, place the samples' integrity and
defensibility at risk. The audit team should carefully examine these documents to ensure that they can
stand up to legal scrutiny if necessary.
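Where chain-of-custody records are kept electronically, a simple completeness screen can supplement the manual review. The following is a minimal sketch under that assumption; the field names are hypothetical and would need to mirror the organization's actual COC form.

```python
# Minimal completeness check for chain-of-custody records; the field
# names are hypothetical placeholders for the organization's COC form.
REQUIRED_FIELDS = [
    "collection_location", "collection_datetime", "matrix",
    "operator", "operator_contact", "relinquished_to",
    "lab_name", "lab_contact", "analysis_requested", "sample_condition",
]

def missing_fields(coc):
    """Return required fields that are absent or blank on a COC record."""
    return [field for field in REQUIRED_FIELDS if not coc.get(field)]

example_coc = {"collection_location": "Site 42", "operator": "J. Doe"}
print(missing_fields(example_coc))  # all other required fields flagged
```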
5.3.5.1 PM2.5/Low-Volume PM10 Gravimetric Laboratory
The PM2.5 gravimetric laboratory includes the necessary space for an analyst(s) to receive, inspect,
equilibrate, weigh, and ship low-volume Teflon® filters. 40 CFR Part 50, Appendix L contains the federal
reference method that the PM2.5 weighing laboratory must follow, and the Quality Assurance Guidance
Document 2.12 is the technical guidance for the method. The technical memorandum "Clarification on
Use of PM2.5 Field and Laboratory Requirements for Low Volume PM10 Monitoring to Support PM10
NAAQS8" requires all low-volume PM10 laboratories supporting NAAQS sites to follow 40 CFR Part 50,
Appendix L as well.
Generally, work in a low-volume PM laboratory is divided between two phases: pre-sampling and post-
sampling. Pre-sampling includes the activities that precede the filter being shipped to the field and being
placed in the sampler, while the post-sampling activities begin after the filter is received from the field.
The following is a brief description of both phases of activity.
8 https://www3.epa.gov/ttn/amtic/files/policy/pm10-low-vol.pdf

Pre-sampling:
1.	Filters are received from EPA and examined for integrity.
2.	Filters are logged into the laboratory data management system.
3.	Filters are equilibrated and pre-weighed according to SOPs.
4.	Filters are loaded into sampling cassettes and shipped to field offices along with accompanying
COC requirements and documentation.
Post-sampling:
1.	Filters are received in the laboratory facility, checked for integrity, and logged in.
2.	Filters are kept in cold storage until ready for weighing.
3.	When ready for weighing, filters are brought into the weighing facility and equilibrated for at
least 24 hours.
4.	Filters are post-weighed according to SOPs, and data are entered into the data management
system where a concentration is calculated.
5.	After weighing, filters are stored in archive for 5 years, with the first year under cold storage.
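As a worked illustration of the concentration calculation in post-sampling step 4, the following sketch assumes a nominal FRM design flow of 16.67 L/min over a 24-hour sample; the filter weights are invented.

```python
# Worked example of the post-weighing concentration calculation,
# assuming a nominal FRM design flow of 16.67 L/min for 24 hours.
pre_weight_mg = 148.212    # pre-sampling (tare) weight, hypothetical
post_weight_mg = 148.476   # post-sampling (gross) weight, hypothetical

net_mass_ug = (post_weight_mg - pre_weight_mg) * 1000.0  # mg -> ug
volume_m3 = 16.67 * 24 * 60 / 1000.0                     # L -> m3 (~24.0)
concentration = net_mass_ug / volume_m3
print(f"{concentration:.1f} ug/m3")                      # ~11.0 ug/m3
```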
Because of the small amounts of particulate collected on the filters, the laboratory should be
constructed and maintained as a "semi-clean room". This means that the laboratory operates with many
protocols generally seen in clean rooms, but some clean room protocols are relaxed. These laboratories
will contain microbalances that are capable of weighing to the microgram, which require specialized
equipment and conditions. One of the most important requirements of a gravimetric laboratory is
climate control. To weigh the small amounts of particulate collected on the filters, the effects of relative
humidity and temperature on the filters and the weighing equipment must be controlled. In the same
manner, static electricity will affect the filter weights on the balances and must be controlled. Quality
control procedures and weighing equipment may require certifications or calibrations at various points
throughout the year. Because of these QC steps, equipment, and requirements in the filter weighing
method, the gravimetric laboratory will have a large amount of documentation demonstrating that the lab is, and has been, following the method. The audit team should plan enough time to track the data through the
laboratory using these records.
In auditing this lab, the audit team should select staff members who have expertise in low-volume PM
labs. 40 CFR Part 50, Appendix L, the 2.12 guidance document, and Appendix C of this QAGD provide
more detail regarding the method and illustrate the entire scope of the laboratory operations.
5.3.5.2 Hi-Volume PM10 Gravimetric Laboratory
The high-volume PM10 gravimetric laboratory resembles the low-volume PM2.5 laboratory, but many of
the lab design parameters and acceptance criteria are less stringent than the "semi-clean room"
requirements specified for PM2.5. As in low-volume weighing labs, the laboratory provides space for a
gravimetric analyst(s) to receive, inspect, equilibrate, weigh, and ship high-volume filters. These filters
may be glass or quartz fiber. 40 CFR Part 50, Appendix J contains the federal reference method for high-volume PM10 filter weighing, and the Quality Assurance Guidance Document 2.11 supports the method.
Work in a high-volume PM10 laboratory is divided into pre-sampling and post-sampling activities as in
low-volume PM labs. The following is a brief description of both phases of activity.
Pre-sampling:
1.	Filters are received from EPA and examined for integrity.
2.	Filters are logged into the laboratory data management system.
3.	Filters are equilibrated and pre-weighed according to SOPs.
4.	Filters are loaded into envelopes or sample transport modules and shipped to field offices along
with accompanying COC requirements and documentation.
Post-sampling:
1.	Filters are received in the laboratory facility, checked for integrity, and logged in.
2.	Filters are placed into a conditioning chamber until ready for weighing; equilibration is for a
minimum of 24 hours.
3.	Filters are post-weighed according to SOPs, and data are entered into the data entry system,
where a concentration is calculated.
4.	After weighing, filters are stored in archive.
The high-volume PM10 laboratory includes all areas and equipment needed to complete the steps above
according to regulations and guidance. The laboratory should be clean and well organized. Climate
control tolerances are less stringent in the high-volume PM10 laboratory; however, temperature and
relative humidity are still controlled so that the particulate collected on the filters can be measured
without the influence of water vapor. Static electricity could affect the filter weights on the balances,
but to a lesser degree than in low-volume gravimetric laboratories. High-volume glass fiber or quartz
filters are delicate in nature and lack the filter-support rings that are common on low-volume Teflon
filters. Additionally, the analyst typically handles these larger filters with gloved hands, as opposed to
using Teflon forceps. The analyst should exercise care in handling these filters in order to avoid loss of
particulate; auditors should make efforts to observe the filter handling techniques of the analyst during
the TSA. Quality control procedures and weighing equipment require certifications or calibrations at
various points throughout the year. Because of these QC steps, equipment, and requirements in the
filter weighing method, the gravimetric laboratory will have a large amount of documentation demonstrating that the lab is, and has been, following the method. The audit team should plan enough time to review these
records.
As a best practice, the PM10 high-volume and low-volume gravimetric laboratories should not occupy the same space. However, they may share space if necessary, provided care is exercised to limit the potential for contamination. If both labs share the same space, it is highly recommended that each activity - high and
low-volume filter weighing - be conducted in different areas of the weighing room and, ideally, on
different days. Sharing of equipment should be discouraged.
In auditing this lab, the audit team should select staff members who have expertise in high-volume PM
labs. 40 CFR Part 50, Appendix J and the 2.11 guidance document contain more detail regarding the
method and provide the entire scope of the laboratory operations.
5.3.5.3 Lead (Pb) Analysis Laboratory
The Pb analysis laboratory provides the space and equipment for an analyst(s) to receive and analyze filters used for Pb sampling in the monitoring network. Pb can be sampled using two different methods: high-volume or low-volume sampling. High-volume sampling requires the use of glass fiber filters, and low-volume sampling requires the use of Teflon® filters. An FRM analytical method is
available for each collection method (high and low-volume) and several FEM analytical methods are
available for use as well. If an organization chooses to develop an FEM, then ORD must approve the
method before use. A listing of all approved FEM Pb methods is located on AMTIC (https://www3.epa.gov/ttn/amtic/files/ambient/criteria/AMTIC%20List%20Dec%202016-2.pdf). The auditors should
already know which method is being used through the audit preparation process and should verify its
use and correct implementation in the lab. The low-volume Pb FRM is located in 40 CFR Part 50,
Appendix Q, and the high-volume FRM is located in 40 CFR Part 50, Appendix G. There are generic ICP-MS FEM SOPs provided as guidance for both Pb in TSP and Pb in PM10. Links to these SOPs can be found
in Table 5.1.
All Pb methods typically have a filter digestion and digestate analysis component. However, there are
many ways to conduct these processes and many instruments that may be involved in the analysis. For
this reason, it is of utmost importance to have an audit team member who is experienced in a
laboratory environment and who understands multiple analytical methods and types of
instrumentation.
The Pb analysis laboratory contains instrumentation, standards, data collection systems, and
preparatory areas to perform the analytical method chosen by the monitoring organization. Regardless
of method, best laboratory practices should be used at all times during the preparation and analysis of
the filters. All methods will have QC steps or checks that the auditor must review, and the QC checks
may be different depending on the method. The auditor should consult the FRM/FEM method and the
QAPP to determine which QC checks should be investigated. The instrumentation used to analyze Pb
samples, such as an ICP-MS, generates a large amount of data including calibration curves, QC checks,
and chromatograms. Again, it is important for the audit team to include a chemist who understands
these types of data packages and can assess this information quickly.
Appendix D of this QAGD contains a Laboratory Systems Review checklist for Pb in Air developed to aid
the auditor in performing an audit of a Pb laboratory. As described above, Pb laboratories have the
option to conduct analyses using the FRM, an FEM, or develop and implement their own FEM through
ORD's approval process. The differences in each of these methods can make Pb laboratory audits
difficult since it would require an auditor to have knowledge of all published methods. The checklist
does not include specific QC criteria or detailed procedural steps that will be unique to each method.
However, commonalities exist in all of the methods, and laboratory best practices are consistent
throughout the methods. The checklist provides a compilation of these method consistencies that the
auditor can use to assist in auditing the laboratory. To use the checklist effectively, the auditor must be
familiar with the requirements of the specific FRM or FEM that is currently in use in the laboratory and
supplement the checklist with its specific method criteria. Using the checklist in conjunction with the
laboratory's approved analytical method and quality system requirements will assist the auditor in
effectively auditing the Pb laboratory to ensure it is operating effectively and producing quality data.
5.3.5.4 PM2.5 Chemical Speciation Laboratories
PM2.5 chemical speciation laboratories generally support the PM2.5 chemical speciation network and
analyze for an array of cations, anions, carbon species, trace elements, and semi-volatile organic
particles. These analytes are collected on different types of media and analyzed using different methods
such as ion chromatography, X-ray fluorescence, and gas chromatography/mass spectrometry. Most
monitoring organizations use the PM2.5 CSN contract laboratories; however, a few monitoring
organizations have chosen to conduct these analyses in-house. The PM2.5 chemical speciation
laboratories are audited every three years by OAQPS (or its contractors) as part of the national QA
oversight responsibilities. Regional audit teams are encouraged to audit the PM2.5 chemical speciation
labs during the TSA if the audit team has the resources and expertise. Audit materials that are used
nationally are available through OAQPS if the audit team chooses to include this lab. Also, audit team
members are encouraged to accompany the national auditors if possible.
Many laboratories that support the PM2.5 CSN network have posted SOPs online that monitoring
organizations may use as reference or as an audit template. There are no FRM or FEM methods
specified for PM2.5 chemical speciation laboratories. However, monitoring organizations that choose to
perform the analyses in-house must have their own approved SOPs and QAPP for their specific program.
The audit team should use these documents to conduct the TSA. As suggested for other laboratories that employ complex analytical methods, the audit team should include a team member who understands the methods, best laboratory practices, and specific QC requirements, and who can review the data from the various instrumentation used in the analysis.

5.3.5.5 Air Toxics Laboratories
Air toxics laboratories support the NATTS, PAMS, UATMP, and community scale studies. Air toxics
monitoring typically involves analysis for VOCs, carbonyls, PAHs, and metals. VOCs and carbonyls are the
only compounds required in the PAMS program that may require an analytical laboratory. The methods for air toxics sampling and analysis are dictated in the NATTS TAD, which is based on the EPA TO and IO methods. NATTS labs are required to follow the TAD, while other toxics programs are encouraged
to comply with the NATTS TAD. These analytical methods are listed above in Table 5.1.
Air toxics laboratories that support the NATTS and PAMS programs are audited by OAQPS (or its
contractors) as part of national QA oversight. The air toxics TSAs typically do not occur at the same time
as the regional TSAs of the PQAOs. Regional auditors are invited to participate in these audits if
resources allow. During years when audits are not performed by OAQPS (or its contractors), regional
audit teams are encouraged to include these labs in their TSAs if possible. Air toxics labs that do not
support the NATTS program are not audited through the NATTS program. In the audit planning activities,
the audit team should have identified if the toxics laboratory is part of the national audit program or if it
needs to be included in the TSA.
NATTS laboratories are required by the NATTS program to have EPA-approved QAPPs and SOPs that are
based on the NATTS TAD and its included methods. The audit team should be aware that the NATTS TAD
is a performance-based document, but it has some specific requirements that must be followed. For the
performance requirements, the auditor should be prepared to review data and processes that verify
that the laboratory is meeting the requirements. Because of the complexity of the methods, it is essential that the audit team include a skilled chemist who is well acquainted with the NATTS TAD and its QC requirements and who understands the concept of performance-based requirements.
5.3.6 AQS Submittal Group
Every monitoring organization must submit its ambient monitoring data to the AQS. Ambient monitoring
data includes:
•	Monitoring data (measurements)
•	Quality control data
•	Site metadata
•	Network metadata
Each monitoring organization typically has a group dedicated to generating and uploading the data
transactions to AQS. This process can be relatively complicated and the staff should have the necessary
skills and training to accomplish this task. Pre-audit review should have verified that the monitoring data
is being submitted completely and in a timely manner. Additionally, the audit team should have
reviewed the data to ensure that it has been coded or flagged appropriately.

Along with the measurements from the analyzers and samplers, site and network metadata must be
submitted and maintained in AQS. This information includes information from the sites themselves and
the equipment operating at the sites. AQS requires the use of codes and other specific AQS parameters
for the data. Some examples of this metadata are:
•	Network affiliation
•	Parameter codes
•	Monitor types
•	Unit codes
•	Ozone seasons
•	Site addresses
•	Site names
•	GPS coordinates
•	Collection frequencies
•	NAAQS exclusion codes
This monitoring metadata is used for various types of assessments, and the audit team should verify that this data is correct in AQS. Issues discovered during the planning phase of the audit should be
discussed during the visit to the AQS Data Submittal Group. While some of the metadata review can
take place during the audit planning phase, the actual verification of equipment must occur on-site.
Reviewing AQS codes and site configurations can be an arduous task and should be completed by an audit team member who possesses knowledge of and experience using AQS.
The audit team should also ensure that the AQS submittal staff has the appropriate supporting materials
and information from the monitoring and QA staff to properly code the data for upload to the AQS. The
AQS staff should have records that corroborate data qualifiers and null codes, and the audit team should
review these records to ensure that the coding is correct. Additionally, the audit team should compare
the monitoring organization's records to the AQS reports to ensure consistency. It is essential that the
data and any qualifiers in AQS accurately reflect the supporting documentation on-site.
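Much of this consistency review reduces to a field-by-field comparison between the organization's records and the AQS reports. The following is a minimal sketch of such a comparison; the keys and values are illustrative and do not represent the AQS schema.

```python
# Hedged sketch comparing an organization's site records against an AQS
# report; keys and values are illustrative, not the AQS schema.
site_records = {"parameter_code": "44201", "collection_frequency": "hourly",
                "latitude": 35.8932}
aqs_report = {"parameter_code": "44201", "collection_frequency": "daily",
              "latitude": 35.8932}

for key, local_value in site_records.items():
    aqs_value = aqs_report.get(key)
    if local_value != aqs_value:
        print(f"Mismatch for {key}: site records={local_value!r}, AQS={aqs_value!r}")
```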
5.4 Standards Certification Records Review
In support of the network's quality system, monitoring organizations use various types of precision
equipment and standards to perform QC checks, calibrations, and certifications. Each piece of
equipment or standard used in the network must have certification documentation that demonstrates
its traceability back to a reference standard of higher authority, such as a NIST standard or a primary standard.
The equipment used in the network must compare to the reference standard within acceptance criteria
documented in the QAPP and/or CFR. The monitoring equipment and standards include, but are not
limited to, gaseous standards, photometers, calibrators, MFCs, flow standards, manometers, orifices,
and other equipment. Each of these devices must have specific certification documentation that the
audit team should examine.

The audit team should review certification records for all standards in use by the organization during the
3-year audit period. Each standard used in the network should be traceable, and the standard must have
been in certification throughout the three-year time period. If the organization has an in-house
certification program in the maintenance and repair shop, the audit team should ensure that this
equipment has also been calibrated against standards of higher authority or accuracy. For example, a
flow meter used in the field for QC checks with an accuracy of 0.3 LPM should be calibrated or certified
using a certified device with higher accuracy, such as 0.1 LPM. For small organizations, the audit team
may be able to review all standards certifications, while only a representative subset may be examined
for a large organization. For large organizations with district or local offices, the audit team should
request documents from these locations during the on-site audit.
The audit team should verify that the certification records contain the following information:
•	Dates and environmental conditions at the time of certification
•	Device receipt condition (in tolerance/out of tolerance)
•	Device returned condition (in tolerance/unable to meet tolerances)
•	Statement of traceability
•	Standard(s) used to conduct certification
•	Calibration method performed
•	Calibration interval/expiration
•	Contact information of certifying authority
Typically, the information above is standard in a vendor certification; however, in-house certification
program certificates should be closely scrutinized to ensure this information is documented. If the
information listed above is not included in the certificates, the audit team should note these findings.
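Where certification records are available electronically, a coverage check can help confirm that each standard remained in certification throughout the audit window. The following is a minimal sketch under that assumption; the record structure and dates are hypothetical.

```python
# Sketch verifying a standard was in certification for the full 3-year
# audit window; the record structure and dates are hypothetical.
from datetime import date

AUDIT_START, AUDIT_END = date(2014, 1, 1), date(2016, 12, 31)

certs = [  # (standard, certified_on, expires_on) -- example data
    ("Flow standard #12", date(2013, 11, 2), date(2014, 11, 2)),
    ("Flow standard #12", date(2015, 3, 18), date(2016, 3, 18)),
]

def coverage_gaps(records):
    """Yield uncovered spans between consecutive certification periods."""
    spans = sorted((start, end) for _, start, end in records)
    cursor = AUDIT_START
    for start, end in spans:
        if start > cursor:
            yield cursor, start
        cursor = max(cursor, end)
    if cursor < AUDIT_END:
        yield cursor, AUDIT_END

for gap_start, gap_end in coverage_gaps(certs):
    print(f"Not in certification: {gap_start} to {gap_end}")
```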
5.5 Monitoring Site Inspections
Monitoring data collection begins in the field. Therefore, data quality depends on good field practices.
During a TSA, it is essential that the audit team visits and inspects the organization's air monitoring
stations. While on site, the auditor should inspect both the exterior and interior of the monitoring
station. Routine operations, site housekeeping, safety, quality control activities, documentation, and
siting criteria are areas that should be reviewed.
Safety is always the top concern when auditing monitoring sites. Two auditors are recommended to
efficiently and safely inspect monitoring sites. Using a second auditor to record measurements and
identify potential hazards is the best practice when the inspection requires an auditor to work on a
rooftop or ladder taking measurements, tracing sampling lines, or other activities where safety may be a
concern. Also, no two auditors have the same level of expertise on all aspects of monitoring; therefore,
a combination of auditors can conduct a more thorough and complete audit.

The size of the network and resources allotted for the TSA may dictate how many monitoring sites can
be visited. While it may be possible to visit all monitoring sites in a small organization, it may not be
possible to visit all sites in a larger organization. The audit team should, at a minimum, select enough
monitoring sites to observe all sampler types that are operated in an organization, and select sites that
are maintained by different site operators and/or via different management structures. This approach
will help the auditor to determine if the standard operating procedures are being consistently followed
and to acquire general information on the maintenance of the monitoring sites from operator to
operator.
Accurate data is contingent upon instruments (monitors, samplers) that are properly calibrated,
operated, and maintained by a trained and conscientious site operator. The site operator is responsible
for the collection and handling of manual air samples (particulates, toxics) and the operation of
continuous air monitors. The site operator is also responsible for conducting routine QC checks,
conducting varying degrees of maintenance, and implementing corrective action. Proper documentation
of these activities and the resulting data is essential. Therefore, the audit team should reserve time on-
site to interact directly with the site operator to discuss the above activities, as well as to inspect the
field sites, equipment, and available documentation. The audit team should also attempt to observe the
site operator's sample handling techniques and any other routine activities at the monitoring site. One-
on-one interaction with the site operator is desired during these activities, without involvement
from supervisors or other management, to encourage candid feedback.
The organization's QAPPs and SOPs should detail field procedures and specify the acceptance criteria for
instrument QC checks. Section 4 discusses the importance of reviewing the organization's QAPPs and
SOPs prior to conducting the TSA. Reviewing these documents prior to the on-site audit will alert the
auditor to any potential issues that may need to be inspected or clarified during the on-site TSA. In the
absence of an in-depth review of the quality documents, technical discussions with the site operator(s)
and review of documents and data in the field can alert the auditor to potential quality system issues (or
issues with quality systems documents). For example, if the auditor observes an inconsistency with field
practice(s) across a network, it could indicate the site operators are not following their QAPP/SOPs or
have not been properly trained. Similarly, if an auditor asks the site operator what their QAPP/SOP says
about a particular procedure or requirement and he/she does not know the answer, then it could be a
sign that the site operator is not being trained or is not using the quality documents. It could also be a
sign that the quality documents need revision, because they lack critical information to help the site
operator do an adequate job.
Additionally, the auditor should discuss issues raised during the TSA Questionnaire meeting with field
staff to further clarify the responses received on the questionnaire. This can provide insight into how
consistently the organization is operating.
While not a core component of the TSA, some audit teams may choose to conduct performance audits
on air monitoring instruments during the site visit, such as sampler flow rate, barometric pressure, and
temperature checks. The auditors may also examine PM2.5 PEP, Pb-PEP, and NPAP data to gauge
instrument performance.
5.5.1 Inside the Monitoring Station
The majority of the ambient monitoring work occurs inside the monitoring station, where most continuous analyzers and support equipment will be found. The construction of a
monitoring station can range from a stand-alone shelter to a room inside of a larger building, such as a
school. Monitoring sites can also have varying amounts of equipment, from a single particulate sampler
to an NCore site measuring all NAAQS pollutants. Nevertheless, all monitoring sites include common
items and are operated in a similar fashion. An experienced auditor can enter any monitoring station
and quickly recognize what is being monitored and the methods that are employed. A novice auditor
may need assistance in understanding the methodologies. Common audit activities or items to inspect
and document inside an air monitoring station could include:
•	Sample lines and manifold material (appropriate probe material for pollutant of interest)
•	Sample lines and manifold condition (clean, condensation-free)
•	Manifold designs (properly functioning blower, no uncapped calibration lines or ports)
•	Housekeeping (clean, organized)
•	Potential sources of sample/measurement contamination (gasoline-powered weed-trimmers,
dust, cigarette smoke)
•	Properly labeled chemicals in secure containers
•	Safety issues (electrical issues, unsecured compressed gases, etc.)
•	Availability of safety equipment and first-aid kits
•	Appropriate sampling equipment (FRM/FEM status)
•	Examination of maintenance records (samplers, monitors, support equipment, station)
•	Examination of an analyzer using a specific monitor checklist (if available)
•	Cylinder certification dates (in certification, certification document available)
•	Calibrator certification dates (in certification, certification document available)
•	Other standards certification dates (in certification, certification document available)
•	Monitoring and support equipment age (analyzers, samplers, calibrators, data loggers)
•	Residence time calculations for sample paths (a worked example follows this list)
•	Documentation (logbooks, spreadsheets, bench sheets)
•	SOPs and user manuals
•	Availability of maintenance supplies (o-rings, scrubber refills, common replacement parts) and
tools
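The residence-time calculation referenced in the list above divides the sample line volume by the analyzer flow rate; a commonly applied acceptance limit for reactive gases is 20 seconds. The following worked sketch uses example dimensions only.

```python
# Worked residence-time calculation: tube volume divided by flow rate.
# A common acceptance limit is 20 seconds; dimensions are examples only.
import math

def residence_time_s(id_mm, length_m, flow_lpm):
    """Residence time (s) for a sample line of given inside diameter."""
    radius_m = (id_mm / 1000.0) / 2.0
    volume_l = math.pi * radius_m**2 * length_m * 1000.0  # m3 -> L
    return volume_l / flow_lpm * 60.0

# 4.8 mm ID line, 5 m long, analyzer drawing 1 L/min:
t = residence_time_s(4.8, 5.0, 1.0)
print(f"{t:.1f} s")  # ~5.4 s, well under a 20-second criterion
```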
While inside the site, the audit team should work with the site operator to access the data acquisition
system and retrieve monitoring data and chart measurements over a specified period. This exercise
demonstrates that the site operator has the expertise to access the monitoring data and assess the
operation of the equipment at the site. The site operator should also be able to easily move through the
menus of the instruments to access diagnostic data. During the audit planning phase in Section 4.4.3,
the audit team may have identified datasets for inspection that are not accessible using AQS, such as
shelter temperature and calibration data. The monitoring site inspection provides an opportunity for the
audit team to see these data and collection methods. If the site operator cannot operate the data
acquisition system or is unfamiliar with basic operations, this is an indicator that the site operator has not
been properly trained.
The age of monitoring equipment at the site (including analyzers, samplers, calibrators, and data
loggers) is important to highlight. EPA monitoring cost estimates include an amortization schedule of
seven years for the replacement of monitoring and sampling equipment. It is very important to ensure
that the equipment used on-site is technologically capable of performing the tasks intended, as well as
being currently supported by the vendor. Older equipment, while still being functional, may not have
the ability to deliver the required precision at current concentration levels or integrate properly with
modern data logging systems. Monitors operating at a higher range may not be appropriate for areas
where trace concentrations are regularly observed. Older monitors that are no longer supported by the
vendor may result in excessive downtime if parts and support are not readily available. Because of these
potential issues, the age and capability of the monitoring equipment on-site are paramount to a successful monitoring program.
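A simple inventory screen can help the audit team flag equipment that has passed the seven-year amortization schedule noted above. The following sketch uses invented inventory entries.

```python
# Simple sketch flagging equipment past EPA's 7-year amortization
# schedule; the inventory entries are hypothetical.
CURRENT_YEAR = 2017
inventory = [("O3 analyzer", 2006), ("PM2.5 sampler", 2014)]

for name, purchase_year in inventory:
    age = CURRENT_YEAR - purchase_year
    if age > 7:
        print(f"{name}: {age} years old -- past the 7-year replacement schedule")
```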
5.5.2 Outside the Monitoring Station
External influences at a monitoring location, such as vegetative growth and nearby sources, can have an
impact on the quality of data collected at a site. With regard to the station's surroundings, the site visit
presents the best opportunity to confirm that monitors/samplers are sited in accordance with 40 CFR
58, Appendix E. A checklist can be found in Appendix B of this QAGD to assist the auditor with verifying
site conformance to the Appendix E requirements. Common audit activities or items to inspect outside
an air monitoring station could include the following.
•	Particulate samplers and their operation (setting up or collecting samples)
•	Sampling probes and rain shields (acceptable material and construction)
•	Obstructions around the site (buildings, food trucks, walls, parapets)
•	Impact of trees (10 meters from the drip line, primary wind direction)
•	External sampler maintenance (sampler cabinets, inlets, separators)
•	Emission sources (restaurants, exhausts, chimneys, gas stations, etc.)
•	Distance from roadways
•	Paving or lack thereof (grass, gravel, concrete, etc.)
•	Inspection of grounds (safety issues)
•	Grounds maintenance (is vegetative growth an issue?)
•	Security
5.6 Concluding the Audit
After the on-site investigation has been completed, the audit team must begin the process of concluding
the audit. For all practical purposes, the exit briefing is typically the end of the investigative phase of the
TSA. The exit briefing brings all of the staff together to hear and discuss the results of the audit. This
briefing is the culmination of all of the preparatory work and the on-site activities, and it is the most
anticipated event of the audit. The audit team must make the proper preparations and deliver the
results in an unbiased and understandable manner for all in attendance. The following sections describe
the preparatory activities and provide direction in conducting the exit briefing.
5.6.1 Exit Briefing
The exit briefing is one of the most important and potentially stressful activities during the TSA. It is at
this point that the audit team formally communicates the audit results to the monitoring organization.
The exit briefing should be conducted as quickly as possible following the audit, preferably immediately
following the TSA itself and on-site with the monitoring organization's staff.
5.6.1.1 Exit Briefing Preparation
In preparation for the exit briefing, the audit team should meet to compile all final information. The
audit team should have been conducting daily briefings during the TSA as described in Section 4.7.2 of
this QAGD. The final daily briefing before the exit briefing should only include reviewing the last day's
information and drawing final conclusions regarding the overall results of the audit. However, if daily
meetings have not occurred, then the meeting prior to the exit briefing will be of higher importance
since it will be the only opportunity for the audit team to talk at length, compare notes, and discuss all
findings and their implications. In addition, without daily briefings, preparation for the exit briefing may
be lengthy and all audit results may not be captured completely or accurately. In short, daily briefings
shorten the preparatory time for the exit briefing and will help the audit team be better prepared.
The audit team should compile the non-conformances and deficiencies observed during the audit process, including those related to the organization's quality documents, interviews and discussions with key personnel, records review, and data quality. Significant issues should be categorized based on severity (e.g., finding, concern, observation, as discussed in Section 6 of this QAGD), and the audit team
should look for systemic issues or "themes" within the findings. The audit team should also collaborate
on potential recommendations to address the identified issues to prepare for questions from the
monitoring organization's management. It is vitally important for the audit team to be prepared for the
exit briefing with the organization's senior management and to be ready to answer questions. The audit
team should have information on hand to support all findings, including regulatory citations,
method requirements, and scientific rationale. The lead auditor should assemble this information and
prepare a briefing sheet or speaking notes for the upcoming meeting. The lead auditor should arrange
the speaking points in a logical manner that will help the information flow and be easier for the
organization's senior management to digest. All findings should be presented in a professional manner
that is accurate, clear, and constructive.
In addition to non-conformances, the audit team should identify and list areas where the monitoring
organization has excelled or where progress has been demonstrated. These kinds of observations help
paint a more complete picture of an organization, rather than focusing solely on the deficiencies.
However, the auditors must exercise caution when issuing commendations. Commending an
organization should not be used to balance out non-conformances or overshadow obvious data quality
or quality system issues. All deficiencies and commendations should be factual and without bias. Section
6 contains examples of commendations.
5.6.1.2 Conducting the Exit Briefing
The TSA exit briefing should be conducted at the monitoring organization's main office, if possible. At a
minimum, the audit team, monitoring organization staff, and their senior management should be in
attendance. The exit briefing should include the following elements:
•	Introductions of staff in attendance
•	Presentation of audit results
•	Discussion of the TSA report process
•	Discussion of the required Corrective Action Plan and follow-up process
•	Closing dialogue to address questions and comments from the monitoring organization.
The exit briefing may vary in length depending on the results of the audit and the interaction with the
organization staff, but the lead auditor should attempt to keep the exit briefing under an hour and a
half. The audit scribe should plan to take notes throughout the exit briefing so that the lead auditor may
focus on delivering the findings and answering questions from the monitoring staff. In the event that the
organization's key monitoring personnel or senior management officials are unable to attend the exit
briefing, a follow-up conference call should be offered during the meeting and scheduled to occur after
the audit team returns to the Regional Office. It is imperative that such a discussion occur as soon as
possible following the on-site investigation, so that the monitoring organization is made aware of the
audit findings and can begin necessary corrective action measures.
The lead auditor should conduct the meeting and direct the presentation of audit results during the exit
briefing. The primary focus of the exit briefing will be the discussion of the audit findings and areas of
needed improvement. The lead auditor should guide this dialogue, asking audit team members to
augment the discussion when appropriate. There should be no dissension during the briefing within the
audit team, and the audit team should be unified in presenting the results of the audit. The tone should
be professional and courteous; however, the team should be prepared to defend the audit results and
have supporting material on-hand. Exit briefings have the potential to be contentious, and it is
important for the audit team to listen and not become reactionary. Audit team responses must be
rooted in facts and avoid opinions that can cause confusion.
As discussed above, the lead auditor should also offer commendations during the exit briefing. The lead
auditor should discuss improvements since the last TSA, outstanding accomplishments that deserve
special recognition, and procedures that the organization implements that are especially innovative or
efficient.

The exit briefing also serves as an opportunity for the monitoring organization's staff to ask questions,
request clarification, and present any additional evidence that may refute a finding. For example, if a
critical piece of information was missing on the first day of the audit, but staff found the misplaced
document/record later in the week, that information can be presented to the auditors during the exit
briefing. Similarly, the exit briefing provides an opportunity for the monitoring organization staff to
address inaccuracies in the findings. For example, a calibration line may have been identified as
uncapped allowing shelter air into a sample train. The site technician may dispute this by informing the
audit team that the line was connected to a monitor that was not currently in operation. In these
situations, the monitoring organization may need to present additional information or data that will
negate the finding. In some cases, the monitoring organization may ask the audit team to revisit a site or
laboratory, in order to ensure all information-gathering was accurate and complete.
The audit investigation will sometimes uncover issues that necessitate additional review which may not
be feasible during the limited time on site. With this in mind, the auditor may have to request that the
monitoring organization submit additional documents, records, or data for review after the completion
of the exit briefing. In this event, the request should be discussed during the exit briefing. The auditor
should ask that the information be provided as soon as possible, but ideally within one week
following the site visit. Moreover, the auditor should clearly explain the rationale behind the
information request during the exit briefing. This explanation is important so that monitoring personnel
present, as well as the senior management, understand the request and recognize that additional
findings may result from it. Therefore, the exit briefing, under this scenario, may not be a complete
account of all findings, which will necessitate a follow-up conference call after the requested
information has been received and assessed (see Section 6.0 of this QAGD).

6.0 Report Development and Corrective Action Process
Although, generally speaking, the audit "ends" once the on-site investigation concludes, the TSA process
continues well beyond that point. The activities that follow the on-site investigation involve preparing
the written TSA report to document the audit findings, and then working
with the monitoring organization at length to ensure identified issues are successfully resolved. This
section of the TSA QAGD provides more details regarding the events that should occur following the
completion of the on-site investigation.
The report development and corrective action process is one of the most important components of the
TSA. This section defines the steps and the timeline that the audit team should follow to ensure that an
accurate report is written and timely follow-up is completed. The following flow chart in Figure 6.1
provides a summary of these activities and the timelines that the audit team should follow in this
process.
1. Follow-Up of Unresolved Issues - 1 week following the audit
2. Internal Draft Report Technical Review
3. Issue Draft Report to Monitoring Organization - 30 days of audit completion
4. Review of Draft Report by Monitoring Organization - 15 days following draft report
5. Issue Final Report to Monitoring Organization - 30 days following draft comments
6. Corrective Action Plan from Monitoring Organization - 30 days following receipt of final report
7. Corrective Action Follow-Up Call - 45 days following receipt of CAP
8. Continuing Follow-Up of Corrective Action
9. TSA Close-Out in AQS
Figure 6.1 TSA Report Development and Corrective Action Flow Chart
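For illustration purposes only, the following minimal sketch (in Python) shows how the targeted
calendar-day windows in Figure 6.1 accumulate into an overall schedule. The audit end date, script, and
variable names are illustrative assumptions, not part of any EPA requirement.

    from datetime import date, timedelta

    # Hypothetical example: the on-site investigation is assumed to end on
    # June 2, 2017. The offsets are the targeted (not mandatory) windows
    # described in this section: 30 + 15 + 30 + 30 + 45 calendar days.
    audit_end = date(2017, 6, 2)

    milestones = [
        ("Draft findings summary issued", 30),
        ("Organization's fact-check comments due", 30 + 15),
        ("Final TSA report issued", 30 + 15 + 30),
        ("Corrective Action Plan due", 30 + 15 + 30 + 30),
        ("Corrective action follow-up call", 30 + 15 + 30 + 30 + 45),
    ]

    for label, offset in milestones:
        print(f"{label}: {audit_end + timedelta(days=offset):%B %d, %Y}")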
6.1 Unresolved Issues and Additional Analyses
Although the time and effort spent planning for the TSA (i.e., pre-audit activities) should cut down on
any additional review needed after the field investigation concludes, there may be times when
additional analyses or follow-ups are warranted. Issues may arise during the on-site investigation such
that the audit team may need additional resources in order to make a determination regarding a specific
finding. For example, some audit findings may require consultation with OAQPS, especially if they could
result in the flagging or invalidation of a significant amount of data. Additionally, the auditor may need
to consult with other EPA staff or OAQPS directly to discuss the interpretation of a regulation before
deciding how to frame a finding. Although contacting other EPA staff or OAQPS while still on-site during
the field investigation is preferred - so that the organization can immediately know the outcome - these
conversations may not be possible until after the audit team has returned to the Regional Office.
As discussed in Section 5 of this QAGD, observations during the on-site investigation may uncover issues
that need additional review. The auditor may not have time to complete these additional reviews while
on site. For example, the audit team may request specific data to analyze (such as raw temperature and
humidity data from a gravimetric laboratory), if procedures observed during the audit warrant its
examination. Or, the audit team may observe an unexpected procedure in the field and, subsequently,
request to review the SOP for that procedure (especially if that document was not provided to the
auditor beforehand). Conversely, there may be instances where post-audit review is necessary because
the monitoring organization itself was unable to provide requested information during the field
investigation due to limited staffing, time, or other reasons (such as an ineffective records management
system). Under scenarios such as these, continued analyses and discussions following the exit briefing
may be necessary.
Any information requested to be submitted post-audit should be discussed with the monitoring
organization during the exit briefing (see Section 5). The exit briefing is the time to clearly outline any
missing information and establish a deadline for submittal. After the exit briefing, requests for new
information beyond what was already discussed should be kept to a minimum. The
auditor's goal is to provide the monitoring organization a draft findings summary (i.e., written audit
feedback - see Section 6.2.2 below) within 30 days of the completion of the on-site investigation. A
completed list of audit findings - along with any resulting conclusions - cannot be drafted for a written
report if information-gathering is ongoing. With that in mind, the recommended best practice is to limit
requests for additional documents, records, or data to the one week following the completion of the
exit briefing. It is important to remember that the audited organization has also expended time and
resources in preparation for the audit (e.g., documenting the TSA questionnaire, gathering and
submitting QAPPs and SOPs to the Regional Office, and so forth), as well as participating in the on-site
investigation. Post-audit information gathering by the lead auditor should conclude as promptly as
possible, in order to facilitate the timely completion of the TSA report, as well as to allow the audited
organization to resume its normal business routine.
However, there are certain situations where the auditor may need to request additional information
beyond the recommended one-week follow-up period. For example, if the audited organization submits
the wrong information, or the submitted information is not summarized in the manner originally
requested by EPA, then the auditor may request that the monitoring organization resubmit it - which
will delay resolution of the outstanding issue(s). Alternatively, if the information submitted by the
monitoring organization clearly indicates a more serious operational problem, then the auditor may
have to request additional clarifying information or data in order to better understand the magnitude of
the situation. Regional Office upper management may need to be notified and become involved under
these scenarios, or when/if a monitoring organization is not cooperating in a forthright manner.
However, these example scenarios should be infrequent; extending the TSA beyond the on-site
investigation should not be the routine practice. (See Section 8 of this QAGD for more information on
elevation points.)
After the lead auditor has received and assessed all requested material, communication is needed with
the monitoring organization in order to relay the results of the follow-up analysis. The best practice
approach is to conduct a conference call with the organization's senior management and monitoring
staff in order to discuss any new findings and their implications. However, if no new findings resulted
from the analysis, the lead auditor should still inform the organization that no new issues resulted from
the information review - either via telephone or written communication (such as email). Similarly, if an
issue was discussed during the exit briefing that required input from OAQPS, the auditor should follow
up with the audited organization after the necessary input from Headquarters has been obtained. It is
important that any "loose ends" be resolved promptly, so that needed corrective actions can begin.
6.2 TSA Report
The purpose of the TSA is to help the monitoring organization improve its ambient air monitoring
program by identifying vulnerabilities that weaken the organization's quality system and/or the
defensibility of its data. The TSA report serves to formally document the audit process and findings. The
audit report should be composed in a concise, easy-to-read manner that focuses on objective evidence.
Effective communications throughout the TSA process should lead to a written TSA report that contains
no surprises, which in turn fosters a partnership between the organization and the auditor in the
months that follow.
It is important to remember that the written TSA report can be reviewed not only by the organization's
technical monitoring staff and program manager, but also by the organization's director, other higher
ranking departmental or cabinet officials, as well as by the community at large. With that in mind, the
lead auditor should consider the TSA report's audience when framing findings and providing supporting
evidence and discussion. Ambient air monitoring regulations are complicated, and the monitoring
procedures are technical and intricate. The written TSA report should present these complex topics in a
manner such that any reader can easily interpret the information and understand its significance. The
lead auditor (and audit team) should keep the following directives from the EPA Communications
Stylebook (https://www.epa.gov/stylebook) in mind when preparing the audit report:
>	Use plain language - which is communication the audience can understand the first time they
read or hear the information being presented.
>	Use active voice as much as possible. The two different ways of using verbs are known as
voices. When the verb is active, the subject of the verb is doing the action - so the voice is
active. When the verb is passive, the subject undergoes the action, rather than doing it. Passive
is formed with tenses of the "be" verb, along with the past participle of the main verb. Active
voice is more interesting to read than passive voice, and is more likely to keep the reader
engaged. Sentences are often more concise when using active voice.
> Write in third person. Although second person should be used for most Agency
communications, the EPA Communications Stylebook indicates that technical documents, such
as a TSA report, are best presented in third person. The Stylebook also states that topics which
might involve strong negative reactions, or might be considered too personal, are better written
in third person as well.
The lead auditor should keep these writing guidelines in mind when preparing the TSA report. The
report should be written and formatted such that the reader can easily identify and understand the
areas where improvement within the organization is needed. Important findings and recommendations
may be lost if the audit report is too technical or verbose.
It is also important to note that, just like the TSA itself cannot thoroughly delve into every single
component of the organization's monitoring network, the TSA report will not capture every discussion
point, observation, or technical support recommendation offered during the on-site investigation.
Instead, the TSA report focuses on the most significant findings, technical and/or systemic, that indicate
non-compliance with regulatory requirements, or jeopardize the organization's data quality or
defensibility. Because the report serves to focus on these "big picture" issues, the lead auditor is
afforded some discretion - based on professional judgment, experience, and through collaboration with
the audit team - regarding whether to include minor observations in the final written report.
The pre-audit and on-site TSA activities often generate many pages of notes, checklists, and
photographs. To begin writing the report, the lead auditor should first assemble and organize all of the
team's notes and records, which may be in hardcopy and/or electronic formats. It is recommended that
the lead auditor utilize the first week following the on-site investigation to start drafting the TSA
report, while findings and observations are still fresh in the audit team notes, and before additional
work assignments shift the lead auditor's focus. The full report may take a few weeks to complete,
especially if additional analyses and communications post-site visit are warranted. Therefore, it is
recommended that management set the completion of the TSA report as a high priority and afford the
lead auditor the time necessary to complete the assignment.
6.2.1 Report Structure & Elements
EPA Regional Offices may have different writing styles and layouts for their official correspondence.
Some Regional Offices may compose TSA reports with detailed narrative that chronicles the general
observations of the audit, and provides visual imagery (i.e., vivid descriptions of objects) regarding each
field site and facility visited, whereas others may omit observations and imagery unless they are related
to an issue found on site. However, regardless of style, it is imperative that the TSA report be formatted
in a manner that clearly highlights the audit findings. The monitoring organization must be able to
quickly and easily identify the major issues requiring corrective action within the body of the TSA report.

Therefore, the recommended best practice approach is to compose a succinct report that centers the
reader's attention to those issues requiring corrective actions, omitting imagery and details that do not
benefit the overall discussion. Appendix E of this QAGD provides an example template for writing a TSA
report.
Note: It is important that all TSA reports produced by an individual EPA Regional Office use the same
format and structure, and contain the same essential elements. Under no circumstances should the
auditor craft a unique TSA report for each organization audited.
Although the writing styles themselves may vary, in order to maintain consistency across the EPA
Regions, the following elements must be included in all EPA TSA reports:
•	Title & Approval Page - Contains the audit title, as well as the name and signature of the
individual who authorizes the report (most likely a designated Regional Office approving
official or manager). The name and signature of the lead auditor should also be included on the
title page.
•	Executive Summary - An abstract, geared towards senior management, which summarizes the
most significant audit findings and explains the overall implications. Senior management officials
may not be fluent in the complex, technical aspects of the ambient air monitoring program;
therefore, a summary that concisely explains the big picture issues is an especially important
component of the TSA report. The abstract describes the significant findings in only a few
paragraphs. Typically, an executive summary is 1-2 pages in length, maximum.
•	Introduction - Should state when and where the audit occurred, list the audit team members,
audited participants, and field sites and/or facilities visited. The Introduction should also provide
a summary of the audit preparation activities (such as a listing of the AQS reports and quality
system documents reviewed prior to the on-site audit).
•	Findings and Recommendations - Itemizes the issues noted during the audit. For each non-
conformance observed, the report should state the finding, provide a brief discussion of the
finding, and follow it with a recommended solution/action to address the finding. (See Section
6.2.1.1 for more information.)
•	Conclusions - Recaps the most significant TSA issues, particularly those that appear to be
systemic within the organization, and discusses the consequences. More detailed than the
Executive Summary, the Conclusions section typically contains more technical information,
details, and/or specific examples. It highlights the most significant issues and clearly identifies
those findings that must be corrected in order for the organization to be in compliance with
regulations, or produce data sets that can be used for regulatory decision-making purposes.
Provides the targeted due date for the organization's written response to the TSA report and
anticipated Corrective Action Plan.
Other elements that may be included in the written TSA report include a cover page, Table of Contents,
commendations, and appendices. A Table of Contents is especially beneficial when the audit report is
lengthy; it allows the reader to quickly locate specific sections of the report. Some Regional Offices may
include the TSA questionnaire, as completed by the audited organization, as an appendix. A Corrective
Action Plan template, or Audit Finding Tracking Form, may also be included in the appendix of the
report (see Section 6.4 of this QAGD for more information). Other items that may be included in the
appendices include audit checklists, photographs, and data analysis charts/graphs, among others. The
body of the TSA report can include commendations, where applicable; or, if the organization has made
numerous improvements since the last TSA, the report can have a separate section to highlight those
achievements. A Commendations section can also provide recognition for innovative processes or
emphasize procedures the audited organization does exceptionally well. However, commendations in
the written audit report should not be used to offset or diminish the severity of the findings.
Example Commendations
The agency has made numerous changes and enhancements to its ambient air monitoring program in
the past three years. These improvements should enhance the agency's long-term data capture, as well
as bolster data quality. Examples of the improvements include the following:
•	All field instruments have been replaced;
•	Data acquisition software has been upgraded;
•	Polling capabilities have been upgraded from analog to digital; and,
•	Equipment has been purchased for purposes of implementing internal performance audits.
Photographs can be beneficial in the written TSA report, in order to more clearly illustrate a finding or
observation. If used, the images should be of sufficient clarity and detail to support the stated issue. It is
important to remember that photographs taken during the on-site investigation serve as evidence and
will need to be logged in accordance with the Regional Office's standard operating procedures for field
activities. Typically, documentation of who took the photo, when, and where needs to be chronicled.
Recommended procedures are outlined in the following EPA document: Digital Camera Guidance for
EPA Civil Inspections and Investigations, U.S. EPA Office of Compliance, July 2006
(https://www.epa.gov/sites/production/files/2013-09/documents/digitalcameraguide.pdf). However,
photographs are not required in the TSA report.
As a recommended best practice, the audit report should use a "Page X of Y" numbering format.
6.2.1.1 Report Elements
Core elements within each TSA report should be similar across the EPA Regional Offices, even if report
formats differ. Therefore, in order to maintain national consistency, the TSA report will include the
following information, at a minimum:
•	Audit report title, document control number, and/or any other unique identifying information;
•	Name of the organization preparing the report;
•	Names of audit team members, specifying the lead auditor;
•	Name, title, and signature of the individual authorizing the report;
•	Date of the audit;
•	Name of the audited organization;
•	Names and/or titles of audited participants;
•	Location(s) audited;
•	Audit purpose (i.e., reference to 40 CFR Part 58, Appendix A, §2.5);
•	Specific programs or parameters audited;
•	Specific programs or parameters that were not covered by the TSA;
•	Explanation of finding categories/rankings (i.e., define terminology);
•	Identification of audit findings;
•	Supporting evidence and discussion of findings, including relevant regulatory or QAPP/SOP
citations;
•	Corrective action recommendations (see below); and,
•	Request for the organization's Corrective Action Plan.
Note: If a document control number (or similar) is used as a unique identifier for the specific TSA project,
it is recommended that the number be located within the header or footer of each page of the TSA
report.
Many of the items listed above can be included in the audit report's Introduction section. With regard
to programs and/or parameters that were not audited, it is important to add a disclaimer in the written
report to better inform the reader as to the extent of the audit investigation. Although it is generally
understood that a TSA cannot realistically cover all facets of an organization's monitoring program, it
helps to clarify which specific portions of the program were excluded from review. For example, a
statement such as the following could be added to the report:
"Due to time and resource constraints, this TSA focused on the field measurements used to
demonstrate compliance with the National Ambient Air Quality Standards (NAAQS). As such, this
TSA did not focus on meteorological measurements or air toxics field sampling and data
collection techniques."
Such a disclaimer can also be helpful to future EPA auditors. For instance, during pre-audit planning
activities, the auditor will see which programs/components were not reviewed when reading the
previous TSA report, and can then use that information to determine whether or not to make those
specific elements a priority during the upcoming TSA. Similarly, the Introduction should include a listing
of the documents reviewed in preparation for the audit (e.g., QMP, QAPP, SOPs, AMNP, internal audit
reports, etc.), as well as data analyzed and reports queried. This listing documents information reviewed
by the Region which may or may not be discussed in the audit findings - and as such, may be helpful to
future auditors when assessing the scope of the previous TSA. This information may also be helpful to
the monitoring organization, in order for management and staff to better understand the extent of the
preparatory review. The summary of preparatory activities also gives credit to the lead auditor and audit
team for the many hours of study that went into the TSA that otherwise may not be recognized.
It is important to reiterate that audit reports should be focused on findings and recommendations
related to the monitoring organization's quality system. As such, the report should not contain
information that will be included in other EPA correspondence. For example, concerns related to the
review of the monitoring organization's Annual Monitoring Network Plan should not be included in the
written TSA report, since the Regional Office should issue separate letter(s) to the monitoring
organization regarding the network design. Additionally, TSA reports should not be used to estimate
design values for the monitoring organization's CBSAs/MSAs, as this information will be communicated
to the monitoring organization through other means.
Ultimately, the TSA report should contain language that guides the monitoring organization towards the
development of an effective Corrective Action Plan. Although the audited organization is charged with
determining the root causes of issues identified during the TSA and solutions that will work best within
their unique structure and network, the lead auditor should establish the EPA's minimum expectation
for corrective action within the written TSA report. Towards that end, the auditor should follow the
discussion of each itemized finding with a recommended action to address the finding. For example, if
quality system documentation is identified as outdated and in need of revision, the lead auditor could
recommend that the organization develop and submit to EPA a prioritized timeline for revising the
documents, at a minimum. In this manner, the organization must still determine the core reason(s) why
the documents are outdated and, with that, develop a plan that will result in both the timely revision of
the documents and a mechanism to ensure the documents stay up-to-date in the future. However, by
recommending in the TSA report that the monitoring organization prioritize the document reviews and
submit a revision schedule, the lead auditor is providing the audited organization with an indication of
EPA's minimum expectation towards addressing that specific finding. The monitoring organization can
then use that recommendation as a springboard for developing a more comprehensive corrective action
strategy.
In some cases, the TSA findings impact data quality and/or
demonstrate ambient data is not acceptable for use in
regulatory decision making. As a consequence of those
findings, concentration data must be appropriately flagged
or invalidated in the EPA AQS database. In this event, the
recommendations in the TSA report should include
directives towards necessary data modifications in AQS, in
addition to establishing a minimum expected action(s) to
resolve the underlying issue. In many cases, data are
impacted by a localized or short-term issue identified by the
EPA auditor, which only affects a small amount of data. The
appropriate AQS data handling convention for these issues
is typically straightforward and does not impact the
monitor's data completeness. However, the needed AQS
edits should still be specified in the TSA report, in order to
ensure that those specific modifications are included in the
organization's Corrective Action Plan, and resultantly
tracked and completed (see Section 6.4). On the other
hand, the consequences of some TSA findings may include
invalidating a significant portion of data, which could then impact the designation process. Under these
circumstances, the TSA report should clearly and directly state that the impacted data must be
invalidated. It is important to note that some monitoring organizations may not make large-scale edits
in AQS, due to institutional roadblocks or other agendas, without explicit language in the written TSA
report stating such action is required. Therefore, in order to provide the organization with the support
and justification it needs to make significant modifications to its program or data, the audit report must
state the necessary and/or minimum expectation for corrective action.
Example Finding and Recommendation Involving Short-Term Data Loss
Finding: A through-the-probe calibration line was left uncapped in the shelter after the site calibrator
was removed for maintenance. As a result, the analyzer sampled ambient air diluted with shelter air.
Recommendation: Data must be invalidated back to the date/time the calibrator was removed from the
site. In addition, the agency must cap the calibration line anytime the calibrator is removed; this
requirement should be specified in the agency's SOP.
6.2.1.2 Ranking TSA Findings
Issues identified in an official TSA Report as "findings" can serve as a catalyst to promote change within
the audited organization, especially when identified shortcomings represent a departure from
regulatory requirements that jeopardize the validity of the organization's data. The organization's
technical staff and monitoring program manager are often aware of the needed improvements within
the network, but may be unable to address deficiencies due to a lack of resources or other institutional
roadblocks. In that regard, the written TSA report can sometimes serve as the tool the organization's
monitoring staff have needed to leverage senior management's support and, resultantly, acquire the
resources necessary to improve their program.
To provide a better understanding of the magnitude of the technical issues uncovered during the TSA,
the findings should be categorized. Ranking the findings in the TSA report will draw attention to the
most serious deficiencies and explain their import. Upon receipt of the audit report, it is often tempting
for a monitoring organization to complete the quickest or easiest corrective actions first, which may or
may not address the most significant audit findings. Therefore, by ranking the findings in the TSA report
itself, the categorization will assist the audited organization in prioritizing its corrective action plan.
Ideally, the most severe non-conformances should be corrected first, with less serious findings
addressed afterwards.
Commonly used tiers for categorizing findings - from most severe to least severe - include: Major
Findings, Minor Findings, Concerns, and Observations. Other terminology may be employed. However,
regardless of the naming convention, it is important that the TSA report define the terms used to tier
the findings, so that the reader can clearly understand their meaning and significance.
Table 6.1 provides one example of how to categorize audit findings, as well as provides a definition for
each term used.
Table 6.1 Example Findings Categorization
Finding: Nonconformance with, or absence of, a specified requirement (regulatory, QMP, QAPP,
SOP, etc.), or a deviation from guidance, which could significantly impact data quality.
Concern: Practices with a potentially detrimental effect on the ambient air monitoring program's
operational effectiveness or the quality of sampling or measurement results.
Observation: An infrequent deviation, error, or omission which does not impact the output or quality
of the work product, but may impact the record for future reference.
In addition to defining the tiers used to rank the findings, it is also important to clearly distinguish in the
TSA report which categories require corrective actions. Using the example in Table 6.1 above, those
issues ranked as "Findings" or "Concerns" would require corrective actions by the audited organization;
"Observations," on the other hand, would warrant consideration by the organization, but not require
corrective action.
6.2.2 Draft Findings Summary (Report)
Ideally, the audited organization should formulate a Corrective Action Plan and begin implementing
needed improvements immediately following the completion of the EPA's on-site investigation.
However, some monitoring organizations prefer to postpone corrective actions until they have received
a formal document from EPA that enumerates the issues identified during the audit. The preparation of
the official TSA report, peer-reviewed and routed through management, usually takes longer than a
month to complete. Therefore, in order to expedite the corrective action process, the Regional Office
should provide written audit feedback (i.e., a draft findings summary) to the monitoring organization
prior to the completion of the finalized TSA report. The goal is to share the written audit feedback with
the monitoring organization within thirty (30) calendar days following the conclusion of the field
investigation.
There are a variety of ways the EPA Regional Office can provide written audit feedback to the
monitoring organization. Each Regional Office has the flexibility to choose which method of
correspondence is most practical and works best for the monitoring organizations within their
jurisdiction. Selecting one of the following three methods for drafting the findings summary is
recommended:
•	An Excel spreadsheet, which categorizes the findings and ranks them from most to least severe;
•	A formal briefing sheet, which captures the discussion from the exit briefing and identifies the
significant findings; or,
•	A draft report (Microsoft Word™ document) that contains a draft watermark (or similar draft
identifier) that indicates the report is not finalized.
Of these three methods, a formal briefing sheet is the quickest to prepare, but also contains the least
amount of information regarding the overall TSA findings. An Excel spreadsheet containing all of the
findings would most likely take longer to prepare than a briefing sheet - but it would likely omit
pertinent discussion to support or clarify the listed findings. A draft report in Microsoft Word allows for
the greatest detail and provides the majority of the written report to the monitoring organization. The
draft report would be the most transparent document to share with the audited organization; however,
it will take the most amount of time to prepare.
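As a sketch only, assuming hypothetical column names (each Regional Office defines its own format),
the following Python fragment illustrates one possible layout for a findings spreadsheet of the type
described above, written out as a CSV file.

    import csv

    # Hypothetical column layout for a draft findings summary spreadsheet;
    # findings would be listed from most to least severe.
    FIELDS = ["Finding ID", "Category", "Finding",
              "Supporting Evidence / Citation", "Recommended Action"]

    findings = [{
        "Finding ID": "F-01",
        "Category": "Finding",
        "Finding": "Through-the-probe calibration line left uncapped",
        "Supporting Evidence / Citation": "Site photo; agency SOP citation",
        "Recommended Action": "Invalidate affected data; revise SOP to require capping",
    }]

    with open("draft_findings_summary.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(findings)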
6.2.2.1 EPA Regional Office
All monitoring organizations within the Regional Office's jurisdiction should be aware of the Region's
TSA reporting process, including the mechanism used to relay the draft findings summary (e.g., Excel
spreadsheet, briefing sheet, or full report with draft watermark) and the appropriate comment process.
To ensure the audited organization knows what to expect, the process should be reiterated during the
TSA exit briefing.
Following completion of the on-site investigation, the TSA lead auditor should:
•	prepare the draft audit findings summary;
•	provide it to the audit team for review;
•	route it through management; and,
•	submit it to the monitoring organization within ~30 calendar days from completion of the field
investigation.
Given the numerous steps in this draft process, the 30-day window is a goal, not a requirement.
6.2.2.1.1 Internal Technical Review
Once the findings summary has been drafted, the lead auditor should share the draft document with the
audit team (or at least one member of the team) for technical review and feedback. Internal peer review
is a good way to ensure that the individual team member(s) concur with the technical facets of the
summary, including:
•	the findings enumerated are factual;
•	the findings are fair and accurately depicted;
•	the supporting discussion and technical evidence are correct and appropriate;
•	the categorization/ranking of the findings is appropriate; and,
•	there are no major omissions within the draft.
Peer review is a useful tool to ensure the lead auditor has successfully conveyed the observations made
by the audit team members. The lead auditor will often rely on the notes taken by teammates when
composing the written findings. There may be times when the lead auditor incorrectly interprets a
teammate's notes; therefore, the peer review process serves as a way to make sure the written
composition is accurate.
Note: It is acceptable for audit team members to compose language regarding their individual findings
and provide those to the lead auditor for inclusion in the draft document and/or final report. The TSA
report is not required to be written solely by the lead auditor.
Peer review is also a useful tool to ensure consistency throughout all audit reports produced within the
Regional Office. For example, if the lead auditor designates an issue as a "Finding" in the audit report,
yet on a separate TSA the same issue was deemed a "Concern" by a fellow auditor, this discrepancy
warrants discussion and reconciliation amongst the audit team. In this regard, maintaining a master
spreadsheet that tracks "Findings, Concerns, and Observations" (or similar terminology) may be useful
within the Regional Office, in order to ensure consistency from audit to audit. If the lead auditor found
evidence that elevated the "Concern" to the level of a "Finding", it is important that the written
discussion of that issue convey the additional, critical information, so that the designation of "Finding" is
justified.
A peer reviewer should also determine if supporting evidence and citations are correct, or if more
appropriate references could be added to better convey the intended message. It is important to
consider the 3-year time period which was audited, and ensure that any quotes from regulations, EPA
guidance, or the organization's quality system documents appropriately reflect the time period under
review. For example, if EPA guidance recently changed (e.g., the EPA 2.12 QAGD in 2016), but during the
time period of the TSA (e.g., 2013-2015) the organization was successfully implementing previously-
issued guidance (e.g., the 1998 2.12 QAGD), it is inappropriate to cite the newer guidance as justification
for an audit finding. Instead, the newer guidance may be used to direct the organization towards
implementing change and making improvements moving forward, but should not be held against them
as a "finding" in the TSA report.
Finally, a TSA report should contain no surprises. The lead auditor should not include issues in the draft
findings summary or finalized TSA report that were not discussed with the organization previously. If the
peer reviewer determines a finding has been included that was not discussed with the organization, the
discrepancy should be brought to the attention of the lead auditor. A follow-up conference call with the
audited organization may be necessary to correct this oversight, if the issue warrants inclusion in the
report.
6.2.2.1.2 Internal Administrative Review
Administrative review of the written correspondence ensures that the document is clear of grammatical,
mechanical, and spelling mistakes, and is formatted appropriately. This type of review also ensures that
the technical content makes sense from a layman's perspective. In some cases, a reviewer other than a
team member may be needed for this particular step. As stated earlier in Section 6.2, both the draft
findings summary and the finalized TSA report may be reviewed by individuals who are not directly
involved with the technical facets of an organization's ambient air monitoring program. Therefore, the
complex information documented in both the draft and final TSA report should be presented in a
manner that is technically accurate, yet easy to understand.
Once these edits have been completed, the document should be reviewed and approved for submittal
to the organization by the appropriate Regional Office manager or designated approving official. The
manager or designated approving official may recommend additional edits to the document prior to
signing. For example, revisions may be needed to correct any typographical issues missed during the
first levels of review, or perhaps to change the tone of a particular finding. In writing, tone is subjective
and often dependent on the individual reader; however, the use of active voice and third person helps
minimize issues with tone (see the EPA Communications Stylebook). If editing for tone, it is important
that the audit finding retain the technical facts of the issue. It is imperative that edits by individuals
independent from the audit team not jeopardize the technical accuracy of the report or dilute the
findings in a manner that negates their significance. The auditor's findings and observations are facts;
in upholding quality assurance principles, the findings should remain independent from outside
influences or other agendas.
6.2.2.2 Issuance of Draft Findings Summary
Once the findings summary has been internally reviewed and approved, the draft is issued to the
audited organization. The written findings summary should include a cover letter on organizational
letterhead, signed by the designated approving official.
This step in the process allows the audited organization an opportunity to preview the official report to
come, and in that regard, to "fact check" the written information. If the organization observes an
incorrect statement or datum within the draft, then the organization should provide corrections and/or
edits to the lead auditor. Helpful edits the monitoring organization can provide include factual
corrections such as staff names/titles, spellings, site names, and other typographical/minor errors made
by the audit team. The audited organization should be allowed 15 calendar days for this review process.
The cover letter that accompanies the findings summary should reiterate the fact that the draft is being
provided for fact-check purposes only; the 15-day review period is not the appropriate forum to contest
or negotiate audit findings. The cover letter should state that the audited organization can submit
corrections to the lead auditor; however, should no corrections be received within 15 calendar days, it
will be assumed that the draft document is acceptable to the monitoring organization in its present
form.
The issuance of this draft findings summary also facilitates corrective action measures. Monitoring
organizations can develop or formulate necessary actions upon receipt of the draft document, if they
have not already begun to do so.
6.2.3 Final TSA Report
The final TSA report should be issued to the monitoring organization as soon as possible following the
receipt of the draft comments, but no later than 30 calendar days afterwards. The final report should
include a cover letter on organizational letterhead, signed by the designated approving official.
If the Regional Office provides an Excel spreadsheet or briefing sheet to serve as the draft findings
summary, then the lead auditor should begin crafting the final TSA report immediately following its
completion. Upon receipt and review of the monitoring organization's written corrections or edits, the
lead auditor should ensure that necessary corrections are incorporated into the final report, as
appropriate. After the report is written, it should be routed through internal technical and
administrative review, as discussed in Sections 6.2.2.1.1 and 6.2.2.1.2. Once these reviews have been
completed, the final report should be reviewed and approved by the designated approving official, and
submitted in its final form to the monitoring organization.

If the Regional Office provides a fully composed report (watermarked as draft) as the draft findings
summary, then preparation of the final TSA report after receipt of comments is largely simplified. If the
monitoring organization submits written corrections and/or edits concerning the draft, then the lead
auditor should review and incorporate them into the final TSA report, as appropriate, and simply
reprocess the report. A final administrative review may be necessary, if the edits were substantial, to
ensure the newly composed language is free of typographical or grammatical errors. However, if the
monitoring organization does not submit any corrections or edits to the draft, then the lead auditor can
remove the draft watermark (or similar draft identifier), and update the dates/signatures within the
document. It will then be ready for submittal to the monitoring organization.
There may be times when the audited organization submits evidence that corrective action measures
were completed after the on-site TSA concluded, but prior to the receipt of the draft findings summary.
This evidence is usually submitted at the time when the audited organization provides factual
corrections to the draft audit findings summary. In the event this occurs, the final TSA report should
retain the original finding in its original form (minus any typographical errors noted in the draft). The
opportunity for the monitoring organization to provide factual corrections to the draft report does not
serve as a medium to erase audit findings from the final report. The TSA report documents the "as
found" status of the monitoring organization at the time of the investigation, and should not be altered
to reflect improvements that occurred after the fact. The lead auditor can, however, add a "Note" in the
report that acknowledges that the organization implemented a corrective action measure for a
particular finding after the on-site investigation concluded, but prior to the issuance of the final TSA
report. It is recommended that this "Note" follow the recommendation that accompanied the finding
(see the text box below for an example).
Example "Note" in Final TSA Report
Finding: The probe cap at the ACME monitoring site was stainless steel.
Discussion: Studies have been conducted to determine the suitability of materials for use in ambient air
monitoring sampling trains. Pursuant to 40 CFR Part 58, Appendix E, Section 9(a), for those analyzers
which measure reactive gases, such as ozone, only inert materials - borosilicate glass, Teflon, or their
equivalent - are allowed in the sampling train (from the inlet probe to the back of the analyzer). The
probe cap utilized at the ACME monitoring site is part of the sampling train. EPA auditors observed this
probe cap was made of steel, which does not meet Appendix E requirements.
Recommendation: The cap at this site must be replaced immediately.
Note: EPA acknowledges that the monitoring organization replaced the probe cap on August 10, 2016,
as documented in their response to the draft audit report (in a letter dated December 8, 2016).
Ultimately, the final TSA report should be prepared and submitted to the monitoring organization
directly following receipt of the monitoring organization's written corrections - which should equate to
approximately 75 calendar days from the on-site investigation, if targeted timeframes have been
maintained. Copies of the final TSA report should be sent to the monitoring organization's director or
Example "Note" in Final TSA Report
Finding: The probe cap at the ACME monitoring site was stainless steel.
Discussion: Studies have been conducted to determine the suitability of
materials for use in ambient air monitoring sampling trains. Pursuant to
40 CFR Part 58, Appendix E, Section 9(a), for those analyzers which
measure reactive gases, such as ozone, only inert materials - borosilicate
glass, Teflon, or their equivalent - are allowed in the sampling train (from
the inlet probe to the back of the analyzer). The probe cap utilized at the
ACME monitoring site is part of the sampling train. EPA auditors observed
this probe cap was made of steel, which does not meet Appendix E
requirements.
Recommendation: The cap at this site must be replaced immediately.
EPA acknowledges that the monitoring organization replaced the probe
cap on August 10, 2016, as documented in their response to the draft
audit report (in a letter dated December 8, 2016).

-------
Conducting TSAs
Revision 0
Date: 11/2017
Page 89 of 105
his/her designee, as well as to the monitoring program manager. The cover letter should indicate the
report is being distributed in its final form and draw attention to the next step of the TSA process: the
organization's official response to the report, which includes their anticipated schedule for corrective
action implementation.
6.3 Audited Organization Corrective Action Plans
6.3.1	Auditee Response
The audited organization should submit its official response and anticipated Corrective Action Plan to
the EPA Regional Office within 30 calendar days of receipt of the final TSA report. In some EPA Regions,
the audited organization is provided a specific template to use for documenting the Corrective Action
Plan. On the other hand, some Regions allow the organization flexibility to develop its own unique
Corrective Action Plan in any format, provided that the plan is clear and presents a timeline for
implementation.
In some cases, needed corrective actions may be easily implemented, with the identified issues resolved
before the on-site TSA concludes, or shortly thereafter. Upon receipt of the draft findings summary, the
audited organization receives written notification of the issues identified during the TSA, along with
recommendations that establish the minimum expected course of action. Counting the review period
for the draft findings summary, the organization therefore has approximately two months to consider
the audit team's findings and formulate its Corrective Action Plan by the time the final TSA report is
received.
6.3.2	Auditor Review
The lead auditor (or designated audit team member) will review the organization's submitted response
and Corrective Action Plan. Overall, the audit team member looks for corrective action measures and
anticipated timelines that appear reasonable and meet minimum expectations. Although it may be
easiest for the organization to address issues with "quick fixes" first, ideally, the organization should give
priority to the most severe findings, since those findings often represent departures from regulatory
requirements. It is important to note that EPA's role in judging the effectiveness of some solutions may
be limited, depending on the monitoring experience and/or technical knowledge of the audit team with
regard to the parameter(s) under review. It may be necessary to consult with technical experts outside
the EPA Region or at OAQPS in order to determine if a resolution will be adequate and/or effective.
Each Regional Office has flexibility in how it responds to the submitted response and Corrective Action
Plan. Some Regional Offices may not directly approve or disapprove a Corrective Action Plan, but rather
"accept" the plan overall with the expectation that the organization will report to the Regional Office
the success (or not) of the implemented corrective actions, and that the organization will amend
targeted timelines if corrective actions are delayed. With regard to the latter, communication of the
amendment with the lead auditor is necessary, along with some form of written documentation (such as
an email or letter from the audited organization). However, some Regional Offices may provide a
written response to each finding's proposed corrective actions indicating that: 1) the proposed measure
is approved, or 2) the proposed corrective action does not adequately address the root cause of the
finding and, therefore, another solution(s) is needed.
It is during this step in the process where the audited organization may formally contest an audit finding
in writing and/or debate whether or not it will implement a recommended action stated within the
report (such as invalidating data as a consequence of an audit finding). If this situation occurs, the lead
auditor should work with the audit team and Regional Office management to determine the appropriate
course of action. OAQPS may need to be involved in order to mediate a resolution with the audited
organization. The outcome of the review should be included in a formal response to the audited
organization.
6.4 Corrective Action Follow-Up
Once the Corrective Action Plan has been accepted or approved, the next step of the process becomes
the tracking of completed action items. Dialogue must be maintained with the monitoring organization
regarding actions taken to address TSA findings. Issues should be addressed in a timely manner, with
the understanding that systemic issues within a monitoring organization may take longer to rectify. In
fact, issues that involve staffing, or any significant restructuring of the organization's quality
system, may take months for the organization to successfully complete. Continued communications and
routine follow-up on corrective action progress show the audited organization that the Regional Office
is invested in the organization's success. It is recommended that the lead auditor (or designated audit team
member) conduct a follow-up phone call approximately 45 calendar days following receipt of the
monitoring organization's corrective action plan to receive a status/progress update. Any issues with
implementing the corrective action plan can be discussed during this call. Other follow-up calls may be
necessary following this first check-in, depending on the number and severity of findings in the TSA
report. A schedule for conducting additional follow-up calls can be developed, if needed.
6.4.1 Tracking
The monitoring organization's Corrective Action Plan should contain targeted due dates for the
completion of each corrective action item. On the key dates when action items are scheduled to be
completed, the monitoring organization should submit documentation to the lead auditor attesting to
the resolution of the issue(s), as well as provide supporting evidence. The lead auditor (or designated
team member) is responsible for tracking the completion of all action items in accordance with the
schedule (timeline) developed by the monitoring organization. Depending on the extent of the audit
findings, this phase of the TSA may take many months to complete. With that in mind, a strategy to manage this
phase of the project should be developed by the EPA Regional Office. Any mechanism to internally track
the status of action items is acceptable, provided that it guarantees that all items are accounted for and
successfully remedied.
There are numerous ways the completion of individual corrective action measures can be tracked. The
recommended approach is to use a templated form that the audited organization completes for each
action item. See Figure 6.2 for an example of an audit finding tracking form. If using
a form such as this one, the audited organization is responsible for all necessary documentation. A
notable feature on the form is that it requires the signature of a reviewer from the audited organization
that attests that the corrective action was successfully completed. The form is sent to the lead auditor
upon its completion. Using this approach, the audited organization is responsible for completing one
form for each finding itemized in the TSA report; the lead auditor, then, is responsible for confirming
that a form for each audit finding is received. If tracking corrective action items in this manner, it is
important to note that the individual forms for each finding will need to be maintained and entered into
the TSA Project File, once the TSA has been closed out (see Section 6.5.1 of this QAGD).
In addition to tracking the receipt of audit finding tracking forms, the lead auditor is also responsible for
evaluating the forms to determine whether the corrective action measures completed are acceptable,
which may include evaluating supporting evidence provided by the auditee. In some cases, a specific
deliverable may have been requested in the TSA report to illustrate successful completion of an action
item, such as photographs and measurements demonstrating that Appendix E siting issues have been
resolved, or specific AQS reports that illustrate recommended data flagging has been completed.
Consultation with the audit team may be necessary when evaluating the documentation and supporting
evidence provided by the monitoring organization, in order to determine whether or not a corrective
action measure is adequate and effective. If the lead auditor (or designated audit team member) is
satisfied with the corrective action measure taken, then the auditor should sign the audit form to close
the loop on the specific finding.
Although use of the audit tracking form is recommended, other tracking mechanisms are allowed. For
example, the Regional Office may choose to develop a unique electronic spreadsheet or database that is
used for tracking TSA corrective action measures. If using such a tool, the lead auditor is still charged
with maintaining the documentation and records supplied by the auditee that demonstrate the
corrective actions have been completed and were successful. On the other hand, if the Regional Office
used a spreadsheet to initially provide the draft findings summary to the audited organization, the
Regional Office may use that same spreadsheet - with additional columns added - to track the status of
corrective actions. Under this scenario, the lead auditor is most likely charged with updating the
spreadsheet directly as the audited organization submits evidence that corrective action measures have
been completed. Or, the Regional Office may elect to have the audited organization maintain the
spreadsheet. If this option is employed, the lead auditor will need to determine how frequently the
spreadsheet should be submitted to the Regional Office in order to receive status updates, or determine
if the spreadsheet is only submitted once - when all corrective actions have been finalized. If the latter,
then routine conference calls should be scheduled and/or status reports required in order to keep the
auditor abreast of the organization's progress. However, even using this approach, the auditor is still
charged with maintaining copies of any additional records and supporting evidence the monitoring
organization submits along with the spreadsheet. When all action items have been completed and the
TSA is closed out, the spreadsheet and supporting records will need to be archived in the TSA Project
File.
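Building on the hypothetical spreadsheet sketch in Section 6.2.2, the fragment below illustrates how
tracking columns could be appended to that same layout, as described above. The column names,
status values, and completeness check are assumptions for illustration only, not prescribed fields.

    # Hypothetical tracking columns appended to the draft findings
    # spreadsheet, so the same sheet can follow each corrective action
    # through to close-out.
    TRACKING_FIELDS = ["Finding ID", "Category", "Finding", "Recommended Action",
                       "Target Due Date", "Evidence Received", "Status", "Date Closed"]

    records = [{
        "Finding ID": "F-01",
        "Category": "Finding",
        "Finding": "Through-the-probe calibration line left uncapped",
        "Recommended Action": "Invalidate affected data; revise SOP",
        "Target Due Date": "2017-09-15",
        "Evidence Received": "AQS report showing data invalidation; revised SOP",
        "Status": "Closed",
        "Date Closed": "2017-09-20",
    }]

    # A simple completeness check could flag any items still open, so the
    # lead auditor can follow up before targeted due dates pass.
    open_items = [r["Finding ID"] for r in records if r["Status"] != "Closed"]
    print("Open corrective actions:", ", ".join(open_items) or "none")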
While tracking corrective action measures, if expected documentation is not received by its anticipated
due date, the lead auditor should contact the organization to inquire about its status. If the organization
has had a delay, the lead auditor should request a written statement from the organization to explain
the issue and establish a revised timeframe for submittal. The auditor should maintain copies of this
correspondence, which serves as a record that documents the organization's progress and the Regional
Office's follow-up. As the audited organization works through correcting the identified audit findings,
conference calls with the organization's senior management may become necessary, especially if
repeated delays are reported or if the organization's completed corrective action(s) does not meet
minimum expectations. In these instances, EPA Regional Office upper management may need to
become involved as well. See Section 8 of this QAGD for more information.
6.4.2 Follow-up Site Visit
If the TSA findings were especially significant, if there was a marked lack of responsiveness in completing
corrective action, or if the organization itself requested assistance from EPA in the implementation of
specific corrective action measures, follow-up site visits may be needed. A follow-up visit provides the
auditor an opportunity to assess, in person, the progress the monitoring organization has made towards
correcting issues. In many cases, judging the effectiveness of a corrective action measure can be best
accomplished on site, as opposed to simply reading documentation that describes the resolution. While
on site, the auditor can also offer additional technical advice or recommendations in order to help
facilitate continued improvements. The follow-up visit also gives the auditor the opportunity to help the
organization alter its course of action if the auditor determines that a chosen corrective measure is
inappropriate or deficient in some manner. In this regard, the follow-up visit facilitates improvements
sooner rather than later. Most importantly, the follow-up visit affords the auditor the opportunity for
continued face-to-face interaction with the monitoring organization. It is recommended that this
follow-up site visit be documented in the manner the Regional Office deems most appropriate. Documentation could
be in the form of a follow-up memorandum or email to the monitoring organization that acknowledges
the visit and provides a brief summary of the activities completed. A more formal letter can be
developed, if preferred. This documentation should be included in the TSA Project File, upon completion
of the TSA.
It is important to note that some recommended corrective actions (especially those that involve
personnel or funding modifications) simply may not occur, due to institutional roadblocks within the
monitoring organization or other reasons. In this event, the lead auditor should document the lack of
corrective action (e.g., via email or internal memo to RO management) in order to complete the internal
tracking record. This documentation will illustrate the lead auditor's commitment to tracking all audit
findings, and will also serve as a means to inform RO management that an impasse has occurred. In
these instances, the TSA will not be closed out in AQS and the monitoring organization will not receive
a close-out letter from the Regional Office. Additionally, the finding(s) and recommendation(s) should
be documented again in the monitoring organization's next TSA report, emphasizing that the issue(s)
was unresolved from the previous TSA.
6.5 TSA Close-Out
As stated earlier in this QAGD, TSAs are a mechanism for EPA to help monitoring organizations improve
their ambient air monitoring programs. Although state and local air monitoring organizations are
encouraged to provide solutions to issues identified as part of a TSA, EPA Regions and OAQPS may need
to provide suggestions and guidance on options that bring the monitoring programs into
compliance with federal regulations. As EPA and the monitoring organization collaborate on corrective
action measures, the progress towards close-out of audit findings should be tracked and documented.
This process may take many months to complete, depending on the extent of the audit findings.
It is important that the lead auditor compile all records for the project file that are to be archived,
including those records that demonstrate EPA has followed up on the success of corrective action
measures (or lack thereof). Once the lead auditor has the necessary paperwork to show that all
identified TSA findings have been satisfactorily addressed (e.g., all requisite Audit Finding Response
forms have been received and approved), the Regional Office should issue an official "close out" letter
to the audited organization. Ideally, the close-out letter should be on organizational letterhead, signed
by the designated approving official. The letter can be brief, and should clearly state that the
organization has provided the necessary documentation to illustrate that required corrective action
measures were successfully implemented; with that, the TSA is considered officially "closed." Appendix
F of this QAGD contains an example close-out letter.
Along with the completion of this final correspondence, close-out information should also be entered
into AQS to document the completion of the TSA. See Section 6.6 of this QAGD. However, if the TSA
does not officially "close out" - meaning that the organization does not complete all the necessary
corrective actions prior to the beginning of its next TSA cycle - an official close-out letter will not be
generated and a close-out date will not be entered into AQS.
6.5.1 TSA Project File
Maintaining records and documents from the TSA is an important process for which the lead auditor is
responsible. Each Regional Office has rules and policies regarding records retention; RO management
should ensure that TSA documentation is filed and archived in a manner that adheres to Agency policies,
and that each auditor is aware of the requirements. As such, each Regional Office is encouraged to
document its TSA records management procedures in an SOP. Additionally, each Regional Office is
encouraged to create a checklist that itemizes the documents and records from the TSA that should be
maintained and archived. Records retained can be in electronic or hardcopy format, or a combination of
both.
After the TSA has been "closed out", the lead auditor should gather all pertinent documents and records
and place them collectively in a file that serves as the final repository for all information related to the
specific TSA. Examples of records to maintain and archive may include, but not be limited to, the
following:
•	a copy of the audit plan;
•	logbooks documented by the audit team during the on-site investigation;
•	pertinent AQS records or other documents that were investigated and resulted in significant
audit findings;
•	photographs taken during the on-site investigation that relate to audit findings documented in
the TSA report;
•	a copy of the draft findings summary and cover letter;
•	a copy of the final, signed TSA report and cover letter;
•	a copy of the questionnaire response documented by the monitoring organization;
•	a copy of the organization's responses to both the draft findings summary and the final TSA
report;
•	the monitoring organization's corrective action plan;
•	audit finding tracking forms and/or the tracking spreadsheet; and,
•	a copy of the signed close-out letter.
6.6 TSA Information in AQS
The written TSA report serves as official documentation to demonstrate that the requirements of 40 CFR
Part 58, Appendix A, §2.5 have been fulfilled - for both EPA and the audited organization.
CFR-required TSAs must be reported to AQS. In 2013, an AQS entry screen was developed to allow the
reporting of five TSA-related parameters: 1) the monitoring organization audited, 2) the auditing
organization, 3) the audit begin date, 4) the audit end date, and 5) the close-out date. The audit begin
date should be the date the on-site investigation begins. The audit end date should be the date the draft
findings summary (report) is issued to the monitoring organization. The close-out date is defined as the
date when all corrective actions identified in the audit report are completed and a close-out letter is
issued, as discussed in Section 6.5 above. For those TSAs with outstanding issues that are not resolved
before the onset of the next TSA cycle for the monitoring organization, the close-out date should be
left blank in AQS.
The lead auditor is responsible for ensuring that information regarding the TSA is successfully uploaded
into AQS. It is recommended that the lead auditor (or delegate) enter the audit begin and end dates
when the draft TSA report is first issued to the monitoring organization. The close-out date should be
entered into AQS in conjunction with the issuance of the close-out letter, as the last step prior to
archiving the completed TSA project file.
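As a minimal sketch only (the field names below are illustrative and do not represent the actual AQS schema), the five reported parameters and the close-out convention described above can be represented as follows:

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class TSARecord:
        """Illustrative container for the five TSA parameters reported to AQS."""
        monitoring_org_audited: str
        auditing_org: str
        audit_begin: date           # date the on-site investigation begins
        audit_end: date             # date the draft findings summary is issued
        close_out: Optional[date]   # None (blank) until all corrective actions
                                    # are complete and the close-out letter issued

    # Example: a TSA with outstanding corrective actions has no close-out date.
    record = TSARecord("Example State DEQ", "EPA Region X",
                       audit_begin=date(2017, 5, 1),
                       audit_end=date(2017, 7, 15),
                       close_out=None)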

Technical System Audit Finding Response Form

Audit Title: (insert agency name)
Audit Date:
Finding #: (insert finding number from audit report / corrective action plan)
Description of finding: (insert description included in the audit report)
Describe actions taken to correct finding and include date action was completed:
Identify or list any additional documentation (photos, copies of certificates or documents) provided to
EPA to verify completion of this finding, and include them as attachments when submitting this form.
Verified/Prepared by:                                   Date:

THIS SECTION TO BE COMPLETED BY EPA STAFF

Reviewer's Comments:
Reviewed by:                                            Date:

Figure 6.2 Example Corrective Action Tracking Mechanism

7.0 TSA Training
While there is currently no standardized training required to conduct a TSA, each Region should have its
own curriculum to train auditors. This curriculum should initially include training on the CFR regulations,
the QA Handbook, and guidance documents; hands-on experience with monitoring equipment and procedures; and
shadowing of senior staff by new auditors. Shadowing senior audit staff allows the new auditor to follow the
process and observe the interaction between the monitoring organization and the auditor. Training may be
aided by a walk-through using in-house developed TSA checklists. Often, the best training originates
from case studies and experience acquired during real TSAs. New auditors should not lead an audit until
they have working knowledge of the monitoring and quality system requirements and several months of
experience shadowing or participating in TSAs.
In Regions with limited technical or quality system experience, it would be beneficial to take advantage
of any cross-training or shadowing with other Regions. A wide array of technical expertise exists within
the EPA Regions as a whole. Some Regions have been working on data analysis tools or best practices
that can be shared and added to the guidance document. Other Regions have auditors who were prior
air monitoring site operators or technicians. New lead auditors should reach out to TSA workgroup
members and take advantage of other EPA Regions' experience and OAQPS expertise.
There have been ongoing efforts to facilitate hands-on training with specific equipment and to develop
pollutant-specific checklists. For example, Met One beta attenuation monitor (BAM) 1020 training
has been offered by vendor personnel for each Region, and specific checklists have been developed by
OAQPS to aid in auditing this monitor. There is interest in, and support for, developing similar training and
checklists for other equipment as well. Where available, proficiency training provides an excellent opportunity
for new staff to learn and gain experience with monitoring equipment. All auditors should participate in
TSA, RO/OAQPS Monitoring, and QA conference calls to stay abreast of new information as it becomes available.
Training opportunities in the future include OAQPS-facilitated training with Regional expertise, a TSA
Workshop or TSA discussions at the National Ambient Air Monitoring Conferences, QA workshops, and
updated APTI training courses.

8.0 EPA Audit Communication
While communication between the audit team and the audit team's immediate supervisor(s) is generally
anticipated to flow freely throughout the course of planning, performing, and writing the TSA report,
there are discrete points where elevation to upper management within the Region, to other Regions, or to
OAQPS may be appropriate. This section highlights some areas and situations where this elevation may
occur. As a note, each Region may have different management communication policies, and this section
is presented as a best practice for addressing specific concerns resulting from TSAs.
8.1 Regional Management Elevation Points
For purposes of this section, regional management refers to EPA management above the audit team's
immediate supervisor. The number of levels of management that may be involved depends on how
the Region is structured and the potential impact of the TSA.
At the beginning of the fiscal year, EPA Regions should inform upper management of what organizations
will be audited that year. This information will assist in planning the coming year's priorities and in
directing the necessary funding and resources for the audits. Two to three months prior to each on-site
visit, the audit team's immediate supervisor should inform upper management of the upcoming TSA.
This may be followed a few weeks before the on-site visit by a short write-up emailed to upper
management, stating the specific dates of the on-site TSA.
Communication between the EPA audit team and the EPA Regional Office may be minimal during the
on-site portion of the TSA. However, it is common for the audit team to reach out to other EPA experts
(including OAQPS) during the on-site audit to answer technical questions or to request clarification on
regulatory requirements. The EPA audit team will inform EPA Regional Office management (i.e.,
immediate supervisor(s) at a minimum) of any significant findings prior to communicating preliminary
findings with the monitoring organization. By doing this, Regional Office management will be informed
before the audit results are formally presented to the monitoring organization and can be better
prepared to address questions or concerns that may result following the exit briefing. During the TSA
exit briefing, EPA management (i.e., immediate supervisor(s) at a minimum) should participate via
phone or in person.
Upper management within the relevant EPA Regional Office may be informed of findings at a number of
different stages, depending on the sensitivity or potential implications of findings, and management
interest. Elevation may consist of a discussion between supervisor(s) and upper management, a formal
briefing, or an emailed summary and list of findings to upper management with an offer to meet and
discuss further.
Upper management may also need to be involved in the corrective action process between the EPA
Region and the monitoring organization. If a monitoring organization shows no progress or limited
progress during the TSA follow-up activities, the issue should be elevated to upper management for
resolution. Upper management may be able to offer solutions or facilitate change quicker and more
effectively than the audit team or the immediate supervisor.

Points at which the audit team and supervisors should consider whether elevation is appropriate:
•	Upon receipt of the monitoring organization's comments on the draft report
•	When fully-drafted findings are available, either as part of, or separate from, a completed draft
TSA report
•	Prior to sending the draft report to the monitoring organization
•	Prior to sending the final report to the monitoring organization
•	Upon receipt of the corrective action plan
•	During the corrective action process
•	When there are delays in receiving the Corrective Action Plan
•	When there are delays in implementing the Corrective Action Plan
•	When there is a lack of communication from the monitoring organization
•	When there is refusal to reconsider corrective actions that have been identified as ineffective
These elevation points may be different for each EPA Region depending on management and structure.
The intent of identifying these different points is to allow for flexibility in handling communication of
audit results. Instances when upper management involvement is specifically desired include findings
that could result in significant amounts of data invalidation, or those with data implications for other
EPA Regions.
8.2	Inter-Regional Elevation Points
Other EPA Regions will be contacted after issuing the final report if they are directly affected by the
findings, such as in instances where two regions share an MSA and the TSA findings result in data
invalidation and revised design values. Upper management should inform the other affected region(s),
and the audit team may be requested to contact their staff-level counterparts.
Other EPA Regions may also be contacted to share perspectives and provide input on how certain
situations might be addressed. Questions or topics for discussion may be shared on national EPA calls,
such as the monthly QA or RO/OAQPS Monitoring Calls, or during TSA Workgroup calls.
8.3	OAQPS Elevation Points
OAQPS may be contacted for clarification or guidance in interpreting CFR and national guidance
documentation during the on-site audit or as the audit team drafts the TSA report. The audit team
should feel free to contact OAQPS to gain knowledge on what precedents have been set in handling TSA
findings in order to promote consistency across the regions. The Region should elevate to OAQPS
important audit findings that could result in:
•	significant data invalidation affecting NAAQS attainment decisions;
•	a large amount of data qualification/flagging; or
•	serious deficiencies in the quality system.
It is especially important to report TSA results that may impact attainment decisions for the NAAQS.
OAQPS is responsible for calculating design values for the NAAQS monitoring network using data in AQS.

If audit findings demonstrate that data in AQS do not meet data quality requirements, action must be
taken to correct the dataset through the use of flags or null codes as appropriate for the data quality
issue. For this reason, it is imperative that these types of issues are raised to OAQPS so that only data of
acceptable quality are used for NAAQS decision-making purposes. If a TSA results in recommendations
to invalidate or flag large amounts of data, the EPA Region should also communicate how much data is
affected and when the data will be changed in AQS.

Appendix A
Technical Systems Audit Questionnaire
The following questionnaire incorporates the 40 CFR ambient air monitoring requirements and
relevant EPA guidance and is intended to be used as a tool in conducting the TSA of an ambient
air monitoring organization. This questionnaire is available
on EPA's AMTIC website as a fillable Word document and also as an Excel document. Both
versions contain the same content and the user may choose to use either format. The
questionnaire may be edited for specific regional use.
Note: The criteria documented in this questionnaire were current as of the publication date of this
document. Users should verify that the criteria have not been updated in the CFR or other
guidance documents, such as the QA Handbook, before use. If discrepancies exist, please
inform the author(s) of this document to recommend revision of this appendix.

Appendix A
Technical Systems Audit
Pre-Audit Questionnaire

Contents
1. General
   a. Program Organization
      a.1 Organizational Chart
      a.2 Key Position Staffing
   b. Facilities
   c. General Documentation Policies
   d. Training
      d.1 Training Plan
      d.2 Training Events
   e. Oversight of Contractors and Supplies
      e.1 Contractors
      e.2 Supplies
2. Quality Management
   a. Status of Quality Assurance Program
      a.1 QA Activities
      a.2 QC Acceptance Criteria
   b. Internal Performance Evaluation (PE) Audits
      b.1 Internal Audit Questions
      b.2 Internal Audit Procedures
      b.3 Certification of Audit Standards
      b.4 Audit Equipment
      b.5 Audit Acceptance Criteria
   c. Planning Documents Including QMP, QAPP, & SOP
      c.1 QMP Questions
      c.2 QAPP Questions
      c.3 SOP Questions
   d. Corrective Action
   e. Quality Improvement
   f. External Performance Audits
3. Network Management
   a. Network Design
   b. Siting
      b.1 Site Evaluations
      b.2 Site Non-Conformance
   c. Waivers
      c.1 Waiver Questions
      c.2 Waiver Types
   d. Documentation
4. Field Operations
   a. Field Support
   b. Instrument Acceptance
      b.1 Instrumentation
      b.2 Instrument Needs
   c. Calibration
      c.1 Calibration Frequency and Methods
      c.2 Calibration Questions
   d. Certification
      d.1 Flow Devices
      d.2 Certification Questions
      d.3 Ozone Traceability Diagram
      d.4 Calibrator Certification
   e. Repair
   f. Record Keeping
5. Laboratory Operations
   a. Routine Operation
      a.1 Methods
      a.2 Quality System
   b. Laboratory Quality Control
      b.1 Standards
      b.2 Laboratory Temperature and Relative Humidity
   c. Laboratory Preventive Maintenance
   d. Laboratory Record Keeping
   e. Laboratory Data Acquisition and Handling
   f. Filter Questions
   g. Metals & Other Analyses
      g.1 Laboratory QA/QC
      g.2 Chemicals
      g.3 Pb
6. Data & Data Management
   a. Data Handling
   b. Software Documentation
   c. Data Validation and Correction
   d. Data Processing
      d.1 Reports
      d.2 Data Submission
   e. Internal Reporting
      e.1 Reports
      e.2 Responsibilities
1. General
Monitoring Organization/Agency/PQAO Audited:  ____
Address:  ____ (street, city, state, zip code)
Date of Technical Systems Audit:  ____
This section of the questionnaire completed by:  ____
Key Individuals (e.g., Agency Director, Ambient Air Monitoring Network Manager, QA Manager, etc.):
Title/Position | Name
____ | ____
a. Program Organization
a.1 Organizational Chart
Upload an organizational chart, or attach to the form:
a.2 Key Position Staffing
Enter the number of personnel available to each of the following program areas, and any vacancies, if
applicable.
Program Area | Number of People (Primary) | Number of People (Backup) | Vacancies
Network Management (site setup, siting, ANP, etc.) | ____ | ____ | ____
Field Operations (QC checks, site visits, site maintenance, etc.) | ____ | ____ | ____
Quality Management (audits, QA documentation, certifications, etc.) | ____ | ____ | ____
Data and Data Management (data review, validation and acquisition system, AQS, etc.) | ____ | ____ | ____
Technical Support (equipment repair and maintenance) | ____ | ____ | ____
Comment on the need for additional personnel, if applicable:  ____
b. Facilities
Identify the principal facilities where the agency conducts work related to air monitoring. Do not include
monitoring stations, but do include facilities where work is performed by contractors or other
organizations.
Ambient Air Monitoring Function | Facility Location | Comment on any significant changes to be implemented within the next one to two years
Instrument repair | ____ | ____
Certification of standards (e.g., gases, flow transfers, MFCs) | ____ | ____
PM filter weighing | ____ | ____
Pb analysis | ____ | ____
Data verification and processing | ____ | ____
General office space | ____ | ____
General lab/work space | ____ | ____
Storage space, short and long term | ____ | ____
Air toxics | ____ | ____
Indicate below any facilities that should be upgraded or any needs for additional physical space
(laboratory, office, storage, etc.):  ____
c. General Documentation Policies
Complete the following table.
• Does the agency have a documented records management plan?  Yes [ ]  No [ ]  Comment: ____
  • If yes, does this include electronic records?  Yes [ ]  No [ ]  Comment: ____
• Does the agency have a method to track files considered official records and their media type (i.e., paper, electronic)?  Yes [ ]  No [ ]  Comment: ____
• Does the agency have a schedule for retention and disposition of records?  Yes [ ]  No [ ]  Comment: ____
• Are records kept for at least three years?  Yes [ ]  No [ ]  Comment: ____
• Who is responsible for the storage and retrieval of records?  ____
• What security measures are utilized to protect records?  ____
• Where/when does the agency rely on electronic files as primary records?  ____
• What is the system for storage, retrieval and backup of the electronic files?  ____
d. Training
d.1 Training Plan
Complete the following table.
• Does the agency have a training plan?  Yes [ ]  No [ ]  Comment: ____
• Where is it documented?  ____
• Does it make use of seminars, courses, EPA-sponsored college level courses, etc.?  Yes [ ]  No [ ]  Comment: ____
• Are personnel cross-trained for other ambient air monitoring duties?  Yes [ ]  No [ ]  Comment: ____
• Are training funds specifically designated in the annual budget?  Yes [ ]  No [ ]  Comment: ____
Does the training plan include:
• Training requirements by position?  Yes [ ]  No [ ]  Comment: ____
• Frequency of training?  Yes [ ]  No [ ]  Comment: ____
• Training for contract personnel?  Yes [ ]  No [ ]  Comment: ____
• A list of core QA-related courses?  Yes [ ]  No [ ]  Comment: ____
d.2 Training Events
Indicate below the most recent training events since the last TSA and identify the personnel who
participated in them.
Event | Dates | Participant(s)
____ | ____ | ____
e. Oversight of Contractors and Supplies
e.1 Contractors
Complete the following table. If your agency does not use contract personnel, proceed to section e.2
Supplies.
• Who is responsible for oversight of contract personnel?  ____
• Are contractors providing a service (e.g., PM2.5 lab) audited? How often? Who is the contractor?  Yes [ ]  No [ ]  Comment: ____
• What steps are taken to ensure contract personnel meet training and experience criteria?  ____
• Are contractor quality documents reviewed before procuring a service?  Yes [ ]  No [ ]  Comment: ____
• How often are contracts reviewed and/or renewed?  ____
e.2 Supplies
Complete the following table.
• Have specifications been established for consumable supplies and/or for equipment?  Yes [ ]  No [ ]  Comment: ____
• What supplies and equipment have established specifications?  ____
• Is equipment from suppliers open for bid?  Yes [ ]  No [ ]  Comment: ____
2. Quality Management
This section of the questionnaire completed by:  ____
Key Individuals:
Title/Position | Name
____ | ____
a. Status of Quality Assurance Program
a.1 QA Activities
Complete the following table.
• Does the agency perform all QA activities with internal personnel (i.e., developing QMPs/QAPPs/SOPs and DQOs, performing systems audits, assessments and performance evaluations, corrective actions, validating data, QA reporting, etc.)? If no, in the comment field, indicate who is responsible and which QA activities are performed.  Yes [ ]  No [ ]  Comment: ____
• If the agency has contracts or similar agreements in place with either another agency or a contractor to perform audits or calibrations, please name the organization and briefly describe the type of agreement.  ____
• Does the agency perform all QC activities with internal personnel (i.e., zero/span/one-point QC checks, calibrations, flowrate, temperature, pressure and humidity checks, certifying/recertifying standards, lab and field blanks, data collection, balance checks, leak checks, etc.)?  Yes [ ]  No [ ]  Comment: ____
a.2 QC Acceptance Criteria
Complete the following tables.
• Has the agency established and documented criteria to define agency acceptable QC results?  Yes/No: ____  Location: ____  Comment: ____
Pollutant | Does the agency adhere to the critical QC acceptance criteria for criteria pollutants1 and meteorological measurements1? | QC Acceptance Criteria (if other than validation templates1) | Action or Warning Limits | Corrective Action
____ | ____ | ____ | ____ | ____
1 QA Handbook Volume II, Appendix D Validation Templates; Handbook for Air Pollution Measurement Systems,
Appendix C Validation Templates; Quality Assurance Handbook for Air Pollution Measurement Systems: Volume IV:
Meteorological Measurements, Version 2.0
b. Internal Performance Evaluation (PE) Audits
b.1 Internal Audit Questions
Complete the following table.
• Does the agency maintain a laboratory to support quality assurance activities?  Yes [ ]  No [ ]  Response: ____
• Has the agency documented and implemented specific audit SOPs separate from monitoring SOPs?  Yes [ ]  No [ ]  Response: ____
• Are the QA personnel organizationally independent from the personnel responsible for generating environmental data (40 CFR Part 58 Appendix A Section 2.2)? If no, please explain in the comment field.  Yes [ ]  No [ ]  Response: ____
• Are annual performance evaluations (audits) conducted by technician(s) other than the routine site operator(s) (40 CFR Part 58 Appendix A Section 3.1.2)? If no, please explain in the comment field.  Yes [ ]  No [ ]  Response: ____
• Does the agency have identifiable auditing equipment and standards (specifically intended for sole use) for audits?  Yes [ ]  No [ ]  Response: ____
• Are audit equipment and standards ever used to support routine calibration and QC checks required for monitoring network operations? If yes, please explain in the comment field.  Yes [ ]  No [ ]  Response: ____
b.2 Internal Audit Procedures
If the agency does not have a performance audit SOP (included as an attachment), please describe the
performance audit procedure for each type of pollutant.
Pollutant | Performance Audit Procedure
____ | ____
b.3 Certification of Audit Standards
Use the table below to provide information on certification of audit standards (e.g., flowmeters, gas
standards, etc.) currently being used.
Vendor | Audit Standard | Certification | Certification Frequency
____ | ____ | ____ | ____
Complete the following table.
• Does the agency have a separate certified source of zero air for performance audits?  Yes [ ]  No [ ]  Comment: ____
• Does the agency have procedures for auditing and/or validating performance of meteorological monitoring?  Yes [ ]  No [ ]  Comment: ____
b.4 Audit Equipment
Use the table provided below to list the agency's audit equipment and the age of the audit equipment.
Manufacturer | Make and Model Number | Purchase Year or Year Acquired
____ | ____ | ____
b.5 Audit Acceptance Criteria
Complete the following tables.
• Has the agency established and documented criteria to define agency acceptable audit results? If yes, comment where (page number, section, etc.).  Yes/No: ____  Location: ____  Comment: ____
Pollutant | Does the agency adhere to the audit acceptance criteria for criteria pollutants and meteorological measurements1? | PE Audit Acceptance Criteria (if other than validation templates1) | Do the audit levels (gaseous PE audits only) meet 40 CFR Part 58 Appendix A Section 3.1.2.1 criteria? | Corrective Action
____ | ____ | ____ | ____ | ____
c. Planning Documents Including QMP, QAPP, & SOP
c.1 QMP Questions
Complete the following table.
• Does the agency have an approved quality management plan (QMP)?  ____
  • If yes, who approves the QMP (EPA, self-approval, PQAO, etc.)?  ____
  • Is the QMP multi-media, or air-specific?  ____
  • Does the agency have any QMP revisions still pending EPA approval?  ____
  • Has the QMP been approved by EPA within the last 5 years?  ____
  • What is the approval date of the QMP?  ____
c.2 QAPP Questions
Complete the following table.
• Does the agency have an EPA-approved quality assurance project plan (QAPP)?  ____
  • If no, has the agency been delegated self-approval?  ____
• How often does the air monitoring agency review QAPPs?  ____
• Does the agency have any QAPP revisions still pending EPA approval?  ____
• How does the agency verify that the QAPP is fully implemented?  ____
• How are staff notified and trained when a QAPP is revised?  ____
• What personnel regularly receive updates?  ____
• Does the agency have any missing QAPPs that need to be developed?  ____
  • If yes, list any missing QAPPs.  ____
List all QAPPs.
QAPP Title | Approval Date | Pollutants | Status
____ | ____ | ____ | ____
c.3 SOP Questions
Complete the following tables.
• Are all standard operating procedures (SOPs) complete, or are some in development?  ____
• Are any monitoring SOPs needed?  ____
  • If yes, list the SOPs that need to be developed.  ____
• Are SOPs available to all field operations personnel?  ____
• Are SOPs for "episodic monitoring" prepared and available to field personnel? Refer to QA Handbook Vol. II, Section 6.0.  ____
• Are SOPs based on the framework contained in Guidance for Preparing Standard Operating Procedures (SOPs) (EPA QA/G-6)?  ____
• Does the agency have SOPs specific to data handling and data validation?  ____
• Who approves SOPs?  ____
• How often are SOPs reviewed and updated?  ____
• How are staff notified and trained when an SOP is revised?  ____
List all SOPs.
SOP Title | Approval Date | Pollutants | Status
____ | ____ | ____ | ____
d. Corrective Action
• Does the agency have an operational, documented, and comprehensive corrective action program in place?  ____
  • As a part of the QAPP?  ____
  • As a separate SOP, or as part of an SOP?  ____
• Does the agency have established and documented corrective action limits for QA and QC activities?  ____
• Are corrective action procedures based on results of the following that have exceeded established limits (see the note following this table)?
  • 1-Point QC checks  ____
  • Calibrations and zero/span checks  ____
  • Flow rate verifications  ____
  • Performance evaluations (gaseous audits and semi-annual flow rate audits)  ____
  • Precision goals (collocated PM2.5 and PM10)  ____
  • Bias goals  ____
  • NPAP audits  ____
  • PEP audits  ____
  • Completeness goals  ____
  • Data audits  ____
  • Technical Systems Audits  ____
• How is responsibility for implementing corrective actions assigned?  ____
• How does the agency follow up after corrective actions are implemented?  ____
• Briefly describe recent examples of the ways in which the above corrective action system was employed to remove problems.  ____
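As background for the QC items listed above (this note is illustrative and not part of the questionnaire), the screening statistic behind one-point QC checks and flow rate verifications is the percent difference defined in 40 CFR Part 58, Appendix A. A minimal sketch:

    def percent_difference(measured: float, audit: float) -> float:
        """Percent difference, d = (measured - audit) / audit * 100, the
        statistic used for one-point QC checks and flow rate verifications
        under 40 CFR Part 58, Appendix A."""
        return (measured - audit) / audit * 100.0

    # Example: an ozone analyzer reads 0.0695 ppm against a 0.070 ppm QC
    # check gas, giving d of about -0.7 percent; see the current validation
    # templates for the acceptance limits applied to each pollutant.
    print(f"d = {percent_difference(0.0695, 0.070):+.1f} %")

Corrective action limits are then set at, or inside, the acceptance criteria in the validation templates.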
e. Quality Improvement
• What actions were taken to improve the quality system since the last TSA?  ____
• Since the last TSA, do your control charts indicate that the overall data quality for each pollutant is steady or improving?  ____
• What was the cause when goals for measurement uncertainty per 40 CFR Part 58 Appendix A were not met?  ____
• Have all deficiencies indicated on the previous TSA report been corrected? If no, please list and explain.  ____
• What are your agency's plans for quality improvement?  ____
f. External Performance Audits
• Does your agency participate in the following external performance audits? If the agency does not participate, please explain why.
  • NPAP:  ____  Comment: ____
  • PM2.5:  ____  Comment: ____
  • PEP:  ____  Comment: ____
  • Pb-PEP:  ____  Comment: ____
  • Pb Strip Audit:  ____  Comment: ____
  • Ambient Air Protocol Gas Verification Program (AA-PGVP):  ____  Comment: ____
  • Round Robin metal PT:  ____  Comment: ____
• List other performance audit participation.  ____
• Who performs NPAP and PEP audits?  ____
3. Network Management
This section of the questionnaire completed by:  ____
Key Individuals:
Title/Position | Name
____ | ____
a. Network Design
For monitoring organizations and agencies that do not submit the annual network plan required by 40
CFR 58.10, please complete the table below. For those that do submit an annual network plan, proceed
to section b.
Site Name | AQS Site ID # | Pollutants Monitored | Proposed Changes
____ | ____ | ____ | ____
b. Siting
b.1 Site Evaluations
Complete the following table.
• Does the current level of monitoring effort, station placement, instrumentation, etc., meet requirements imposed by current grant conditions?  Yes [ ]  No [ ]  Comment: ____
• Are there any issues?  Yes [ ]  No [ ]  Comment: ____
• How often are site evaluations for 40 CFR Part 58 Appendix E criteria conducted?  Frequency: ____  Date of last review: ____
b.2 Site Non-Conformance
Please list any monitors with siting non-conformances, the AQS ID numbers for those monitors, the type
of non-conformance and the reason(s) for the non-conformance. If none of your agency's monitors have
siting non-conformances, proceed to section c. Waivers.
Monitor | AQS Site ID # | Type of Non-Conformance | Reason(s) for Non-Conformance
____ | ____ | ____ | ____
c. Waivers
c.1 Waiver Questions
Complete the following table.
• Does your agency have any waivers?  Yes [ ]  No [ ]  Comment: ____
• Does your agency plan to request any waivers? If yes, identify waivers in the Comment section.  Yes [ ]  No [ ]  Comment: ____
• Has your agency obtained necessary waiver provisions to operate equipment which does not meet the effective reference and equivalency requirements?  Yes [ ]  No [ ]  Comment: ____
• Do any sites vary from the required frequency in 40 CFR 58.12?  Yes [ ]  No [ ]  Comment: ____
• Do any collocated PM2.5 sites exceed the distance requirements in 40 CFR Part 58, Appendix A, Section 3.2.3.4(c)? Waiver allowances can be found in 40 CFR Part 58, Appendix A, Section 3.2.3.4(c).  Yes [ ]  No [ ]  Comment: ____
c.2 Waiver Types
Indicate any waivers requested or granted by the Regional Office, and provide waiver documentation. If
your agency does not have any waivers, proceed to section d. Documentation.
Waiver Type | Reason
____ | ____
d. Documentation
Complete the following table.
• Are hard copy or electronic site information files retained by the agency for all air monitoring stations within the network?  Yes [ ]  No [ ]  Comment: ____
Does each station have the required information including:
• AQS Site ID Number?  Yes [ ]  No [ ]  Comment: ____
• Photographs of the four cardinal compass points?  Yes [ ]  No [ ]  Comment: ____
• Startup and shutdown dates?  Yes [ ]  No [ ]  Comment: ____
• Documentation of instrumentation?  Yes [ ]  No [ ]  Comment: ____
• Who has custody of the current network documents?  Name: ____  Title: ____
4. Field Operations
This section of the questionnaire completed by:  ____
Key Individuals (e.g., Field Manager, Field Supervisor, Field QA Manager, etc.):
Title/Position | Name
____ | ____
a. Field Support
Complete the following table.
• On average, how often are most of your stations visited by a field operator?  ____
  • Is this visit frequency consistent for all air monitoring data collecting organizations within your agency?  Yes [ ]  No [ ]  Comment: ____
• On average, how many stations does a single operator have responsibility for?  ____
• How many sites (SLAMS/NCore/SPM) have a sampling manifold?  ____
  • Do the sample inlets and/or manifolds meet the requirements for through-the-probe audits?  Yes [ ]  No [ ]  Comment: ____
• Briefly describe the most common manifold type.  ____
• How often are manifolds cleaned?  ____
  • Is there a conditioning period after the manifold cleaning?  Yes [ ]  No [ ]  Comment: ____
• Are manifolds equipped with a blower?  Yes [ ]  No [ ]  Comment: ____
• Is there sufficient air flow through the manifold/sampling lines at all times?  Yes [ ]  No [ ]  Comment: ____
• How is the air flow through the manifold/sampling line monitored?  ____
• What is the average residence time (see the note following this table)?  ____
• How often is the residence time calculated?  ____
• Sampling lines:
  1) What material is used for instrument sampling lines?  ____
  2) How often are sampling lines changed?  ____
• Do you utilize uninterruptable power supplies or backup power sources at your sites?  Yes [ ]  No [ ]  Comment: ____
  • What instruments or devices are protected?  ____
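As context for the residence-time questions above (this note is illustrative and not part of the questionnaire), manifold residence time is commonly estimated as manifold volume divided by the total flow drawn through it, and 40 CFR Part 58, Appendix E limits residence time to 20 seconds for reactive gas monitoring. A minimal sketch of the arithmetic, using made-up dimensions:

    import math

    # Hypothetical example: a 200 cm long cylindrical manifold with a 5 cm
    # inner diameter, with analyzers and blower drawing 30 L/min in total.
    length_cm = 200.0
    radius_cm = 2.5
    total_flow_lpm = 30.0

    volume_l = math.pi * radius_cm**2 * length_cm / 1000.0  # cm^3 to liters
    residence_time_s = volume_l / total_flow_lpm * 60.0     # minutes to seconds

    print(f"volume = {volume_l:.2f} L, residence time = {residence_time_s:.1f} s")
    # Prints roughly 3.93 L and 7.9 s, below the 20-second limit.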

b. Instrument Acceptance
b.1 Instrumentation
Please list the instruments in your inventory.
Pollutant | Number of Instruments | Make and Models | Reference or Equivalent Number
____ | ____ | ____ | ____
b.2 Instrument Needs
Please list your instrument needs in order of priority.  ____
c. Calibration
c.1 Calibration Frequency and Methods
Please indicate the frequency and method of multi-point calibrations.
Pollutant | Frequency | Calibration Method: Back of Instrument | Calibration Method: Through the Probe
____ | ____ | [ ] | [ ]
c.2 Calibration Questions
Please complete the following table.
• How are field calibration procedures documented and how are the results recorded?  ____
• Are calibrations performed in keeping with the guidance in Vol. II of the QA Handbook?  Yes [ ]  No [ ]  Comment: ____
• Are calibration procedures consistent with the operational requirements of appendices to 40 CFR Part 50 or with analyzer operation/instruction manuals? If no, why not?  Yes [ ]  No [ ]  Comment: ____
• Have changes been made to calibration methods based on manufacturer's suggestions for a particular instrument?  Yes [ ]  No [ ]  Comment: ____
• Do standards used for calibrations meet the requirements of appendices to 40 CFR 50 (EPA reference methods) and Appendix A to 40 CFR 58 (traceability of materials to NIST, SRMs or CRMs)? Comment on deviations.  Yes [ ]  No [ ]  Comment: ____
• Are all flow-measurement devices NIST-traceable?  Yes [ ]  No [ ]  Comment: ____
d. Certification
d.1 Flow Devices
Please list the authoritative standards used for each type of flow measurement, and indicate the
certification frequency of standards to maintain field material/device credibility.
Flow Device | Serial Number | Primary Standard | Certification Frequency | Use (calibration, audit, or spare)
____ | ____ | ____ | ____ | ____
d.2 Certification Questions
Please complete the following table.
• How are certifications performed (internally, by a vendor, or by a third party)?  ____
• Where do field operations personnel obtain gas standards?  ____
• How are the gas standards verified after receipt?  ____
• What equipment is used to perform calibrations (e.g., dilution devices)?  ____
  • Do the dilution air flow control and measurement devices conform to CFR requirements?  Yes [ ]  No [ ]  Comment: ____
• What traceability is used?  ____
• Is calibration equipment maintained at each station?  Yes [ ]  No [ ]  Comment: ____
• How is the functional integrity of this equipment documented?  ____
• Who has responsibility for maintaining field calibration standards?  ____
*Please provide copies of certifications of all standards currently in use from your master and/or
satellite certification logbooks (i.e., chemical, gas, flow, and zero air standards).
d.3 Ozone Traceability Diagram
*Please provide a flow diagram establishing traceability from the SRP (Level 1) to the ozone transfer
standards used in your network.
d.4 Calibrator Certification
Please list the authoritative standards and frequency of each type of dilution, permeation and ozone
calibrator and indicate certification frequency.
Calibrator | Primary Standard | Frequency of Certification/Calibration
____ | ____ | ____
e. Repair
Complete the following table.
• Who is responsible for performing preventive maintenance?  ____
  • Is special training provided to them for performing preventive maintenance? Briefly comment on background or courses.  Yes [ ]  No [ ]  Comment: ____
• What is the preventive maintenance schedule for each type of field instrumentation?  ____
• If preventive maintenance is MINOR, it is performed at (check one or more):  [ ] Field Station  [ ] Headquarters Facilities  [ ] Manufacturer  [ ] PQAO
• If preventive maintenance is MAJOR, it is performed at (check one or more):  [ ] Field Station  [ ] Headquarters Facilities  [ ] Manufacturer  [ ] PQAO
• Does the agency have service contracts or agreements in place with instrument manufacturers? Indicate in the Comment section or attach additional pages to show which instrumentation is covered.  Yes [ ]  No [ ]  Comment: ____
• Comment briefly on the adequacy and availability of the supply of spare parts, tools, and manuals available to the field operator to perform any necessary maintenance activities. Do you feel that this is adequate to prevent any significant data loss?  ____
• Is the agency currently experiencing any recurring problem with equipment or manufacturer(s)? If so, please identify the equipment or manufacturer, and comment on steps taken to remedy the problem.  Yes [ ]  No [ ]  Comment: ____
f. Record Keeping
Complete the following table.
• What type of station logbooks are maintained at each monitoring station (maintenance logs, calibration logs, personal logs, etc.)?  ____
• What information is included in the station logbooks?  ____
• Who reviews and verifies the logbooks for adequacy of station performance?  ____
• How is control of logbooks maintained?  ____
• Where is the completed logbook archived?  ____
• What other records are used? Comment on the use and storage of these documents.  ____
• Are calibration records (or calibration constants) available to field operators?  Yes [ ]  No [ ]  Comment: ____
*Please attach an example field calibration record sheet.
5. Laboratory Operations
This section of the questionnaire completed by:  ____
Laboratory Name:  ____
Laboratory Address:  ____
Key Individuals (e.g., Laboratory Manager, Laboratory Supervisor, Laboratory QA Manager, etc.):
Title/Position | Name
____ | ____
a. Routine Operation
a.1 Methods
In the table below, identify which of the following analyses are performed in the laboratory and state
the method used to conduct the analyses.
Pollutant | Method
____ | ____
Please describe areas where there have been difficulties meeting the regulatory requirements for any of
the above methods.  ____
a.2 Quality System
Complete the following table.
• Are procedures for the methods listed in section a.1 included in the agency's QA Project Plan?  Yes [ ]  No [ ]  Comment: ____
• Have the laboratory SOPs been reviewed and approved? If yes, in the comment section, indicate by whom (EPA, PQAO, etc.).  Yes [ ]  No [ ]  Comment: ____
• Are SOPs easily and readily accessible for use and reference within the laboratory? If not, where are the documents stored?  Yes [ ]  No [ ]  Comment: ____
• Does the lab have sufficient instrumentation to conduct the analyses?  Yes [ ]  No [ ]  Comment: ____
• Are separate facilities maintained for weighing the different sample types (e.g., hi-volume vs. low-volume), or is one weighing room utilized for all samples? Describe.  Yes [ ]  No [ ]  Comment: ____
• Does your laboratory hold certifications (EPA, NIST, State, NELAC, or other)?  Yes [ ]  No [ ]  Comment: ____
• Does your laboratory operate under a Quality Assurance Manual or equivalent document?  Yes [ ]  No [ ]  Comment: ____
• Does your laboratory participate in performance evaluation programs?  Yes [ ]  No [ ]  Comment: ____
• Does your laboratory have a corrective action process for non-conforming work?  Yes [ ]  No [ ]  Comment: ____
• Does your laboratory have a laboratory staff person assigned the role of Quality Assurance Officer?  Yes [ ]  No [ ]  Comment: ____
• Please describe needs for laboratory instrumentation.  ____
b. Laboratory Quality Control
b.1 Standards
Please identify the equipment and standards used in support of the gravimetric laboratory, including any
quality assurance standards (such as additional weight sets or portable RH/temperature probes).
Device | Pollutant | Brand (Make) | Model (Class) | Calibration/Certification Expiration Date
____ | ____ | ____ | ____ | ____
*Please have calibration/certification records available for all laboratory standards.
b.2 Laboratory Temperature and Relative Humidity
Complete the following table.
• What is the accuracy specification and recording time (e.g., 5 min. averaging time) of the temperature sensor (logger) used in the gravimetric laboratory?  ____
• What is the accuracy specification and recording time (e.g., 5 min. averaging time) of the RH sensor (logger) used in the gravimetric laboratory?  ____
• What is the accuracy specification for any RH/temp audit device used in the laboratory, if applicable?  ____
• Does the laboratory utilize an IR gun to obtain sample shipment temperatures?  Yes [ ]  No [ ]  Comment: ____
  • If yes, is the IR gun NIST-traceable? Provide the certification expiration date.  Yes [ ]  No [ ]  Comment: ____
• If the laboratory does not utilize an IR gun, what device is used to obtain shipment temperature? Please describe its traceability and provide a certification expiration date.  ____
c. Laboratory Preventive Maintenance
Complete the following table.
• Is preventive maintenance performed on laboratory equipment? If so, who has the responsibility for performing preventive maintenance?  Yes [ ]  No [ ]  Comment: ____
• If equipment maintenance is performed by laboratory staff, does the SOP detail the procedures to be followed? Provide the SOP title, date, and revision number where the procedures are found.  Yes [ ]  No [ ]  Comment: ____
• Is a maintenance log maintained for the balance?  Yes [ ]  No [ ]  Comment: ____
• Are service contracts in place for the balance?  Yes [ ]  No [ ]  Comment: ____
• If utilizing a weighing room, are service contracts in place for the climate control unit/HVAC?  Yes [ ]  No [ ]  Comment: ____
• Describe static control equipment utilized in the weighing room, if applicable.  ____
• Does the weighing room undergo routine cleaning activities? On what frequency?  Yes [ ]  No [ ]  Comment: ____
• Briefly describe the weighing room cleaning procedure.  ____
d. Laboratory Record Keeping
Complete the following table.
Question
Yes
No
Comment
Are all samples that are received by the
laboratory logged in?
~
~
Click or tap here to enter text.

Discuss sample routing (or attach a copy of the latest
SOP which covers this). Attach a flow chart on the next
page, if possible.
Click or tap here to enter text.

For the following 4 questions, select the medium used to c
medium is not listed, select "Other" and list the medium. 1
"N/A".
ocument various activities enlisted. If the
f the information is not recorded, select
• Environmental conditions, weighing session
results, balance checks, and weight checks?
Choose an item.

• Serial numbers of filters prepared for the field?
Choose an item.

• Serial number of filters returning from the field
for analysis?
Choose an item.

• General information about daily lab activities,
preventive maintenance procedures, and/or
other significant events in the laboratory that
may impact data quality or the data record?
Choose an item.

How are data records from the laboratory archived?
Click or tap here to enter text.

• Where?
Click or tap here to enter text.

• Who has the responsibility? (identify
person/position)
Click or tap here to enter text.

How long are records kept? Indicate the number of
months/years.
Click or tap here to enter text.

Does the laboratory SOP contain
procedures for sample chain-of-custody
(COC)?
~
~
Click or tap here to enter text.

• If yes, indicate the title, date, and revision
number, and where it can be found.
Click or tap here to enter text.

What type of COC record accompanies the samples?
Click or tap here to enter text.

Does the laboratory maintain original
COCs or copies?
~
~
Click or tap here to enter text.

Where are COCs filed?
Click or tap here to enter text.

34

-------
*If possible, attach a sample routing flow chart:
35

-------
e. Laboratory Data Acquisition and Handling
Question
Yes
No
Comment
Identify those laboratory instruments (e.g., balances,
temperature/RH loggers, etc.) which make use of
computer interfaces directly to record data.
Click or tap here to enter text.

Are QC data results readily available to the
analyst during a weigh session?
~
~
Click or tap here to enter text.

Do RH/temperature loggers record values
using paper chart records (chart wheels)?
If yes, where are the paper charts
maintained? Are they signed and dated?
~
~
Click or tap here to enter text.

What is the laboratory's capability with regards to data
recovery? In case of problems, can the laboratory
recapture data that may be lost in the event of
computer failure? Discuss briefly.
Click or tap here to enter text.

Does the laboratory maintain an SOP that
discusses how to use the laboratory's data
acquisition instrumentation? If yes, please
provide the SOP title, date, and revision
number.
~
~
Click or tap here to enter text.

36

-------
*Please attach a flow chart/diagram which illustrates the transcriptions, verifications, validations, and
reporting processes the data goes through before being released by the laboratory.
37

-------
f. Filter Questions
Question
Yes
No
Comment
Does the agency use filters supplied by EPA?
~
~
Click or tap here to enter text.

If the answer to the above question is No, do the filters utilized meet the specifications in 40 CFR Part 50? Who is the vendor? Be prepared to provide documentation to demonstrate acceptance testing results.
~
~
Click or tap here to enter text.

Are unexposed filters visually inspected via strong light from a view box for pinholes and other imperfections?
~
~
Click or tap here to enter text.

Are unexposed filters equilibrated in a controlled conditioning environment which meets or exceeds the requirements of 40 CFR Part 50? Describe the conditioning room/chamber.
~
~
Click or tap here to enter text.

How long is the conditioning period?
Click or tap here to enter text.

Briefly describe how exposed filters are prepared for conditioning.
Click or tap here to enter text.

Are exposed filters reconditioned in the same conditioning environment as the unexposed filters?
~
~
Click or tap here to enter text.

Are the temperature and relative humidity of the conditioning environment (i.e., weigh room or conditioning chamber) monitored? What is the resolution of the data collected (e.g., 1-minute, 5-minute, 1-hour, etc.)?
~
~
Click or tap here to enter text.

How often are balance checks performed?
Click or tap here to enter text.

Do the weights (mass reference standards) bracket the weights of the filters being utilized? What are the masses of the weight standards used?
~
~
Click or tap here to enter text.

To what sensitivity are filter weights recorded?
Click or tap here to enter text.

Are filters packaged for protection to and from the laboratory?
~
~
Click or tap here to enter text.

On average, what is the elapsed time in hours between the end of sampling and laboratory receipt?
Click or tap here to enter text.

In what medium are field measurements recorded (e.g., in a log book, on a filter form, or on standard forms)?
Click or tap here to enter text.

Briefly describe how and where exposed filters are stored after being weighed.
Click or tap here to enter text.

On what frequency are lab blanks utilized?
Click or tap here to enter text.
38

-------
Are chemical analyses performed on
filters? If yes, which? Where are these
additional analyses performed?
~
~
Click or tap here to enter text.

g. Metals & Other Analyses
If your laboratory completes lead (Pb) and/or other metals analyses, please complete the tables in this
section.
g.1 Laboratory QA/QC
Question
Yes
No
Comment
Are at least one duplicate, one blank, and
one standard or spike included with a
given analytical batch?
~
~
Click or tap here to enter text.

Briefly describe the laboratory's use of data derived
from blank analyses.
Click or tap here to enter text.

Are criteria established to determine
whether blank data are acceptable?
~
~
Click or tap here to enter text.

How frequently and at what concentration ranges does
the lab perform duplicate analyses? What constitutes an
acceptable agreement?
Click or tap here to enter text.

Please describe how the lab uses data obtained from
spiked samples, including the acceptance criteria (e.g.,
acceptable percent recovery).
Click or tap here to enter text.

Does the laboratory include samples of
reference material within an analytical
batch? If yes, indicate frequency, level,
and material used.
~
~
Click or tap here to enter text.

Are mid-range standards included in
analytical batches? If yes, describe the
frequency, level, and compound.
~
~
Click or tap here to enter text.

Are criteria for real time quality control
established that are based on the results
obtained for the mid-range standards
discussed above? If yes, briefly discuss
them below or indicate the document in
which they can be found.
~
~
Click or tap here to enter text.

Are appropriate acceptance criteria for
each type of analysis documented?
~
~
Click or tap here to enter text.

39

-------
g.2 Chemicals
Complete the following table.
Question
Yes
No
Comment
Are all chemicals and solutions clearly
marked with an indication of shelf life?
~
~
Click or tap here to enter text.

Are chemicals removed and properly
disposed of when the shelf life expires?
~
~
Click or tap here to enter text.

Does the laboratory purchase standard
solutions such as those for use with lead
or other metals analyses?
~
~
Click or tap here to enter text.

Are only ACS grade chemicals used by the
laboratory?
~
~
Click or tap here to enter text.

Comment on the traceability of chemicals used in the preparation of calibration standards.
Click or tap here to enter text.

40

-------
g.3 Pb
Question
Response
Comments
Is Pb analysis performed by a contract laboratory? If yes, provide the laboratory name in the comment section.
Choose an item.
Click or tap here to enter text.

What filter media is used for Pb analysis?
Choose an item.
Click or tap here to enter text.

Are filter samples visually inspected for defects (e.g., pinholes, tears, and non-uniform deposit)?
Choose an item.
Click or tap here to enter text.

Are filters invalidated if defects are found? If no, why not?
Choose an item.
Click or tap here to enter text.

Are tweezers used to handle filters? If yes, what material are the tweezers made of (e.g., Teflon, plastic, metal, etc.)?
Choose an item.
Click or tap here to enter text.

What extraction method is used for filters?
Choose an item.
Click or tap here to enter text.

What reagents are used to clean glassware?
Click or tap here to enter text.

List standards used for analysis.
Click or tap here to enter text.

Are filter lot blanks analyzed for Pb content at a rate of 20 to 30 random filters per batch of 500 or greater? (Only for filters not provided by EPA.)
Choose an item.
Click or tap here to enter text.

How often are MDLs determined?
Click or tap here to enter text.

How many replicates are used for MDLs?
Click or tap here to enter text.

Are MDLs calculated in accordance with 40 CFR Part 136, Appendix B? If not, why not? (See the worked example following this table.)
Choose an item.
Click or tap here to enter text.

Are waste HNO3, HCl, and solutions containing these reagents and/or Pb placed in labeled bottles and delivered to a commercial firm that specializes in removal of hazardous waste?
Choose an item.
Click or tap here to enter text.
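For reference, a minimal worked example of the 40 CFR Part 136, Appendix B calculation referenced above; the replicate count and standard deviation used here are hypothetical:

\[
\mathrm{MDL} = t_{(n-1,\ 0.99)} \times s
\]

where s is the standard deviation of n replicate spiked-sample results and t is the Student's t-value for n-1 degrees of freedom at the 99% confidence level. With a hypothetical n = 7 replicates, t = 3.143; if s = 0.002 µg per filter, then MDL = 3.143 × 0.002 ≈ 0.0063 µg per filter.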


41

-------
6. Data & Data Management
This section of the questionnaire completed by: Click or tap here to enter name.
Key Individuals:
Title/Position
Name
Click or tap here to enter text.

Click or tap here to enter text.

a. Data Handling
Complete the following table.
Question
Yes
No
Comment
Is there a procedure, description, or a chart
which shows a complete data sequence
from point of acquisition to point of
submission of data to EPA?
~
~
Click or tap here to enter text.

Are procedures for data handling (e.g., data
reduction, review, etc.) documented? If yes,
comment on where.
~
~
Click or tap here to enter text.

In what media (e.g., flash drive, telemetry, wireless,
etc.) and formats do data arrive at the data processing
location?
Click or tap here to enter text.

How often are data received at the processing location
from the field sites and laboratory?
Click or tap here to enter text.

Are any activities performed on the data before
they are released to the agency's internal data
processing?
~
~
Click or tap here to enter text.

How are data entered into the computer system (e.g.,
computerized transcription, manual entry, digitization
of strip charts, or other)?
Click or tap here to enter text.

For manual data, is a double-key entry
system used?
~
~
Click or tap here to enter text.

42

-------
"Please provide a data flow diagram indicating the data flow within the reporting organization.
43

-------
b. Software Documentation
Complete the following table.
Question
Yes
No
Comment
Does your agency use an AQS Manual?
~
~
Click or tap here to enter text.
Does the agency have information on the reporting of precision and accuracy data available?
~
~
Click or tap here to enter text.

What software is used to prepare air monitoring data for release into the AQS and AirNow databases? Include the names of the software packages, vendor or author, revision numbers, and the revision dates of the software.
~
~
Click or tap here to enter text.

What is the recovery capability in the event of a significant computer problem (i.e., how much time and data would be lost)?
~
~
Click or tap here to enter text.

Has your agency tested the data processing software to ensure its performance of the intended functions is consistent with the QA Handbook Volume II, Section 14.0?
~
~
Click or tap here to enter text.

Does your agency document software tests? If yes, provide the documentation.
~
~
Click or tap here to enter text.
44

-------
c. Data Validation and Correction
Complete the following table.
Question
Yes
No
Comment
Is there documentation in regards to data
that has been identified as suspect and
subsequently flagged?
~
~
Click or tap here to enter text.

Please describe what action the data validator will take
(e.g., flag, invalidate, etc.) if they find data that
exceeded QC criteria.
Click or tap here to enter text.
Please describe how changes made to data that were
submitted to AQS and AirNow are documented.
Click or tap here to enter text.

Who has signature authority for approving corrections?
Name: Click or tap here to enter text.
Program Function: Click or tap here to enter text.
What criteria are used to determine whether a data point
should be deleted or invalidated?
Click or tap here to enter text.

What criteria are used to determine if data need to be
reprocessed?
Click or tap here to enter text.

Are corrected data resubmitted to the
issuing group/record generator for cross-
checking prior to release?
~
~
Click or tap here to enter text.

d. Data Processing
d.1 Reports
Complete the following table.
Question
Yes
No
Comment
Does the agency generate data summary reports?
~
~
Click or tap here to enter text.

Please list at least three reports routinely generated, including the information requested below.
Click or tap here to enter text.

Report Title
Distribution
Period Covered
Click or tap here to enter text.
Click or tap here to enter text.
Click or tap here to enter text.
45

-------
d.2 Data Submission
Complete the following table.
Question
Yes
No
Comment
How often are data submitted to AQS?
Click or tap here to enter text.

How often are data submitted to AirNow?
Click or tap here to enter text.

Briefly comment on difficulties the agency may have encountered in coding and submitting data following the AQS guidelines.
Click or tap here to enter text.

Does the agency retain a hard copy printout of submitted data from AQS?
~
~
Click or tap here to enter text.

Are records kept by the agency for at least 3 years in an orderly, accessible form? If yes, does this include:
~
~
Click or tap here to enter text.

• Raw data
~
~
Click or tap here to enter text.

• Calculations
~
~
Click or tap here to enter text.

• QC data
~
~
Click or tap here to enter text.

• Reports: list which reports are used
~
~
Click or tap here to enter text.

Has your agency submitted data (along with the appropriate calibration equations used) to the processing center?
~
~
Click or tap here to enter text.

Are concentrations of PM10 corrected to EPA standard temperature and pressure conditions (i.e., 298 K, 760 mm Hg) before input to AQS? (See the worked example following this table.)
~
~
Click or tap here to enter text.

Are concentrations of PM2.5 and Pb reported to AQS under actual (volumetric) conditions?
~
~
Click or tap here to enter text.

Are audits on data reduction procedures performed on a routine basis? If yes, at what frequency?
~
~
Click or tap here to enter text.

Are data precision and accuracy checked each time they are calculated, recorded, or transcribed to ensure that incorrect values are not submitted to EPA?
~
~
Click or tap here to enter text.
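For reference, a minimal worked example of the standard temperature and pressure correction referenced in the PM10 question above; the concentration, pressure, and temperature values used here are hypothetical:

\[
C_{\mathrm{std}} = C_{\mathrm{act}} \times \frac{P_{\mathrm{std}}}{P_{\mathrm{act}}} \times \frac{T_{\mathrm{act}}}{T_{\mathrm{std}}}
\]

where P_std = 760 mm Hg and T_std = 298 K. For a hypothetical sample with C_act = 60 µg/m³ collected at P_act = 600 mm Hg and T_act = 298 K: C_std = 60 × (760/600) × (298/298) = 76 µg/m³.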



46

-------
e. Internal Reporting
e.1 Reports
What internal reports are prepared and submitted as a result of the audits required under 40 CFR Part
58, Appendix A?
Report Title
Frequency
Click or tap here to enter text.

Click or tap here to enter text.

What internal reports are prepared and submitted as a result of precision checks also required under 40
CFR Part 58, Appendix A?
Report Title
Frequency
Click or tap here to enter text.
Click or tap here to enter text.

Question
Yes
No
Comment
Do either of the audit or precision check
reports indicated above include a discussion
of corrective actions initiated based on audit
or precision check results?
~
~
Click or tap here to enter text.
e.2 Responsibilities
Who has the responsibility for the calculation and preparation of data summaries? To whom are such
summaries delivered?
Name
Title
Type of Report
Recipient
Click or tap here to enter text.
Click or tap here to enter text.
Click or tap here to enter text.
Click or tap here to enter text.
Identify the individuals within the agency responsible for reviewing and releasing the data.
Name
Program Function
Click or tap here to enter text.

Click or tap here to enter text.

Question
Yes
No
Comment
Does your agency report to the Air Quality
Index (AQI)?
~
~
Click or tap here to enter text.

Is data certification signed by a senior
officer of your agency?
~
~
Click or tap here to enter text.

47

-------
Conducting TSAs
Revision 0
Date: 11/2017
Page 101 of 105
Appendix B
Example Field Audit Logbook
The following example of a field audit logbook consists of a series of tables and checklists to
assist an auditor in investigating ambient air monitoring sites. The contents are as follows:
•	Monitoring site evaluation form
•	Site drawing form
•	Comment page
•	Photograph log
This logbook may be edited for specific regional use.
Note: The siting criteria documented in this template were current as of the publication date of
this document. Users should verify that the criteria have not been updated in the CFR or other
guidance documents before use. If discrepancies exist, please inform the author(s) of this
document to recommend revision of this appendix.

-------
LOGBOOK: Air Monitoring Site Evaluations
Technical Systems Audit
Agency Name:	
Audit Date:
Lead Auditor:
Audit Team
Print Name / Signature / Initials
1: (Team Lead)
2:
3:
4:
5:
Page 1 of 9

-------
Site Name
(AQS ID:
xx-xxx-xxxx)
Insert AMP 390
Report for specific site
Page 2 of 9

-------
MONITORING SITE EVALUATION FORM (MSEF) (Page 1/5)
Local Site Name:	 Initials:	 Date:	
EPA auditor should document in the Site Logbook - the time / date / purpose of visit / EPA representatives present [Y/N] Completed
Arrival Time:	Departure Time:	 Primary Operator:	
Observer(s):	
NETWORK(s): [~Criteria / ~NCore / ~Near Road / ~IMPROVE / ~CASTNet / ~NATTS / ~PAMS / ~Toxics]
SITE
[Y/N] Security Fence [Y/N] Razor/Barb Wire [Y/N/NA] Grass/Shrubs Cut [Y/N/NA] Bare Soil Area
[Y/N] Vandalism - [~Inside / ~Outside] Date:	 [Y/N] Police Report Filed
Issues:	
SHELTER - Interior
Arrival Temperature:	°C (from data logger) Operator Site Visits: 	per [week | month |	]
[Y/N] Leaking Roof [Damaged: ~Ceiling / ~Floor / ~Walls] [Y/N] Clean / Neat [Y/N] Fire Extinguisher
[Y/N] Insect / Wildlife Issues [Y/N] Thermometer (min/max) [Y/N] Gasoline (inside shelter)
Issues:
MONITOR(s):	Location: Exterior Samplers [~Roof / ~Ground / ~Not Present]
Monitor(s)
Manufacturer
Model
Serial Number








































MET: [~Sonic / ~Analog] - [~WS / ~WD / ~Temp / ~RH / Other:	Make:	] ~ Not Present
CALIBRATOR(s): ~ Not Present	[Y/N] Are QA/QC Check Gases Vented Outside Shelter?
QA/QC
Make
Model
Serial Number
Certification Date
Expiration Date






























Page 3 of 9

-------
MSEF (page 2/5): Local Site Name:	Initials:	 Date:	
Is any analyzer sampling shelter air through its calibration line? [Y/N] If yes, photo, document and notify agency mgr.
All Gas Standards Pass thru all Filters during: [Y/N] Calibrations [Y/N] Precision Checks [Y/N] Audits
Issues:	
CYLINDER GAS STANDARDS:	~ Not Present
VENDOR:	(If the PSI reading is < 200, the tank is considered empty and should not be in service)
QA/QC
Gas Standard
PSI Reading
Expiration Date
Standard Concentration
Serial Number
























Issues:	
SUPPORTING INSTRUMENTATION: Internal
[Y/N] Temperature Sensor [Y/N] Uninterruptible Power Supply [Y/N] On-Site Computer
Zero Air System: Commercial System (Make / Model): 	
Cartridge System: [~Silica Gel (~Pink / ~Blue) / ~Charcoal / ~Purafil / ~Hopcalite / Other:	]
[Y/N] Needs Service Last Service Date: 	 Condition: 	
Issues:	
Data Logger: [~8816 / ~8832 / ~8872 / Other:	]
Instrument(s) to Logger: [~Analog / ~Digital / ~Mixed] Communications: [~Cell Modem / ~DSL / ~Dial-up]
Strip Chart: [~Electronic / ~Paper / ~Both / ~No Access] [Y/N] Operator Proficiency [Y/N] Time Accurate
(Examine on chart: last calibration / precision check / audit - look for stability, concentration level, comments, etc.)
Issues:	
Probe Line(s): [~Replaced / ~Cleaned] - Frequency:	 Last Service Date: 	
[Y/N] Clean [Y/N] Heated [Y/N] Insulated [Y/N] Moisture [Y/N] Retractable [Y/N] Old / Unused Lines
[Y/N] Lo Flo Manifold -> [Y/N] Any Open Ports? -> How many analyzers using manifold? 	
Issues:	
RECORDS — At Site
Documents Available: [~Hardcopy / ~Electronic] - [~QAPP(s) / ~SOP(s) / ~Instrument Manual(s)]
Issues:	
Logbooks: [~Hardcopy / ~Electronic] - [~Site Log / ~Instrument Log / Other(s):	]
(Entries well documented?)
Charts / Papers on Walls: What do they Track? Up-to-date?	
Page 4 of 9

-------
MSEF (page 3/5): Local Site Name:
Initials:
Date:
SHELTER — Exterior	~ Not Present
Type: [~Freezer / ~Wood Building / ~Brick-Block / Other:	]
[Y/N] Needs Maintenance (specify) [Y/N] Tied Down [Y/N] Electrically Grounded [Y/N] Roof Railing
Roof Access: [Stairs (Interior/Exterior) / Ladder (attached/removable) / ~ Not Present] [Y/N] Loose Decking (Trip Hazard)
Issues:	
OUTDOOR SAMPLERS	~ Not Present
[Y/N] Locked [Y/N] Electrically Grounded [Y/N] Stabilized [Y/N] Clean Inside [Y/N] Head/Separator Clean
Operator / Log: VSCC/WINS Clean Schedule:	 PM10 Head Clean Schedule:	
Issue(s): 	
COLLOCATED SAMPLERS: ~ Not Present	(39.4 inches = 1 meter)
Pollutant
Flow (Hi/Lo)
Separation Distance (meters)












Collocated monitors must be within 4 meters of each other and at least 2 meters apart for flow rates greater than 200 liters/min or at least 1
meter apart for samplers having flow rates less than 200 liters/min to preclude airflow interference, unless a waiver is in place as approved by the
Regional Administrator pursuant to section 3 of Appendix A.
PROBE SYSTEM(s): External ~ Not Present
Inlet Type: [~Single Line / ~Dual Line / ~Bell Type (CAS design)]
Funnel(s): [~Rain Shield / ~Part of Probe] Funnel Material: [~Teflon® / ~Glass / ~Stainless Steel / Other:	]
Probe Line(s): [~Teflon® / Other:	] Probe Fitting(s): [~Teflon® / Other:	/ ~ Not Present]
Residence Time: (20 sec. max; see the worked example at the end of this section)
Pollutant(s)
Inlet Height (meters)
Inlet Location (Side of Shelter, Ground, Roof)
Horizontal Distance (meters), If Applicable
Vertical Distance (meters), If Applicable
Monitoring SCALE: AQS
Monitoring SCALE: Annual Network Plan



































FOR Horizontal and Vertical Distances: Separation Distance = (1 meter for O3, CO, SO2, NO2) & (2 meters for PM, Pb)
When probe is located on a rooftop, this separation distance is in reference to walls, parapets, or penthouses located on roof.
Height of Roof: 	meters Roofing Material:
Issues:	
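For reference, a minimal worked example of the 20-second residence time check noted above; the line dimensions and flow rate used here are hypothetical:

\[
t = \frac{V}{Q} = \frac{\pi r^{2} L}{Q}
\]

For a hypothetical 3 m probe line with a 4 mm inside diameter (r = 0.2 cm, L = 300 cm), V = π × 0.2² × 300 ≈ 37.7 cm³; at a total analyzer draw of 1 L/min (≈ 16.7 cm³/s), t = 37.7 / 16.7 ≈ 2.3 seconds, well under the 20-second maximum.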
Page 5 of 9

-------
MSEF (Page 4/5): Local Site Name:	Initials:	 Date:	
OBSTRUCTION(s): The distance from the sampler or probe to an obstacle, such as a building, must be at least twice the height
that the obstacle protrudes above the sampler or probe. (See the worked example below the table.)
Obstacle Distance(s) (OD): All distances in meters; OD MUST be > [2*(OH-IH)]
Obstacle(s)
Obstacle Height (OH)
Sampler/Probe Inlet Height (IH)
[2*(OH-IH)]
Obstacle Distance (OD)















Please identify each of these obstacles in the SITE DRAWING (next page)
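For reference, a minimal worked example of the obstacle distance criterion above; the obstacle and inlet heights used here are hypothetical:

\[
OD_{\min} = 2 \times (OH - IH)
\]

For a hypothetical obstacle of height OH = 10 m near an inlet at IH = 4 m, OD_min = 2 × (10 - 4) = 12 m, so the obstacle must be at least 12 meters from the sampler or probe.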
TREE DRIPLINE(s):	_inches = 	meters (nearest inlet to dripline) ~ No Trees Present
(39.4 inches = 1 meter)		inches = 	meters (nearest inlet to dripline) ~ Not Present
	inches = 	meters (nearest inlet to dripline) ~ Not Present
Inlets should be greater than 20 meters from the dripline of tree(s) and must be 10 meters from the dripline when the tree(s) act as an obstruction.
Issues:	
UNRESTRICTED AIR FLOW:	Estimated Degrees of Clearance: 	°
Must have unrestricted airflow 270 degrees around the probe or sampler; 180 degrees if the probe is on the side of a building or a wall.
[Clearance diagram omitted; 10-meter reference]
Page 6 of 9

-------
MSEF (page 5/5): Local Site Name:
Initials:
Date:
SITE DRAWING - Please Indicate: (relevant distance / height measurements)
~	Direction NORTH
~	Primary Wind Dir
~	Security Issues
~	Sloping Areas
~	Railroad Tracks
~	Monitoring Shelter
~	Probe Position(s)
~	Exterior Samplers
~	Met Tower
~	Security Fencing
~	Nearby Trees/Shrubs
~	Roadways
~	Buildings
~	Walls
~	Other Obstructions
~	Possible Sources
~	Paved / Unpaved Areas
~	Nearby Construction
~	Flues, Vents, Boilers
~	Meat Cooking
Page 7 of 9

-------
Local Site Name:	Initials:	Date:
Additional Comments:
Page 8 of 9

-------
PHOTO LOG: Local Site Name:	 Initials:	 Date:
Camera [~ EPA / ~ Personal - Owner:	] Make/Model:	
Filename:	Date:	Time:	Photographer:
Description:	
Filename:	Date:	Time:	Photographer:
Description:	
Filename:	Date:	Time:	Photographer:
Description:	
Filename:	Date:	Time:	Photographer:
Description:	
Filename:	Date:	Time:	Photographer:
Description:	
Filename:	Date:	Time:	Photographer:
Description:	
Filename:	Date:	Time:	Photographer:
Description:	
Filename:	Date:	Time:	Photographer:
Description:	
Filename:	Date:	Time:	Photographer:
Description:	
Filename:	Date:	Time:	Photographer:
Description:	
Filename:	Date:	Time:	Photographer:
Description:	
Filename:	Date:	Time:	Photographer:
Description:	
Page 9 of 9

-------
Conducting TSAs
Revision 0
Date: 11/2017
Page 102 of 105
Appendix C
Low-Volume Weighing Laboratory Audit Checklist
The following checklist incorporates the requirements of 40 CFR Part 58, the Appendix L
method, and the guidance contained in Quality Assurance Guidance Document 2.12. This
checklist is intended to be used as a guide to aid auditors in auditing low-volume PM10 or PM2.5
gravimetric laboratories.
Note: The criteria documented in this template were current as of the publication date of this
document. Users should verify that the criteria have not been updated in the CFR or other
guidance documents, such as QAGD 2.12, before use. If discrepancies exist, please inform
the author(s) of this document to recommend revision of this appendix.

-------
LOW-VOLUME GRAVIMETRIC WEIGH LAB SYSTEMS REVIEW
Analyst:	Audit Date:
Location:	Auditor:
Audit Questions
Response
Comments
Yes
No
N/A
I. Routine Operations & Site Housekeeping
List any visible sources that may impact the weigh lab.
Is the room access restricted? Is the equipment located
inside the weigh room limited to that which is required
for daily weighing operations?


Is there an anteroom? Describe its condition.


What is being done to control drafts in the weigh
room? Is the microbalance located so that it is not
impacted by drafts?


Is the weighing table stable so vibrations do not affect
the balance?




Is the balance checked to determine if it is in fact
leveled? If so, at what frequency? Visually inspect
balance to determine if level.




Is the balance grounded?




Is the balance left in the "On" position at all times?




Is the autocalibration feature on the microbalance on or
off?


What is the readability and repeatability of the
balance?


Is the balance certified annually by an outside source?
When was the balance last certified? Where are the
records of this maintained? Request copy of
documentation.




What anti-static prevention device(s) is in place?


Are polonium strips used to eliminate static? Are they
purchased in advance? If so, how far in advance?
How often are they replaced? What is the date stamped
on the current strips?




What device (LIMS, datalogger, etc) is used to
monitor temperature and RH readings? What is the
resolution of data collected? What procedure is
utilized to verify these readings and at what frequency?


1

-------
Is the datalogger calibrated/certified annually? When
was the last certification? Where are the records
maintained? Request copy.
At what frequency are the temperature and humidity
sensors certified by an outside source? When was the
most recent certification? Where are these records
maintained? Request copies of documentation.
Is the temperature maintained at 20-23 °C, with a
temperature control range of < 2 °C over a 24-hour
period? How is control demonstrated?
Is the relative humidity maintained at 30-40%, with a
standard deviation of <5% over a 24-hour period?
How is control demonstrated?
How and where is temperature & RH data review
documented? Frequency?
Are pre & post sampling RH differences calculated?
Where is this documented?
If the temperature or RH is found to be out of
specification, what corrective action is taken? Are
weigh sessions halted?
Are maintenance/service contracts in place for climate
control unit, sensors, and software in the lab?
Is electronic data backed-up at a defined frequency?
How long? Where is it stored?
Print and review temperature and RH graphs for three
prior weigh sessions (one for each year under review).
Obtain the 24-hr means and SDs for temp and RH for
those sessions.
Are there two sets of weights being used: a primary set
and a working set? Are they at least Class 2 weights?
Do the weights bracket the weight of the filters being
utilized?
How frequently are the weights recertified by an
outside source? Where is this documented? Request
copies of documentation.
2

-------
At what frequency are the working weights verified
against the primary weights? Where is this
documented? Request copies of documentation.


Who is responsible for ensuring that all standards are
certified at their required frequencies? How is this
tracked?

Is a logbook used to document environmental
conditions, weighing sessions, balance checks, weight
checks, lot blank stability test results, etc, as well as lab
maintenance activities? If not, how and where is this
information documented? Are entries signed and
dated?




On what frequency and how is the weigh lab cleaned?
Describe daily, monthly, & yearly cleaning regimes.


How are sample cassettes, including stainless steel
backing screens, cleaned? At what frequency are they
cleaned? Are they inspected for cracks or other
damage? Do both halves fit tightly together when
assembled?

Is there a separate location designated specifically for
cooler packing/unpacking? If so, describe its location
and condition.




II. Sample Conditioning
Where are new lots of filters stored once they are
received from EPA?


Describe the lot blank test procedure.

What were the results of the most recent lot stability
test? Request copy of documentation.

Are all filters visually inspected for defects both pre
and post-sampling? What technique is used to inspect
the filters? What criteria would cause a filter to be
rejected? Give examples.




Are all filters pre-conditioned prior to both the initial
and final weighing sessions? How long is the
conditioning period? Where are they conditioned? If
on a metallic shelf, is it grounded?




3

-------
How is the conditioning period affected if the weigh
room conditions are out of tolerance?
Are the filters conditioned in petri dishes or slides? Are
the lids on the slides or are they slightly ajar?
III. Sample Weighing
Is one person designated to weigh all sample filters? Is
there a back-up analyst? Does the same analyst weigh
the same filters pre- and post-?
Are filters sampled within 30 days of the initial (tare)
weigh? How is this tracked?
Are samples pre-assigned to a site? Or, is a site
assigned after a filter is deployed? Elaborate on how
filters are requested and distributed.
How are filters prepared for field deployment (i.e.,
loaded into sample bags or a magazine)? Where does
this activity occur?
Are samples maintained in a secure area at all times
after being delivered to the laboratory?
How are filters packed for shipment back to the
laboratory? What happens if a shipment (package
and/or individual filters) is damaged in transit?
Where are samples unloaded from the transport
containers? Is this area clean and secure?
Is the temperature of the cooler recorded at receipt in
the laboratory? Is the temperature device NIST-
traceable? Where is the temperature reading recorded?


Describe the post sample weigh time limits (i.e., 10
days/30 days) established for the lab and supporting
rationale.


If samples are refrigerated, is the temperature of the
refrigerator monitored? How is this accomplished? Is
the monitoring equipment verified/certified?
4

-------
Is a LIMS system used to record the weighing results?
If not, describe how each weighing session is
documented.
Are anti-static, powder-free gloves and lab coats worn
while handling sample filters?
Are Teflon forceps used to handle the sample filters?
How are they cleaned and at what frequency?
Are the same forceps used for handling the mass
standards as for handling sample filters?
How often are balance checks performed? What is the
tolerance (µg) for balance checks? How and where is
this documented?
If balance checks do not agree within ±3 µg, what
corrective action is taken?
How often are lab blanks weighed? If the lab blanks
are not within ±15 µg, what corrective action is taken?
Are lab blanks used more than once?
Are field blanks weighed with each session? Are final
weigh results within ±30 µg of the initial weigh? How
is this tracked? What corrective action is taken when
FBs are out of limits?
Are duplicate filters weighed with each session? What
is the acceptance limit (i.e., ±15 µg)? Are sampled
filters or blanks used?
What happens if a weighed filter appears to be an
outlier? How is it handled?
How are re-weighs documented?
Are trip blanks utilized? If so, at what frequency?
What are the acceptance limits for trip blanks (i.e., ±15 µg)?
5

-------
Following each manual weigh session, is the weighing
(batch) audited? If so, who does the audit? What
percentage of filters are reweighed? What limits are
used to determine good agreement?
Who does the lab analyst notify when discrepancies are
found and/or corrective actions are needed?
After the final weighing has been completed, how are
samples stored, and for what period of time are they
retained?
IV. Data Handling
Are chain of custody (COC) forms submitted by the
field technician for each sample? Are they signed by
all parties within the chain? Where are they
maintained? Who reviews the COC forms?
How are field flags/notes linked to the filter data?
How are these notes communicated to the data
reviewer?
Once filter weighing is complete, is a report listing the
sample concentrations generated for QA review? Does
it list any void/flagged sample(s) and the reason for
invalidating the sample(s)? Who is responsible for
generating this report? Who is responsible for
reviewing it?
Are concentrations verified to ensure data entry &
computations are correct? How many samples are
reviewed per batch? Describe the verification
procedure that is utilized.
Are control charts used? If so, detail the types of
control charts developed & how they are maintained.
At what frequency are the charts reviewed & by
whom? Where are these charts located?
During the data review process, does the reviewer
differentiate between critical and non-critical criteria
when flagging data? Describe this process.
During the data review process, are sampler
maintenance results, precision checks, and audit results
reviewed to determine if any samples should be
invalidated or flagged based on the results of these
activities?
6

-------
Are the filter, summary data, and interval data
downloaded from the instrument for each sample run?
Where is this data stored? Is the data reviewed as part
of the QA audit process?




How are corrective actions addressed? Are forms filled
out for corrective actions? Who reviews them in the
data validation chain? If corrections are made to data
as a result of corrective actions, how is this
documented & verified? Who is responsible for follow-
up?


Are exceptional events or impacts from nearby sources
documented? Where?

Once the data has been audited, are null codes and any
qualifiers applied to the sample reviewed? If so, who is
responsible for applying the codes/flags? Is this prior
to the data being uploaded to AQS?




Are internal performance & systems audits of the
weigh lab and supporting equipment (loggers,
balances, etc) performed? Who conducts these audits?
Describe the review process and how the results are
documented. How are staff notified of the audit
results? Where are the audit reports filed?




V. Other
Does the laboratory operate under an approved QMP,
QAPP, and SOP? What are the approval dates for the
current revisions?




What is the size of the particulate matter network for
which this weigh lab is in operation? Describe the
number of samplers and their operational frequency.
Approximate number of samples weighed per
month/year.

Does this weigh lab gravimetrically analyze samples
collected for PM10 and/or lead? If yes, then describe
any notable differences in procedures for the PM10
and/or lead samples.




Describe in detail the training the lab analyst has
received. Who trained the analyst? Which courses
have been taken? Has the analyst been trained on both
QC operation of the lab, as well as data
verification/validation procedures?

7

-------
Describe in detail the training the back-up weigh lab
analyst has received. How often do they weigh filters?
Additional Comments on Audit:
Auditor's Signature:
8

-------
Conducting TSAs
Revision 0
Date: 11/2017
Page 103 of 105
Appendix D
Laboratory Systems Review: Lead in Air
Audit Checklist
The following audit checklist is designed to aid the auditor in performing an audit of a Pb
laboratory. To use the checklist effectively, the auditor must be familiar with the requirements
of the specific FRM or FEM that is currently in use in the laboratory and supplement this
checklist with its specific method criteria. The checklist does not include specific QC criteria or
detailed procedural steps that will be unique to each method. The checklist includes
commonalities that exist in all of the Pb methods, as well as laboratory best practices. This is a
compilation of these method consistencies that the auditor can use to assist in auditing the
laboratory. This checklist should be tailored to the needs of the specific Pb analytical method
implemented in the laboratory under audit.
Note: The criteria documented in this template were current as of the publication date of this
document. Users should verify that the criteria have not been updated in the CFR or other
guidance documents before use. If discrepancies exist, please inform the author(s) of this
document to recommend revision of this appendix.

-------
LABORATORY SYSTEMS REVIEW: Lead in Air
Laboratory:	
Air Monitoring
Agency:	
Reference or Equivalent Method:	
Audit Questions
Response
I. Document Control
a) Is there an official, documented Standard Operating
Procedure (SOP) for this method? Is this a controlled
document? Describe how it is controlled.

b) Who is responsible for the updates to the SOP? At
what frequency is the SOP updated? Describe the
review process.

c) Are changes allowed to the SOP outside of a review
cycle? Describe this process, how the changes are
documented, and how staff are notified of any pertinent
changes.

d) Do changes to the SOP ever require updates of the
Laboratory Information Management System (LIMS)?
If so, what is the procedure for LIMS updates?

e) How are SOP updates communicated to staff?

f) When SOPs are updated for air analysis, is the Air
Monitoring Unit notified of the changes?

II. Training and Demonstrations of Competency
a) Are all analysts involved with lead-in-air sample
preparation and analysis properly trained? Review
available training records.


-------
b) Are Initial and Continuing Demonstrations of
Competency (DOC) required for each analyst?
Describe the DOC process.

c) Who maintains records of acceptable competency
demonstrations? Ask to review these records.

d) Are Method Detection Limit (MDL) Studies
performed? At what frequency? Describe the MDL
determination procedure. Ask to review records of
MDL studies performed for the audit time period.

e) Are Linear Dynamic Range (LDR) studies
performed? At what frequency? Describe the
procedure.

f) Who maintains documentation of LDR studies? Ask
to review records of studies performed during the audit
time period.

g) What is the reporting limit for lead in air? How is
the reporting limit determined?

III. Sample Receiving, Sample Log-in and Sample Custody
a) Describe how samples are received by the laboratory
(Are they shipped, are they in folders or envelopes,
who receives them, etc.)

b) Are samples received as a group (all samples
collected for one month) or are individual samples
submitted?

c) Is a COC form submitted for each sample?
(Request copies of COC form for review)


-------
d) Are samples logged in as received? If not, where are
samples stored prior to sample log-in? Following
sample log-in, where are samples stored?

e) Describe the sample storage location (are there any
sources of potential contamination).

f) Is access restricted to the custody, storage, and
analytical labs? How is it restricted?

g) Following sample preparation, where are the
extracts stored? Where are the unused portions of the
filters stored?

IV. Sample Preparation
a) Describe the sample preparation location. Are there
any sources of contamination present? Is there
sufficient room to perform all activities?

b) Preparation of the filter requires a strip to be cut
from the 8 X 10 filter. What size filter strip is used for
preparation? Ask for a demonstration of the cutting
procedure.

c) Are instruments (cutters, cutting boards, etc.)
cleaned in between filters? Describe this procedure.

d) What type of glassware/labware is used for sample
extraction? Is it acid washed? Is it certified clean by
the vendor (if so, is there a certificate available)? How
is cleanliness of the labware determined?

e) What ancillary equipment is utilized for sample prep
(hot block, hot plate, ultrasonic bath)? Describe the
condition of the equipment and any checks utilized to
verify proper operation.

f) Describe the sample preparation procedure.

g) Are duplicate samples prepared? Describe this
process.


-------
h) Are blank filters prepared and assessed for
background concentration? What level of background
is acceptable? Is this value used in the determination of
the final lead concentrations reported?

i) Are matrix spikes prepared? Describe spike
preparation procedures. At what frequency are spikes
prepared?

j) Is a quality control check sample from a second
source prepared? Is there a COA provided for the
QCS? (Request copy(s) for review) At what frequency
is the QCS prepared/analyzed?

V. Standards Preparation
a) Are stock standards prepared in-house or purchased
as certified standards from a vendor? If purchased, is a
Certificate of Analysis (COA) provided? (Request
copies of COAs for review.) How/where are COAs maintained?

b) How are expiration dates of the stock standards
established? Are stock standards ever utilized outside
of the expiration date? If so, describe the process for
verifying the standards and how this is documented.

c) At what frequency are working and/or calibration
standards prepared? How are expiration dates for these
standards established?

d) How is the traceability of the standard preparation
documented? (logbooks, controlled prep forms, LIMS).
Request copies of standard prep records for review.

e) Are all standards prepared in a matrix comparable to
the matrix of the samples following extraction?


-------
f) Are air displacement pipettes utilized for preparing
standards? Are they certified? At what frequency? Ask
to review certification records.

VI. Audit Strips
a) Are audit test strips, spiked at 30-100% and 200-300%
of the lead NAAQS, analyzed? Are these strips
prepared in-house or provided by an EPA contract
laboratory? At what levels of the NAAQS are the strips
analyzed (if prepared in-house)?

b) Describe the preparation of the audit strips if
prepared in-house. Review documentation of audit
strip preparation. Are standards and reagents
independent of the calibration standards utilized in the
preparation procedure?

c) At what frequency are the strips analyzed? Are
dilutions required to produce results for audit strips
within the calibration curve?

d) How are the results of the audit strips reported to the
air monitoring unit (spreadsheet, sample report)? In
what units are the results reported? Is the true value for
the audit strips reported with the results?

e) What criteria must be met for the audit strip results
to be considered acceptable? If the criteria are not met,
what corrective action is performed?

VII. Instrument Calibration/Quality Control
a) What instrumentation is utilized for analysis
(Make/Model)? Is the instrument capable of meeting
the method requirements?

b) How often is the instrument calibrated? How many
standards are used to calibrate the instrument?


-------
c) What is the requirement for the correlation
coefficient? What curve fit is used?

d) Is an initial check standard analyzed to verify the
calibration stability is within ±5%? Describe what corrective
action is taken if the calibration stability check exceeds
acceptance criteria. Is the standard prepared from the
same source as the calibration standards?

e) Are continuing calibration standards analyzed? At
what frequency? At what level(s)? What are the
acceptance criteria? Describe what corrective actions
are taken if the criteria are not met. Are the continuing
check standards prepared from the same source as the
calibration standards?

f) Describe the type(s) of blanks utilized in the
analysis. What are the acceptance criteria for each type
of blank? What corrective action is performed if blanks
exceed the acceptance criteria?

g) If duplicates are prepared and analyzed, what are the
acceptance criteria for duplicate analyses? What
corrective action is performed in the event the
duplicate results do not meet these criteria?

h) Describe how matrix spikes are assessed. What are
the acceptance criteria?

i) Are dilutions performed for samples that exceed the
calibration? If dilutions are performed, is a dilution
check standard included in the analytical batch? What
criteria are utilized to assess dilutions?

VIII. Lab Data Review
a) Following analysis, who is responsible for reviewing
and uploading data into the LIMS system? Describe
this process. How is the review documented?

b) Is the data reviewed by a secondary reviewer? What
does the secondary review process entail? How is this
process documented?


-------
c) If corrections to data are required as a result of
secondary review, who makes the corrections and how
is this documented?

d) Is data ever qualified? Who determines the
appropriate qualifiers? Are the qualifications reviewed
as part of the review process?

e) Does a list of qualifiers and their definitions
accompany the data?

f) Are the results of method QC reported with the
sample results?

g) Is an assessment of measurement uncertainty
provided with the final results?

h) What units are the results reported in?

h) How are final reports generated and disseminated to
the sample submitter?

Additional Comments on Audit:

-------
Conducting TSAs
Revision 0
Date: 11/2017
Page 104 of 105
Appendix E
Example Technical Systems Audit
Report Template
The following Technical Systems Audit report template is designed to include the desired
elements that each Technical Systems Audit report should include, and serves as an aid for
Regional auditors to use in preparing the TSA report. This template may be edited for specific
regional use.

-------
2017 Technical Systems Audit Report - Draft or
Final
Agency Name:
Agency Location:
Project Date:

Project Leader: NAME
ADDRESS
PROJECT ID:
Draft OR Final Report
Page 1 of 9

-------
Approvals:
Project Leader:
NAME	Date
Section
Approving Official:
NAME, TITLE	Date
Section
PROJECT ID:
Draft OR Final Report
Page 2 of 9

-------
Table of Contents
1.0 Executive Summary	4
2.0 Introduction	5
3.0 Commendations	6
4.0 Findings and Recommendations	6
4.1	FIELD OPERATIONS	7
4.2	LABORATORY OPERATIONS	7
4.3	RECORDS MANAGEMENT	8
4.4	DATA MANAGEMENT	8
4.5	QUALITY ASSURANCE	9
5.0 Conclusions	9
Appendix A: Response - Technical Systems Audit Questionnaire	10
PROJECT ID:
Draft OR Final Report
Page 3 of 9

-------
1.0 Executive Summary
Enter language here, followed by page break
PROJECT ID:
Draft OR Final Report
Page 4 of 9

-------
2.0 Introduction
On	, 2017, EPA Region	personnel conducted a TSA of the	
ambient air monitoring program. The audit team included 	 (lead auditor), and
Pursuant to 40 CFR Part 58, Appendix A, §2.5, TSAs of each Primary Quality Assurance
Organization (PQAO) are required to be conducted every three years; monitoring organizations
within a PQAO should be audited within 6 years (2 TSA cycles). 	operates its ambient
air monitoring program under the	PQAO, utilizing the quality assurance
project plans (QAPPs) and standard operating procedures (SOPs) established by	. The
purpose of this TSA was to assess	's compliance with established regulations governing
the collection, analysis, validation, and reporting of ambient air quality data. Data reviewed as part
of this TSA included that generated during the 2014-2016 calendar years. Data was queried from
EPA's Air Quality System (AQS) database prior to the on-site audit. SESD's Ambient Air
Monitoring Technical Systems Audit Form (i.e., questionnaire) was completed by	staff prior
to the on-site audit and is included as Appendix A of this report.
The audit included a review of data, recordkeeping, documentation, and support facilities housed
at the	office complex, located at	, (City, State). The	(PM10/PM2.5
gravimetric, or toxics/analytical) laboratory was audited as well. _#_ air monitoring stations were
inspected during the audit. The sites visited are listed below.
Common Site Name	AQS Identification
NAME	XX-XXX-XXXX
NAME	XX-XXX-XXXX
During the audit, the following personnel were interviewed.
•	NAME, Title
•	NAME, Title
The following AQS reports were reviewed in preparation for this TSA.
•	AMP 251: QA Raw Assessment Report (2014-2016)
•	AMP 256: QA Data Quality Indicator Report (2014-2016)
•	AMP 350: Raw Data Report (2014-2016)
•	AMP 350MX: Raw Data Max Values Report (2014-2016)
•	AMP 360: Raw Data Qualifier Report (2014-2016)
•	AMP 380: Site Description Report (2014-2016)
PROJECT ID:
Draft OR Final Report
Page 5 of 9

-------
•	AMP 390: Monitor Description Report (2014-2016)
•	AMP 430: Data Completeness Report (2014-2016)
•	AMP 480: Design Value Report (2016)
•	AMP 503: Extract Sample Blank Data (2014-2016)
•	AMP 504: Extract QA Data (2014-2016)
•	AMP 600: Certification Evaluation and Concurrence (2014-2016)
Additionally, the following quality documents were reviewed.
•	NAME, Control Number, Revision Number, Date
•	NAME, Control Number, Revision Number, Date
•	NAME, Control Number, Revision Number, Date
3.0 Commendations
Enter text here
4.0 Findings and Recommendations
The observations from this TSA were compared to EPA regulations, technical policies and
guidance, and the monitoring organization's quality system documentation.
Quality system deviations found through this TSA are classified into three categories: Findings,
Concerns, and Observations. These quality system deviations are defined as follows:
Finding:
Departure from or absence of a specified requirement (regulatory, QMP,
QAPP, SOP, etc.) or guidance deviation which could significantly impact
data quality.
Concern:
Practices thought to have a potential detrimental effect on the ambient air
monitoring program's operational effectiveness or the quality of sampling or
measurement results.
Observation:
An infrequent deviation, error, or omission which does not impact the output
or the quality of the work product, but may impact the record for future
reference.
PROJECT ID:
Draft OR Final Report
Page 6 of 9

-------
For each of these categories, corrective action recommendations are provided. Corrective actions
are required for all quality system deviations ranked as Findings or Concerns. Depending on the
severity of the deviation, a specific data deliverable(s) may be requested to show that the corrective
action recommendation has been successfully implemented. In these cases, the TSA report will
specify the deliverable(s) that will be required for AQS and/or submitted to EPA. Observations
do not require corrective actions.
4.1	FIELD OPERATIONS
4.1.1	Finding:
Discussion:
Recommendation:
4.1.2	Concern:
Discussion:
Recommendation:
4.1.3	Observation:
Discussion:
Recommendation:
4.2	LABORATORY OPERATIONS
4.2.1	Finding:
Discussion:
Recommendation:
4.2.2	Concern:
Discussion:
Recommendation:
4.2.3	Observation:
PROJECT ID:
Draft OR Final Report
Page 7 of 9

-------
Discussion:
Recommendation:
4.3	RECORDS MANAGEMENT
4.3.1	Finding:
Discussion:
Recommendation:
4.3.2	Concern:
Discussion:
Recommendation:
4.3.3	Observation:
Discussion:
Recommendation:
4.4	DATA MANAGEMENT
4.4.1	Finding:
Discussion:
Recommendation:
4.4.2	Concern:
Discussion:
Recommendation:
4.4.3	Observation:
Discussion:
PROJECT ID:
Draft OR Final Report
Page 8 of 9

-------
Recommendation:
4.5 QUALITY ASSURANCE
4.5.1	Finding:
Discussion:
Recommendation:
4.5.2	Concern:
Discussion:
Recommendation:
4.5.3	Observation:
Discussion:
Recommendation:
5.0 Conclusions
Enter text here
	must develop a corrective action plan and timeline to address the findings and concerns
identified in Section 4 of this report and respond back to EPA within 30 days of receipt of the final
TSA report. Please note that the corrective actions do not have to be completed by this date, only
a plan to address the findings and concerns. Observations do not require corrective action and,
therefore, do not need to be addressed. If	anticipates that the development of the corrective
action plan will not be completed within 30 days after the receipt of the final TSA report, please
contact EPA to request an extension.
Page Break, then attach Appendix A
PROJECT ID:
Draft OR Final Report
Page 9 of 9

-------
Conducting TSAs
Revision 0
Date: 11/2017
Page 105 of 105
Appendix F
Example Technical Systems Audit Close-Out Letter
The following Technical Systems Audit Close-Out letter is designed to include the desired
elements that each TSA Close-Out letter may include, and serves as an aid for Regional auditors
to use in closing out the TSA. This template may be edited for specific regional use.

-------


-------
United States	Office of Air Quality Planning and Standards	Publication No. EPA-454/B-17-004
Environmental Protection	Air Quality Assessment Division	November 2017
Agency	Research Triangle Park, NC

-------