EPA OFFICE OF AIR QUALITY PLANNING AND STANDARDS

The QA EYE
ISSUE 16
JUNE, 2014

SPECIAL POINTS OF INTEREST:
•	August 2014 National Ambient Air Monitoring Conference
•	RP transactions going away for collocated data in 2015

INSIDE THIS ISSUE:
National Monitoring Conference	1
TSA Reporting to AQS	1
Zero Drift Guidance Changed	2
Pb PEP QAPP and SOP	4
PEP Re-certifications Completed	4
New Evaluation Technique for NATTS PT	5
When Does the Clock Start for Standard Certifications	7
Primary Monitors Identified in Two Places	7
RP Transactions Going Away for Collocated Data	7
AQS Transaction for QAPP Entry	8
NATTS TSA Coming to a Lab Near You	9
Primary Monitor Substitution in PEP Program	9
Posting CSN Flow Audit	10
2013 PEP Data in AQS	10
National Ambient Air Monitoring Conference Update
The EPA, in conjunction with
the National Association of
Clean Air Agencies
(NACAA), is continuing to
firm up the agenda for the
National Ambient Air Moni-
toring Conference the week
of August 11-14 in Atlanta, Georgia.
A draft agenda is currently
posted on AMTIC and we
have received abstracts for
the QA presentation ses-
sions. Acceptance emails for
the presentations went out
the week of June 9th. From
the QA side we are working
on the Monday QA101 train-
ing session and we plan on
restructuring it a little. First,
we have a whole day instead
of the half-day we had in
Denver. The extra time will
allow us to provide more
detail on our topics. We
don't plan to focus as much on the regulations (although they will be discussed) but rather on the overall quality system and those areas needing close attention. In
addition we plan to provide
more time for questions and
answers after each training
topic. On Tuesday afternoon
we are trying something new
with what is called the Pro-
gram Breakout Discussion
Centers. Some have consid-
ered this session the "speed-
dating for scientists". It will
be an opportunity to ask
questions in a group forum.
This looks to be a busy activi-
ty and we look forward to
your feedback on this session.
As mentioned, the half-day
QA presentation session has
a full speaker list with the first
afternoon session focused on
PM2.5 and the second session
a mix of topics.
The conference will be held
at the Atlanta Marriott Mar-
quis, located in Downtown
Atlanta, GA. The hotel con-
ference rate is $133 a night
for a single or double room,
plus applicable taxes
(currently 16% per room per
night). All reservations must
be made by Monday, July 21,
2014. After this date, reserva-
tions are subject to space and
rate availability. To make
your reservation, please call
the national reservation num-
ber at 1-800-228-9290 and
reference NAAMC or EPA
and the Atlanta Marriott Mar-
quis to receive the group
rate. Reservations can also be
made online by clicking here.
For additional information, go to AMTIC.
Technical Systems Audits Will be Reported to AQS
As part of the development of new QA
transactions, AQS has now included a QA
transaction for reporting technical systems
audits (TSA). At a minimum any EPA-
funded TSAs will be reported to AQS. This
includes NATTS, CSN and the EPA Region-
al TSAs. The QA transaction includes:
•	Performing agency
•	Monitoring Agency
•	Begin date
•	End date
•	Closeout date
The closeout date represents the date when all corrective actions (if any are required) have been implemented. Monitoring organizations are encour-
aged to use this reporting feature for inter-
nal TSAs.

-------
Zero Drift Guidance Changed
EPA has had some monitoring organizations ex-
press concern about the zero drift requirements in
the validation templates in the 2013 QA Handbook
for Air Pollution Measurement Systems Volume II Ambi-
ent Air Quality Monitoring Program. EPA will revise
this Handbook to provide zero drift acceptance
criteria guidance for 24-hour and 14-day intervals
as follows in Table 1.
Table 1. Revised 24-Hour and 14-Day Zero Drift Criteria

Zero Drift	Units	SO2	O3	NO2	CO
24-hour	ppm	0.003	0.003	0.003	0.4
14-day	ppm	0.005	0.005	0.005	0.6
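For monitoring staff who script their data review, the revised criteria in Table 1 translate into a simple pass/fail check. The short Python sketch below is illustrative only; the names (ZERO_DRIFT_LIMITS_PPM, zero_drift_ok) and data layout are ours, not part of AQS or the Handbook, and it assumes drift values are expressed in ppm as in Table 1.

# Illustrative check of a zero drift result against the revised Table 1 guidance.
# The limit values come from Table 1; everything else is an example only.

ZERO_DRIFT_LIMITS_PPM = {
    # pollutant: (24-hour limit, 14-day limit), both in ppm
    "SO2": (0.003, 0.005),
    "O3":  (0.003, 0.005),
    "NO2": (0.003, 0.005),
    "CO":  (0.4, 0.6),
}

def zero_drift_ok(pollutant, drift_ppm, interval_days):
    """Return True if the absolute zero drift is within the guidance criterion."""
    limit_24h, limit_14d = ZERO_DRIFT_LIMITS_PPM[pollutant]
    limit = limit_24h if interval_days <= 1 else limit_14d
    return abs(drift_ppm) <= limit

# Example: a 2 ppb (0.002 ppm) ozone zero after a 14-day interval passes;
# a -0.5 ppm CO zero after 24 hours does not.
print(zero_drift_ok("O3", 0.002, 14))   # True
print(zero_drift_ok("CO", -0.5, 1))     # False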
Background
The zero/span implementation frequency and
acceptance criteria are not identified in CFR and
are considered guidance. As such, during revision
of QA Handbook guidance, the EPA is able to work
with the monitoring organizations to change guid-
ance as needed. Over the years the zero guid-
ance has changed in the following ways:
1985-1998- No validation template developed but
the EPA espoused a 0-30 ppb requirement and a 0-
15 ppb requirement based on two different but
acceptable calibration techniques.
1998-2008 - Creation and use of a measurement quality objectives (MQO) table. Acceptance was ±20-30 ppb if the calibration was updated at each zero/span, or ±10-15 ppb if a fixed calibration was used.
2008-2013 - First validation template and acceptance criterion of ≤ ±3% of full scale.
2013-present - Due to the use of better technologies and trace-gas instruments, the zero drift guidance criterion was changed to ±1.5 ppb.
In 2008, the QA Handbook used a three percent of full scale criterion for the zero, which relates to the concentration scale at which the monitor operates.
Many gaseous analyzers have scales of either 1000
ppb or 500 ppb. Therefore 3% of full scale for
1000 ppb would provide an acceptance criterion of
30 ppb and 500 ppb would provide an acceptance
criterion of 15 ppb (similar to older Handbook
guidance). So up until the 2013 document, the
zero drift acceptance criteria were fairly wide.
For the 2013 QA Handbook revision, instead
of using a percentage of the scale of the in-
strument, we used a straight ppb (O3, SO2 and NO2) or ppm (CO) difference. This
seems to make sense since we should control
zero drift at an absolute value rather than
depending on instrument scale. However, we
drastically reduced the drift from 30 or 15 ppb to 1.5 ppb for O3, SO2 and NO2. In retro-
spect we may have been using 12- and 24-
hour performance specifications described in
40 CFR Part 53 for Federal Reference Meth-
ods (FRMs) and Federal Equivalent Methods
(FEMs) without considering that EPA guidance
allows for bi-weekly (14-day) zero checks.
Greater allowance for zero drift may be ex-
pected over two weeks compared to a 12- or
24-hour time period. After the Handbook
was posted, EPA received an email that the
CO acceptance criterion was incorrect. The
criterion for CO was unintentionally listed at
0.03 ppm rather than 0.3 ppm.
EPA asked the EPA Regions and monitoring
organizations to submit zero data from instru-
ments they operate. EPA received data from
monitoring organizations in Regions 1, 7, 8 and 9 and evaluated the information by two approaches:
Approach 1 (absolute value SD)
1.	For each site, take the absolute value of
each zero result and calculate a site mean
(Avg ABS Zero). In this manner positive
values and negative values do not cancel
each other out.
2.	Calculate the standard deviation of the
absolute value zero (ABS SD)
3.	Multiply the standard deviation by 2 or 3
and add this value to the site mean. This
is the biweekly zero acceptance criterion.
(2*SD+Avg, or 3*SD+Avg)
Continued on page 3

-------
ISSUE 16
PAGE 3
Zero Drift Guidance Changed (continued from Page 2)
Approach 2 (Pos/Neg SD)
1.	For each site, take the absolute
value of each zero result and
calculate a site mean (Avg ABS
Zero). In this manner positive
values and negative values do not
cancel each other out. This is the
same as in approach #1
2.	Calculate the standard deviation
of the zero data using the posi-
tive and negative values (P/N SD).
3.	Multiply the P/N SD by 2 or 3
and add this value to the site
mean. This is the biweekly zero
acceptance criterion.
In cases where there are positive and
negative zero values, Approach 2 will
create a higher biweekly acceptance
value.
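As a rough sketch of how the two approaches work on a single site's zero-check record, the Python fragment below computes both criteria. The function name and the example values are ours and the data are made up for illustration; they are not part of the EPA evaluation.

import statistics

def biweekly_zero_criteria(zero_results, k=3):
    """Compute a site's zero acceptance criterion under both approaches.

    zero_results: zero-check results for one site (ppb or ppm), signs retained.
    k: number of standard deviations added to the site mean (2 or 3).
    """
    abs_values = [abs(z) for z in zero_results]
    avg_abs = statistics.mean(abs_values)        # Avg ABS Zero

    # Approach 1: standard deviation of the absolute values (ABS SD)
    approach_1 = avg_abs + k * statistics.stdev(abs_values)

    # Approach 2: standard deviation of the signed values (P/N SD)
    approach_2 = avg_abs + k * statistics.stdev(zero_results)

    return approach_1, approach_2

# Made-up O3 zero checks (ppb); Approach 2 yields the larger criterion
# because the signed values scatter on both sides of zero.
site_zeros = [0.4, -0.6, 0.2, -0.9, 0.7, -0.3, 0.5]
print(biweekly_zero_criteria(site_zeros, k=3))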
Table 2 provides an aggregate of the
data we evaluated using the two ap-
proaches. Realizing the data set is
very limited and using Approach #2:
CO - The average zero daily drift is
0.09 ppm (within the 0.3 ppm 12- to
24-hour acceptance criterion) and the
3* SD of the positive/negative is 0.4
ppm. We propose to revise the 24-
hour zero drift to 0.4 ppm and allow a
bi-weekly drift of 0.6 ppm
NO2 - The average zero daily drift is
0.38 ppb (within the 1.5 ppb validation
template acceptance criterion) and the
3* SD of the positive/negative is 2.14
ppb. We propose to revise the 24-hour
zero drift to 3.0 ppb and allow a bi-
weekly drift of 5.0 ppb.
SO2 - The average zero daily drift is
0.39 ppb (within the 1.5 ppb validation
template acceptance criterion) and the
3* SD of the positive/negative is 1.73
ppb. We propose to revise the 24-hour
zero drift to 3.0 ppb and allow a bi-
weekly drift of 5.0 ppb.
O3 - The average zero daily drift is 0.58 ppb (within the 1.5 ppb validation template acceptance criterion) and the 3* SD of the positive/negative is 2.6 ppb. We propose to revise the 24-hour
zero drift to 3.0 ppb and allow a bi-
weekly drift of 5.0 ppb.
Based on the data received and adding
for a small margin of error, we feel
these are reasonable acceptance criteria.
The new acceptance values will take effect immediately. A technical memo
has been posted on AMTIC at: http://
www.epa.gov/ttn/amtic/cpreldoc.html
with more details of the data used in
the evaluation. In addition, we will be
providing a spreadsheet on changes we
are making to the Handbook on AMTIC
at http://www.epa.gov/ttn/amtic/
qalist.html. We have a document called
"Validation Template Tracking Table"
that we will use to update changes.
We strongly encourage monitoring net-
works to perform the zero/span checks
(and one-point QC) more frequently
than bi-weekly. The information that we used for the evaluation demonstrated that most organizations are performing these checks at higher frequencies than the required minimum. With the advent of automated zero/span/precision delivery systems, more frequent checks will help keep data quality within acceptable levels and reduce the potential for data invalidation.
NOTE: Some monitoring organizations
operating trace gas instruments have
asked if they can continue to use the
current more restrictive acceptance
criteria. The answer is yes. Monitor-
ing organizations can implement
"tighter" criteria as they see fit.
Table 2. Zero Drift Results from Monitoring Organization Data Submittals

		Using Absolute Value SD			Using SD Pos/Neg
Pollutant	Number of Monitors	Avg ABS Zero	ABS SD	2*SD+Avg	3*SD+Avg	P/N SD	2*SD+Avg	3*SD+Avg
CO (ppm)	17	0.091	0.098	0.288	0.386	0.112	0.321	0.436
NO2 (ppb)	10	0.377	0.519	1.414	1.933	0.586	1.549	2.135
SO2 (ppb)	16	0.386	0.410	1.209	1.614	0.449	1.283	1.732
O3 (ppb)	49	0.585	0.571	1.716	2.282	0.675	1.936	2.612
Correction in Issue 15
An article in Issue 15, page 8, titled "AQS Monitor Type Changes and Identifying Monitors for NAAQS Exclusion" had a typographical error. One monitor type was listed as "SPM-Other"; this is incorrect and should be listed as "Other".

-------
PAGE 4
Pb-PEP QAPP and SOP Revisions
Over the past few months, all of the critical Pb-PEP
documents, including the QAPP and SOPs, have been
reviewed, modified, and revised or are in the process
of revision. A big thanks to all of those in the Pb-PEP
community who have devoted time to this effort. The
QAPP has been revised to include the major changes
that have occurred in the program such as: using a
contract laboratory to support 46 mm Teflon® filter
analysis using XRF, including the use of the AIRQA
website for data submittal and approval, and the ap-
proval process itself. The SOPs are being updated to
reflect the changes in the QAPP and to include some
information that we have learned in the field to make
the procedure better.
The EPA Regions are presently going through the signature approval process for the documents as they are
completed and once completed they will be posted on
AMTIC.
Another Performance Evaluation Program Re-Certification in the Books!
Mesa Labs Acquires BGI Instruments
For those who have purchased or used BGI equipment: BGI has recently been acquired by Mesa Labs. An
article on this can be found at the Mesa Lab website.
During the week of April 7th, QA staff from Califor-
nia to Puerto Rico gathered at the OAQPS office in
RTP, NC for the Performance Evaluation Program
(PEP) Re-Certification. Our Performance Evaluation
Program includes the PM2.5 Performance Evaluation
Program (PM2.5 PEP), National Performance Audit
Program (NPAP), and the Lead Performance Evalua-
tion Program (Pb-PEP). The staff at the re-
certification represented Regional QA pro-
gram leads, state auditors, Puerto Rico QA
staff, Environmental Services Assistance
Team (ESAT) contractors, and support con-
tractors. The re-certification included several days of activities, including hands-on
demonstrations of proficiency, written test-
ing, discussions of changes and issues in the
programs, and upcoming challenges within
the programs.
This training had previously been held annu-
ally until 2012, but has since moved to every
two years due to funding limitations. The
expectation for the future is that we will
continue to offer this re-certification every
two years, but we will also re-evaluate the
frequency as needed. For those participating
in the PEP programs, this re-certification is a
required element that must be completed before
conducting these audits to keep consistency across
the programs. As a note, new staff members can be
trained and certified by Regional EPA Leads for the
QA programs in the off years when this training is
not offered at OAQPS. However, in the years this
re-certification is offered, we strongly encourage
attendance.
I would like to give a special thanks to the veteran
auditors who assisted the re-certification by partici-
pating as station leaders during the hands on portion
of the process. We couldn't have pulled it off with-
out the great help and expertise that you provided.
Thanks again! See you again in 2016... Greg Noah

-------
ISSUE 16
PAGE 5
EPA Developing Different Evaluation Techniques for NATTS Proficiency Test Data
Since the development of the NATTS Proficiency Test
(PT) Program, EPA has had some issues related to the
evaluation of VOC PT data. This article identifies another
potential evaluation tool that does not use an "assigned"
concentration to compare laboratories against but uses
the data from the NATTS Laboratories to evaluate
whether certain labs might be considered outliers from
the population.
What is a proficiency test? The following is a definition
from the Mayo Clinic Glossary:
A program in which multiple specimens are periodically sent to
a group of laboratories for analysis and/or identification. Each
laboratory's results are compared with those of other laborato-
ries in the group and/or with an assigned value and reported
to the participating laboratory and others (CLSI GP27-A2). The
PT is an evaluation of the ability of a laboratory to achieve a
correct test result when compared with other laboratories
using the same methodology. This is accomplished using the
laboratory's materials, personnel, equipment, environmental
conditions, and procedures through the analysis of unknown
specimens distributed at periodic intervals by an external
source
In reviewing a number of descriptions of proficiency test-
ing it is clear that there are two potential ways for evalu-
ating PT data: 1) evaluation against an assigned value
(referee lab result), and 2) evaluation against all other labs
in the study. EPA has attempted to use an evaluation
against an assigned value for a few reasons:
•	Laboratories participating are using different methods
which may produce different results, and
•	Canisters sent to the contractor for filling may be
contaminated at different levels which is out of the
control of the PT program. The PT, it was thought,
could be helpful in identifying this issue.
•	Using a lab mean approach would require averaging
the potential measurement uncertainties described
above into the evaluation that might mask specific
laboratory issues that need to be resolved.
However, EPA also recognizes that there are issues with
using one referee lab to provide an "assigned" value to
the constituents (pollutants) tested in the PT. The 2014
Quarter 1 VOC PT test seemed to indicate the referee
lab results were lower than all but one of the NATTS
laboratories participating in the program. EPA is looking
to increase the number of referee labs in future PTs and
is also planning on some additional studies. For example,
the referee laboratories receive new cleaned canisters
from one source while, as explained earlier, the PT labor-
atories have different cleaning techniques and different ages of the canisters sent for filling, which could potentially cause issues. EPA is considering having the referee labs test a portion of laboratory canisters prior to shipping them back to their home laboratories in order to perform additional evaluations.
Although using the laboratory mean of all PT labs participat-
ing in the program is a viable option for evaluating and com-
paring labs, another process has been used in the past and is
succinctly explained in the paper by W.J. Youden called
"Ranking Laboratories by Round Robin Tests". The proce-
dure is used to evaluate the performance of laboratories by
ranking them according to the magnitude of the results they
report in a series of test samples. The test is used to identi-
fy laboratories that consistently report low or high results.
This procedure was utilized for the CY 2013 (Quarters 1 and 3) and 2014 (Quarter 1) VOC PT results.
Procedure
This ranking procedure is very straightforward. In the Quarter 1 PT there were 14 NATTS laboratories and 15 pollutants measured:
1.	For each pollutant measured, sort the pollutant from
lowest concentration to highest.
2.	Assign the laboratory with the lowest concentration #1
and the highest #14. If you add all the ranks up, a total
of 105 points are assigned to a pollutant if all labs report
the pollutant. For some pollutants a lab did not report a
value so in that case 13 labs will be ranked and the total
points for that pollutant will be 91.
3.	Once all the pollutants are ranked for each lab, total the
points for each laboratory.
4.	Evaluate the points against Table III in the Youden Re-
port which is the approximate 5 percent probability
limits for ranking scores. Table 1 is a partial section of
Youden's Table from the report. (Continued on page 6)
Table 1. Approximate 5% Probability Limits for Ranking Scores (a partial section of Youden's Table III, giving lower and upper ranking-score limits by number of laboratories and number of materials; rows for 5-13 laboratories removed for convenience).
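For readers who want to reproduce steps 1-3 of the ranking procedure on their own results, the Python fragment below shows one way to compute the rank scores. The table layout and names are ours and the example data are hypothetical; step 4 (comparing each score against Youden's probability limits) is still done against Table 1.

import pandas as pd

def youden_rank_scores(results):
    """Steps 1-3 of the ranking procedure.

    results: DataFrame with laboratories as rows, pollutants as columns, and
    reported concentrations as values (NaN where a lab did not report).
    Returns the total rank score per laboratory, lowest first.
    """
    # Steps 1-2: rank each pollutant from lowest (1) to highest concentration;
    # labs that did not report a pollutant receive no rank for it.
    ranks = results.rank(axis=0, method="min", na_option="keep")
    # Step 3: total the ranks for each laboratory.
    return ranks.sum(axis=1).sort_values()

# Hypothetical three-lab, three-pollutant example (concentrations in ppbv)
data = pd.DataFrame(
    {"benzene": [0.61, 0.75, 0.70],
     "chloroform": [0.10, 0.18, 0.15],
     "vinyl_chloride": [0.05, 0.09, 0.07]},
    index=["lab_A", "lab_B", "lab_C"])
print(youden_rank_scores(data))   # lab_A = 3, lab_C = 6, lab_B = 9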

-------
NATTS PT Evaluation Technique (continued from page 5)
Results
Table 2 provides the results of steps 1-3 of the procedure for the Q1 2014 data. Since the names of some of the pollutants are quite long, Table 3 provides a crosswalk of the pollutant numbers provided in Table 2. The total points in the study were 1505 with an average score of 107.5.
Table 2. NATTS Lab Concentration Ranking, Quarter 1 2014 VOC PT
(Columns 1-15 are the pollutant numbers listed in Table 3; "-" indicates the lab did not report that pollutant.)

NATTS Lab	1	2	3	4	5	6	7	8	9	10	11	12	13	14	15	Total
01-01	5	4	10	4	3	6	7	2	2	7	6	10	10	7	8	91
01-02	10	12	12	10	6	9	8	9	9	10	9	12	11	8	10	145
01-04	-	9	8	9	8	10	4	4	4	5	10	5	12	6	3	97
02-01	8	5	2	6	14	4	10	10	5	4	4	6	8	13	6	105
03-01	9	8	13	8	4	1	12	5	6	3	2	2	4	9	7	93
03-02	1	13	11	11	7	14	13	7	13	6	5	7	14	14	13	149
04-01	7	7	5	5	5	8	5	3	7	9	7	8	7	5	4	92
04-02	12	1	3	1	2	3	2	1	3	1	1	4	3	3	2	47
04-04	3	3	4	13	13	13	3	13	10	8	11	13	13	4	9	133
05-03	11	14	14	2	10	5	14	14	12	12	12	3	1	10	14	148
06-01	2	10	9	12	11	11	9	6	3	13	13	11	5	12	11	138
09-03	13	6	6	7	9	12	6	12	-	-	-	-	2	2	12	87
10-02	4	2	1	3	1	2	1	11	1	2	3	1	6	1	1	40
11-01	6	11	7	14	12	7	11	8	11	11	8	9	9	11	5	140
Table 3. Pollutant Descriptions

Number	Pollutant
1	Acrolein
2	Benzene
3	1,3-Butadiene
4	Carbon Tetrachloride
5	Chloroform
6	1,2-Dibromoethane
7	1,2-Dichloroethane
8	Dichloromethane
9	1,2-Dichloropropane
10	1,3-Dichloropropene - cis
11	1,3-Dichloropropene - trans
12	1,1,2,2-Tetrachloroethane
13	Tetrachloroethylene
14	Trichloroethylene
15	Vinyl chloride
The 5 percent probability limits for a 14-lab study with 15 pollutants ("materials" as described in the Youden Report) are 67 for the low limit and 158 for the high limit. Table 4 illustrates that, based on the evaluation, labs 10-02 and 04-02 consistently report low results (< 67) compared to the population. No lab was considered to report consistently high results (> 158). EPA decided to evaluate the process using the 2013 Quarter 1 and Quarter 3 VOC results. Tables 5 and 6 are the summary results of the evaluation. Table 6 had one additional NATTS lab identified and so had a total of 15 labs. The probability limits for 15 labs with 15 constituents are 71 for the low limit and 169 for the high limit. Notice that in all three evaluations labs 10-02 and 04-02 are identified as producing consistently low concentrations compared to the population of laboratories. In Table 6, lab 01-03, which was the additional laboratory in that quarter (not in Tables 4 or 5), is identified as reporting consistently high results.
EPA plans to implement this approach
to provide additional insight into the PT
results. In the future it may decide to
report three or four versions of the
information:
1.	Comparison to NATTS Lab mean
2.	Comparison to nominal spiking val-
ue provided by the PT contractor
3.	Comparison by referee lab
4.	Comparison using Youden Ranking
Technique
Since the goal of the NATTS program is to detect trends in toxic pollutants, it is important to ensure that the data used in these trend assessments are comparable. Evaluating the PT results in a number of ways can help ensure this comparability.
Table 4. 2014 Q1 (95% PL 67-158)
Lab	Score
10-02	40
04-02	47
09-03	87
01-01	91
04-01	92
03-01	93
01-04	97
02-01	105
04-04	133
06-01	138
11-01	140
01-02	145
05-03	148
03-02	149

Table 5. 2013 Q1 (95% PL 67-158)
Lab	Score
01-04	50
04-02	54
10-02	60
03-01	61
11-01	92
01-01	98
04-01	115
04-04	121
02-01	124
09-08	137
09-03	139
06-01	149
03-02	152
05-03	153

Table 6. 2013 Q3 (95% PL 71-169)
Lab	Score
10-02	15
04-02	62
01-04	81
03-01	84
04-01	97
09-03	97
06-01	111
01-01	119
09-08	123
04-04	125
11-01	130
02-01	139
03-02	158
05-03	162
01-03	205

-------
PAGE 7
When does the Clock Start on a Certified Instrument
Recent discussions among OAQPS and the EPA Re-
gions brought to light some information regarding the
date to use for certification and recertification of
standards, particularly in regard to flow rate stand-
ards. Some vendors may be manufacturing flow rate
devices and certifying them on some specific day (e.g.,
01/01/2014). The instruments could sit on the shelf
for some period of time before being purchased (e.g.,
06/01/2014). There is at least one vendor whose paperwork suggests that the certification period for the purchased instrument can start on the date the monitoring organization operator first puts the standard into service, with the following entries:
Date Placed in Service
(To be filled in by operator)
Recommended Recalibration Date
(12 months from date placed in service)
EPA believes that the certification date is the date the instru-
ment is officially certified and not the date placed into service.
This includes both new purchases as well as re-certifications.
For new purchases, we suggest you work with the vendor to
include a certification along with the purchase price or mini-
mally ask what the certification date is of the instrument you
are purchasing.
WARNING...Primary Monitors Are Identified in Two Places... and Really Have the Same Meaning
The term "Primary Monitor" has re-
placed the use of identifying the
NAAQS monitor with the pollutant
occurrence code (POC). The primary
monitor is the monitor now identified
for use in NAAQS designations when
data for that monitor is available. For
any pollutant there can be one and only
one primary monitor designated at a
site for any given time period. As we
wrote in QA EYE Issue 14, if the prima-
ry monitor does not operate for the
day it was supposed to sample or col-
lect data, other monitors at the site can
substitute for the primary. If there is
only one monitor at a site for a particu-
lar pollutant, AQS will designate this
monitor as the primary. If a second
monitor is added, and the monitoring
organization decides the second monitor
needs to be the primary, it must go into
the Maintain Site Form and identify the
second monitor as the primary.
However, there is another form that is
used to designate what monitor the
required QA collocated monitor needs
to be compared to. Our 40 CFR Ap-
pendix A requirements also define the
primary monitor as the NAAQS moni-
tor.
In order to identify the correct collo-
cation pairing, the primary and the
collocated monitor must be identified
in the Monitor Collocation Form.
For any site the primary monitor
should be the same in both forms. EPA
has found some discrepancies so
please be mindful of this. AQS is
aware of this issue and is looking for
ways to use the primary monitor iden-
tified in the Maintain Site Form for QA
purposes.
Reminder #3 Elimination of RP Transactions for Collocated Data in 2015
Since 2006 (see QA EYE Issue 2, page 5), EPA has been advocating the use of primary monitors and the identification of the CFR-required collocated monitor in the collocations table, allowing the collocated data to be submitted as raw data and eliminating the need for monitoring organizations to submit a precision transaction (RP) for this information. Once the new QA transactions are completed (2014), use of the RP transaction for collocated data will be eliminated in December 2014.
In order to implement this reporting pro-
cedure, the primary monitor and the col-
located monitor must be identified in the
"Monitor Collocation Form" using the
"MJ" transaction for the primary and col-
located monitor. The National Air Data Group (NADG) provided a review of the use of both methods and discov-
ered that most organizations are using the
raw data transaction.
However, out of the 49 monitoring organizations that were using RP transactions, almost 50% (23) were also entering the collocated data as raw data; so either they are using both entry methods for the same data, which is not necessary, or they have different staff entering the data differently.
Contact the AQS helpline for further
information and help setting this up.

-------
ISSUE 16	PAGE 8
AQS QA Transactions for QAPP Entries
A number of new entries are becoming available for
submission of QA Data. A few new and very simple en-
tries involve the tracking of quality management plans
(QMPs), quality assurance project plans (QAPPs) and
technical system audits (TSAs). The QA EYE reported
some information on this back in Issue 14 (June 2013).
The EPA Regions will provide an evaluation date when
the QAPP was reviewed and a status. Status codes
are provided in the figure below. Once a QAPP has a
status of "approved" the evaluation date will be the
date of final QAPP approval.
When entry or edits are completed, click the SAVE icon.
Since QAPP data for the criteria pollutants was entered into AQS last year, it is ready for updates as necessary and is the farthest along reporting-wise. For entry, as well as editing, the following is a simple set of procedures:
1.	Select maintenance tab from the menu and then
select QA Assessments and finally QA Project Plans.
2.	Query the 4 digit Agency Code. Select a Parameter
Classification. The default is "Criteria" for the cri-
teria pollutants but other parameters can be en-
tered.
3.	Upon execution of the query, records for each cri-
teria pollutant monitored (active at the time or
within the begin and end date of the query) will be
retrieved. Depending on whether this is an initial
entry there may or may not be any information in
the "Submission Date" "Evaluation Date" or "Status"
fields. If the QAPP is a new submission, in most
cases, the monitoring agency will supply the submis-
sion date information.
Entry for QMPs follows a similar procedure with the exception that parameters are not identified, since the QMP represents the monitoring agency's overall quality system, not individual parameters.
For both QMPs and QAPPs, it is assumed that EPA Re-
gional offices will enter the evaluation dates and the status.
In cases where QAPP approval has been delegated back
to monitoring organizations, the monitoring organizations
can enter this data but it is expected that the EPA Regions
will concur with the information submitted to AQS.
In the case of the TSAs (see page 1):
•	The Performing Agency is the Agency responsible for
conducting the audit and/or follow-up.
•	The Monitoring Agency is the Agency being audited.
•	The begin and end date are the dates of the audit.
•	The closeout date is the date when all corrective ac-
tion (if necessary) has been implemented.
[Figure: AQS "Maintain QA Project Plans" screen, showing query criteria (Agency 0635, Maine D.E.P. Bureau of Air Quality Control, Augusta; Parameter Classification CRITERIA) and approved QAPP records for parameters 42101, 42602, 44201, and 88101. Available status codes: Approved, Conditionally Approved, Disapproved, Not yet reviewed.]

-------
"We are committed to producing a level of consistency and quality in the NATTS program which will provide data that is defensible, precise, and useful for making decisions."

NATTS Technical Systems Audits Coming to a Lab Near You...
Dave Shelow and I are finally teeing up the 2014
NATTS Technical Systems Audits (TSAs) and driv-
ing them to a lab near you! Beginning in June and
continuing throughout the year, our contractor,
Battelle, will be conducting TSAs at a third of the
NATTS labs and sites in the network. Each year we
plan on auditing a third of the network with a goal
of completing the network in three years. Last year,
we had a transition period where we only complet-
ed three audits, but we are back on track. Battelle
has compiled a tentative schedule for the audits and
will make every attempt to coordinate with the
agencies and Regions to allow participation for both
parties.
A primary goal in conducting these audits is to fos-
ter cooperation between OAQPS, the EPA Regions,
and the monitoring organizations participating in the
NATTS program. This is important because my
priority is to push hard for corrective action to
address the major findings in the TSAs. We spend a
great deal of time and resources in conducting these
audits and it doesn't make sense to identify short-
comings and potential improvements if we aren't
going to follow up. I'll be expecting the Regions to
play the primary role along with us at OAQPS in
ensuring corrective actions take place.
As we all know, audits reveal findings with varying
impact on the data, and it is sometimes difficult to
determine where to focus limited time and re-
sources. We have developed a ranking system for
the findings to help identify priorities for corrective
action. OAQPS and the Regions will review the
findings together and will rank them accordingly in
the audit report. The three categories and the defini-
tions are listed below:
1)	Findings- those findings that the audit team felt
were a major concern to the data collection activ-
ity.
2)	Observations- findings that are of less immedi-
ate concern but should be thought about when
revising or improving the agency's quality system.
3)	Recommendations- those observations that
could improve the quality system but would not
appear to affect data quality.
Using this convention, the audit results are organized
such that the "Findings" category should be addressed
first, then the "Observations", and finally the
"Recommendations". Our expectation is that the
Regions will take the lead and work with the agencies
to focus on and resolve the audit results as recom-
mended above.
Yes, this is NATTS. There will be discussions, opin-
ions, and the occasional argument over audit results.
One of the ancillary purposes of the audits is to facili-
tate discussion and find out where the NATTS Tech-
nical Assistance Document needs work. We are al-
ways open to listen to ideas that will improve the
NATTS program. Most of all, we are committed to
producing a level of consistency and quality in the
NATTS program which will provide data that is defen-
sible, precise, and useful for making decisions. Our
hope is that the TSAs will move us forward to this
goal. Greg Noah
Making the Best Use of Limited PEP Data... the Primary Substitute
In 2006 we reduced the requirements for the PM2.5 PEP program from 25% of each method designation 4 times a year to a requirement of 5 or 8 audits a year.
This reduction has put a premium
on the PEP values since we have
much less data to make bias as-
sessment's with adequate levels of
confidence. Therefore, when we
perform a PEP and then find out
that the primary sampler did not
produce a valid sample, we try
to perform another PEP in order
to meet data completeness crite-
ria. Since we are not always
aware that this invalidation has
occurred, EPA is contemplating
substituting a second monitor (if
one exists) at a site on days when
the primary has not produced a
valid sample. OAQPS will be
working with the Regions on the
most appropriate process for
doing this. We may attempt to
pair with like method designations
first or pair with the next lowest
POC. A similar procedure may
be used for the 2011-2013 PM2.5
QA Report that is currently under
development.

-------
ISSUE 16
Posting CSN Flow Audit Data in AQS
PAGE 10
" "V
We have made progress preparing a
streamlined procedure for monitoring
organization QA staff to post CSN
sampler flow audits into AQS. The
National Air Data Group has devised
an input transaction to place the data
in tables designed to "feed" the newly
developed QA transactions. The
audit result for each channel will be
populated for each of the analytes
that are currently being routinely
quantified and reported. This will
also include audit values for the more
automated but less deployed sam-
plers such as the Sunset Carbon, the
Thermo Electron Model 5020 sulfate,
and Magee aethalometers.
We are reviewing the current AQS
tables to make sure there are no in-
correct or misleading combinations of
analyte codes, sampler ID and method
codes. If your agency is using something
other than a URG 3000N, Met One
SASS or SuperSASS, Sunset Carbon, the
Thermo Electron Model 5020 sulfate, or
Magee aethalometer for analytes that
you are posting to coincide with a
PM2.5 CSN analyte code (88xxx), please contact Dennis Crumpler or Robert Coats (see addresses below).
It is extremely important that SLT agen-
cies review metadata that has been en-
tered into AQS for their CSN sites and
samplers. NATTS analyte (metals) values should not be loaded under PM2.5
chemical speciation analyte codes unless
they have been acquired with PM2.5
chemical speciation samplers and analyt-
ical methods, i.e., the values are serving
two roles. In reality, low-volume NATTS metals should derive from PM10 samples, so the proper "method code" will create a delineation that prevents PM2.5 CSN audit data from pairing with NATTS routine data.
Four agencies have just been given the
green light to try to post their CSN
audit data for 2014 using the new in-
put transaction template. If successful
we will prepare a webinar to push the
template out to all the CSN operating
agencies.
A similar template will exist for post-
ing monthly verifications as well.
Note this will not work for audits of
IMPROVE samplers. That may come
in the future. For more information
contact crumpler.dennis@epa.gov or
coats.robert@epa.gov.
Dennis Crumpler
2013 PM2.5 PEP DATA is Available in AQS
Hopefully most agencies have tried to
generate AMP256 reports which pro-
vide the "Performance Evaluation
Program" bias measurements for
201 3. We successfully posted results
for a about 500 sampling events out
of a possible 560. The error report
for the 50+ unpaired results has
some recurrent themes. We think
there were a few human errors in
recording the correct AQS site ID
number on our PEP field data sheets
or entering the data in our data base.
These will be identified and corrected
by our PEP contractors or the EPA
Regional PEP Leads. But a significant
number of failed pairings were due to
the absence of data posted for the
primary sampler. Where the SLT pri-
mary sampler failed during an event and
it was the only sampler at the site dur-
ing that sampling event, or the SLT inval-
idated the data for that sampling day,
the data loss is irrecoverable. If howev-
er, on the sampling day, another collo-
cated FRM or FEM sampler was generat-
ing data that the monitoring agency
could legitimately use for design values,
we have a chance to pair those togeth-
er. The problem at the moment is that
the AMP 256 report algorithm does not
know to look for another value. The
National Air Data Group and the QA
team are investigating a solution, but it
may not be implemented for some time.
In these cases we recommend that the
EPA Regional PEP lead provide the
PEP data that is unmatched to the
respective monitoring agency. The
agency can then use the data for certi-
fication, if it can find a data point from
another sampler that operated on that
date at that site. Remember, the network Data Quality Objective calculation is based on 3 years of data. If the PQAO does not have 15 data points (for PQAOs with 5 or fewer sites) or 24 (for PQAOs with more than 5 sites), they can discuss getting additional PEP data points in the next year with the Regional PEP lead. For more information
contact Crumpler.Dennis@epa.gov.
Dennis Crumpler
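As a quick back-of-the-envelope check of that completeness rule, the sketch below counts how many PEP points a PQAO still needs over a 3-year window. The 5-or-fewer/more-than-5 site split is our reading of the 5- and 8-audits-per-year requirement mentioned in the primary substitute article, and the function names are illustrative only.

def pep_points_needed(num_pm25_sites):
    """Valid PEP audit points needed over 3 years for a PQAO.

    Assumes the usual split: 5 audits/year (15 over 3 years) for PQAOs with
    5 or fewer PM2.5 sites, 8 audits/year (24 over 3 years) for more than 5.
    """
    return 15 if num_pm25_sites <= 5 else 24

def pep_shortfall(num_pm25_sites, valid_points):
    """Additional PEP points to discuss with the Regional PEP lead."""
    return max(0, pep_points_needed(num_pm25_sites) - valid_points)

print(pep_shortfall(4, 13))    # 2 more points needed over the 3-year window
print(pep_shortfall(12, 24))   # 0 - the completeness target is met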

-------
EPA
EPA-OAQPS
C304-02
RTP, NC 27711
The Office of Air Quality Planning and Standards is
dedicated to developing a quality system to ensure that
the Nation's ambient air data is of appropriate quality
for informed decision making. We realize that it is only
through the efforts of our EPA partners and the moni-
toring organizations that this data quality goal will be
met. This newsletter is intended to provide up-to-date
communications on changes or improvements to our
quality system. Please pass a copy of this along to your
peers and e-mail us with any issues you'd like discussed.
E-mail: papp.michael@epa.gov
Mike Papp
Important People and Websites
Since 1998, the OAQPS QA Team has been working with the Office of Radiation and Indoor Air (ORIA) in Montgomery and Las Vegas and with ORD in order to accomplish its QA mission. The following personnel are listed by the major programs they implement. Since all are EPA employees, their e-mail address is: lastname.firstname@epa.gov. The EPA Regions are the primary contacts for the monitoring organizations and should always be informed of QA issues.

Program	Person	Affiliation
STN/IMPROVE Lab Performance Evaluations	Eric Bozwell	ORIA-Montgomery
Tribal Air Monitoring	Emilio Braganza	ORIA-LV
Speciation Trends Network QA Lead	Dennis Crumpler	OAQPS
OAQPS QA Manager	Joe Elkins	OAQPS
Standard Reference Photometer Lead	Scott Moore	ORD-APPCD
National Air Toxics Trend Sites QA Lead	Greg Noah	OAQPS
Criteria Pollutant QA Lead	Mike Papp	OAQPS
NPAP Lead	Mark Shanis	OAQPS
PM2.5 PEP Lead	Dennis Crumpler	OAQPS
Pb PEP Lead	Greg Noah	OAQPS
Ambient Air Protocol Gas Verification Program	Solomon Ricks	OAQPS
STN/IMPROVE Lab PE/TSA/Special Studies	Jewell Smiley	ORIA-Montgomery
STN/IMPROVE Lab PE/TSA/Special Studies	Steve Taylor	ORIA-Montgomery

Websites
Website	URL	Description
EPA Quality Staff	EPA Quality System	Overall EPA QA policy and guidance
AMTIC	http://www.epa.gov/ttn/amtic/	Ambient air monitoring and QA
AMTIC QA Page	http://www.epa.gov/ttn/amtic/quality.html	Direct access to QA programs
-------