The QA EYE

ISSUE 13


AUGUST, 2012

EPA OFFICE OF AIR
QUALITY PLANNING
AND STANDARDS

SPECIAL
POINTS OF
INTEREST:

• National Air Quality Conference Presentations on AMTIC

INSIDE THIS
ISSUE:

National Monitoring Conference	1

Pb Analysis Audits	2

Annual PEs and Bracketing Routine Concentrations	3

NATTS Network Assessment	3

PM2.5 Bias Puzzle	4

When to Report/Not Report QC Data	6

Ambient Air Protocol Gas Verification Program Annual Summary	7

Trace Gas NPAP Progress	8

PM10 Filter Archiving Guidance	9

Progress on New QA Transactions	9

National Air Quality Conference Well Attended

When all was said and done, over 500 people attended the Ambient Air Quality Conference the week of May 14 in Denver, Colorado. The meeting packed in one day of training and two and one half days of plenary and concurrent technical sessions. Forty-five vendors attended the conference, and overall the attendees seemed happy with the venue and the sessions. Those attending received a follow-up email requesting additional feedback on the conference. We received 114 responses to the survey, and the information will help us plan for future meetings.

QA Activities at the Conference

On Monday, a half-day training course was provided on the 40 CFR Part 58 Appendix A QA regulations. The session was co-chaired by Donovan Rafferty, WA Dept of Ecology, and Mike Papp, OAQPS. The session had trainers representing the States (Donovan Rafferty), Tribes (Melinda Ronca-Battista, Tribal Air Monitoring Support Center), the EPA Regions (Mathew Plate, R9, and Greg Noah, R4), and OAQPS (Dennis Crumpler, Mark Shanis, and Mike Papp). There was a lot of material covered in the 4 hours, and although 80 people signed up for the session, the 150-seat room was standing room only by the end of the day. Training slides are posted at the conference site on AMTIC. Survey responses on the training session were positive.

On Wednesday morning there was a breakout session on monitoring regulation changes. The session was more of a "let it all hang out" session where ideas were brought to the table to spur conversations and get initial impressions. Mike Papp, OAQPS, talked about potential changes to the 40 CFR Part 58 Appendix A regulations. Prior to the conference, Mike asked for feedback from the monitoring community, via the QA Strategy Workgroup, on areas they thought needed changing in Appendix A. Mike is maintaining a table of these changes and is accepting any additional comments.

Wednesday also included a half-day QA technical session co-chaired by Yousaf Hameed, Clark County, and Dennis Crumpler and Mark Shanis, OAQPS. There were many interesting papers at the session, which was well attended. Presentations are posted at the conference site on AMTIC: http://www.epa.gov/ttn/amtic/naamc.html

Much of this newsletter will be devoted to issues or discussions that came up during the conference.



Pb Analysis Audits: Questions about Concentration Ranges and Reporting

Many people refer to the Pb analysis audits as the Pb audit strips. However, with the advent of Pb-PM10, analysis audits need to be available for Teflon filters with analysis either by XRF or by approved FEM methods. The following information is provided to help explain the concentration ranges that are considered acceptable.



Current Regulation

Level   Pb Conc (ug/strip)   Ambient Air Conc (ug/m3)   Conc Percentage of NAAQS
1       9-30                 0.045-0.15                 30-100%
2       60-90                0.30-0.45                  200-300%

Pb Strips

These strips can be developed by EPA (ordered on an annual basis), by a third party, or by the analytical laboratory performing the analysis. Standard operating procedures for the development of the strips are on AMTIC: http://www.epa.gov/ttn/amtic/pb-monitoring.html. QA EYE Newsletters 8 and 9 have additional information about these strips. The current requirement in Appendix A for the concentrations for the Pb analysis audits is 30-100% of the NAAQS for level one and 200-300% of the NAAQS for level two. The reporting units for the data are ug/strip (AQS units code 077) and are not converted to concentration (ug/m3).

The equivalent ambient Pb concentration in ug/m3 is based on sampling at a 1.7 m3/min flow rate for 24 hours on a 20.3 cm x 25.4 cm (8 x 10 inch) glass fiber filter where one twelfth of the filter (a 3/4 inch strip) is used. EPA has received comments that monitoring organizations are using different filter strip sizes and different flow rates in the samplers, so when they calculate the concentrations they need for the filter strip, the result is outside the limits provided in the "Pb Conc" column of the table above, and AQS identifies the concentration as not meeting the acceptance criteria. In some cases AQS did not allow the data to be reported.

Since the acceptable flow rates for the TSP monitors are 1.1 to 1.7 m3/min and the Pb strips are made at 3/4 to 1 inch widths, the calculations shown in the tables below were performed. The data illustrate that, depending on what strip size and flow rate are used, the concentration limits can extend from 6-40 ug/strip for audit level 1 and 40-123 ug/strip for audit level 2. Although EPA would prefer that monitoring organizations develop the Pb analysis audits within the 9-30 and 60-90 ug/strip ranges, in 2012 AQS will accept values at 4-40 ug/strip for level 1 and 45-170 ug/strip for level 2. In 2013 we will be revising the ranges to 4-40 ug/strip for level 1 and 45-125 ug/strip for level 2.
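To make the arithmetic concrete, here is a minimal sketch of the strip-mass calculation described above (our illustration, not an EPA tool; the function name and defaults are ours):

```python
# Sketch of the ug/strip calculation described above (illustrative only).
# An audit strip should carry the Pb mass that a 24-hour TSP sample at a
# given flow rate would deposit on that fraction of an 8 x 10 inch filter.

def pb_ug_per_strip(conc_ug_m3, flow_m3_min, strip_width_in=0.75,
                    usable_width_in=9.0):
    """Pb mass (ug) on one audit strip for a 24-hour TSP sample.

    conc_ug_m3      target ambient concentration (0.045 = 30% of the NAAQS)
    flow_m3_min     sampler flow rate (1.1 to 1.7 m3/min for TSP)
    strip_width_in  audit strip width (3/4 or 1 inch)
    usable_width_in exposed filter dimension; 9 inches makes a 3/4 inch
                    strip one twelfth of the filter, as stated above
    """
    total_ug_on_filter = conc_ug_m3 * flow_m3_min * 24 * 60  # 24-hour sample
    return total_ug_on_filter * (strip_width_in / usable_width_in)

# Reproduces the sidebar table values:
print(pb_ug_per_strip(0.045, 1.7))       # 9.18  (level 1, 30%, 3/4" strip)
print(pb_ug_per_strip(0.045, 1.1))       # 5.94  (level 1, 30%, 3/4" strip)
print(pb_ug_per_strip(0.45, 1.7, 1.0))   # 122.4 (level 2, 300%, 1" strip)
```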

Pb-PM10 (Teflon)

Since the low-volume PM10 sampler flow rate and filter size are fixed, the acceptable ranges do not vary. The table below provides the appropriate ranges for the Teflon filters in units of ug/filter: 1.0-4.0 ug for level 1 and 7.0-11.0 ug for level 2. The ranges are calculated by multiplying the target concentration (ug/m3) by the 24-hour sample volume, which is 24 m3. Since some XRF analysis is reported in ug/cm2, a conversion using the area of the filter of 11.86 cm2 (as defined in 40 CFR part 50 App Q sect 2.1) is used. Since EPA has had some difficulty preparing/acquiring audits at the appropriate ranges, AQS is currently accepting 1.0-4.0 ug for level 1 and 5.0-11.0 ug for level 2.
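A similar sketch for the Teflon filters (again illustrative, not an EPA tool; the constants are the 24 m3 daily sample volume and the 11.86 cm2 filter area cited above):

```python
# Illustrative check of the Pb-PM10 audit ranges: ug/filter is the target
# concentration times the 24 m3 daily sample volume, and an XRF result
# reported as areal density (ug/cm2) converts to a whole-filter mass
# using the 11.86 cm2 area from 40 CFR part 50 App Q sect 2.1.

SAMPLE_VOLUME_M3 = 24.0   # 24-hour low-volume PM10 sample
FILTER_AREA_CM2 = 11.86

def ug_per_filter(conc_ug_m3):
    return conc_ug_m3 * SAMPLE_VOLUME_M3

def xrf_to_ug_per_filter(areal_ug_cm2):
    return areal_ug_cm2 * FILTER_AREA_CM2

print(ug_per_filter(0.045))         # 1.08 ug (level 1 at 30% of the NAAQS)
print(ug_per_filter(0.45))          # 10.8 ug (level 2 at 300% of the NAAQS)
print(xrf_to_ug_per_filter(0.30))   # 3.56 ug from a 0.30 ug/cm2 XRF result
```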

Reporting the Pb Analysis Audit Data- Replicate Analysis Results

Since some monitoring organizations are using contractors to analyze their filters, there have been some questions concerning what Pb analysis audit data to report. At a minimum, 6 analysis audit values (three low concentration and three high concentration) should be reported each quarter. Some contractors may run the audits within the quarter on different days, and some may run the audits all on the same day. It is preferred that they be run on different days, but it is not a requirement. However, the laboratories may also be providing the monitoring organizations with replicate analyses. For TSP strips this means that they are analyzing the extract a number of times and providing the results of the multiple analyses to the monitoring organization. In the case of XRF analysis, the laboratory may be running XRF on different portions of the same filter and providing the results of this analysis to the monitoring organizations. For purposes of AQS reporting, EPA requires only that the mean of the replicates be reported. Some laboratories may be running the XRF audit filters more often than required. These additional results provide more information about the quality of the laboratory's results and can be reported to AQS.

NAAQS = 0.15 ug/m3

Sampler Flow 1.7 m3/min
                          Level 1            Level 2
                          30%      100%      200%     300%
ug/m3                     0.045    0.15      0.3      0.45
ug/strip (3/4" strips)    9.18     30.6      61.2     91.8
ug/strip (1" strips)      12.24    40.8      81.6     122.4

Sampler Flow 1.1 m3/min
                          Level 1            Level 2
                          30%      100%      200%     300%
ug/m3                     0.045    0.15      0.3      0.45
ug/strip (3/4" strips)    5.94     19.8      39.6     59.4
ug/strip (1" strips)      7.92     26.4      52.8     79.2

Pb-TSP by extraction (ranges for AMP255)
                          Level 1            Level 2
                          Min      Max       Min      Max
ug/strip                  5.9      40        45       125

PM10 Teflon by extraction
                          Level 1            Level 2
                          30%      100%      200%     300%
ug/m3                     0.045    0.15      0.3      0.45
ug/filter                 1.08     3.60      7.20     10.80

PM10 Teflon by XRF
                          Level 1            Level 2
                          30%      100%      200%     300%
ug/m3                     0.045    0.15      0.3      0.45
ug/filter                 1.08     3.60      7.20     10.80
ug/cm2                    12.81    42.70     85.41    128.11


Annual PEs and Bracketing 80% of Ambient Air Concentrations... How's It Calculated?


For the annual performance evaluation requirements for the gaseous pollutants, EPA has received a fair number of questions on how to handle the suggestion that: "The audit levels selected should represent or bracket 80 percent of ambient concentrations measured by the analyzer being evaluated."

The intent of this suggestion is to implement audits at concentrations normally measured by the routine monitor so the assessment represents an estimate of uncertainty around routine concentration levels. Monitoring organizations trying to perform these assessments have asked EPA for the best method to represent or bracket their data. EPA did not get specific on this criterion in order to provide the monitoring organizations some flexibility in this approach. However, given the popularity of this question, some suggestions follow.

One could perform the 80% bracketing on a site level or on an aggregation of all sites within the monitoring organization.

All sites combined:

1)	Take 3 years of hourly data from all sites, find the 80th percentile, and use that as a starting point to find appropriate audit levels. Each year, add the new year of data to create a rolling 3-year window. You could then use the same audit concentration levels for each site in the network.

2)	Take the most current valid year of hourly data from all sites, find the 80th percentile, and use that as a starting point to find appropriate audit levels. Each year, use the most current year to provide an estimate. You could then use the same audit concentration levels for each site in the network.

Single sites:

Perform the same estimate as in #1 or #2 using individual sites for the development of site-specific audit levels. A short sketch of the percentile calculation follows below.
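A minimal sketch of the percentile calculation (assuming hourly data are already in a pandas DataFrame; the column names are illustrative):

```python
# Minimal sketch of the 80% bracketing calculation (illustrative; assumes
# hourly data in a pandas DataFrame with "site_id" and "conc" columns).
import pandas as pd

def network_80th_percentile(hourly: pd.DataFrame) -> float:
    """80th percentile of all hourly concentrations network-wide
    (approaches #1 and #2): a starting point for choosing audit levels."""
    return hourly["conc"].quantile(0.80)

def site_80th_percentiles(hourly: pd.DataFrame) -> pd.Series:
    """Site-level version, for developing site-specific audit levels."""
    return hourly.groupby("site_id")["conc"].quantile(0.80)
```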


Protecting the NAAQS

A comment that we've heard from monitoring organizations is that they like to audit in a manner that ensures the monitor is accurate at the NAAQS level. If this is the case, there are two acceptable solutions: 1) audit a fourth point around the level of the NAAQS, or 2) use one of the three required audit points for this audit level. Since the audit levels do not need to be consecutive, either approach is acceptable.

Any of the techniques discussed above will provide audit levels that more closely match the routine concentrations at the site. There are probably other approaches that would be deemed acceptable by EPA, so check with your Region. Your QAPP should define how your organization will implement the requirement.


National Toxics Trends Site Network Assessment Presented at National Air Monitoring Conference

The NATTS Network collects ambient air monitoring data on air toxics as part of the Urban Air Toxics Strategy, which addresses air toxics in urban areas. Air toxics include hazardous air pollutants, or HAPs, which are pollutants that are known or suspected to cause cancer or other serious health effects, such as reproductive effects or birth defects, or adverse environmental effects. Data generated by each NATTS site are quality assured and submitted to the national Air Quality System (AQS), EPA's repository of ambient air data. These quality-assured data can then be used for purposes such as:

—	Identifying trends in ambient air toxics concentrations to facilitate tracking progress toward emission and risk reduction goals.

—	Directly evaluating public exposure and environmental impacts in the vicinity of monitoring sites.

—	Assessing the effects of specific emission reduction activities both locally and nationally.

—	Providing quality assured air toxics data for risk characterization.

—	Evaluating and subsequently improving air toxics emission inventories and model performance.

—	Identifying additional monitoring needs (e.g., new sites or additional methods).

EPA conducted its NATTS Network Assessment as part of the Air Toxics Component of its overall National Monitoring Strategy, which requires that the NATTS Network be evaluated and modified every 6 years. EPA's assessment can be divided into two portions: quantitative and qualitative. The quantitative portion examined the pollutant datasets collected by the monitoring stations and evaluated the quality of those datasets in terms of suitability for assessing trends. EPA used the suitable datasets to identify trends of air toxics concentrations over the 6-year period 2005-2010, as well as to identify national trends of air toxics at individual sites. The qualitative portion examined issues such as whether the network design is appropriate to achieve the network's goals and objectives and whether changes to the sites, pollutants, or means of measurement are needed to refine the network. OAQPS staff presented the summary of the NATTS Network Assessment at the National Air Monitoring Conference in May 2012. A copy of the draft NATTS Network Assessment is available upon request. Please contact David Shelow at shelow.david@epa.gov or Dennis Mikel at mikel.dennisk@epa.gov to request a copy.



PM2.5 Bias Estimates Continue to Puzzle


At the 2009 Ambient Air Conference in Nashville, Dennis Crumpler reported seeing a downward trend in the bias estimates based on the PM2.5 Performance Evaluation Program (PEP). In developing the 2008-2010 PM2.5 QA Report, EPA was continuing to see this trend and is trying to evaluate the data in a number of ways to determine a cause. EPA, working with Sonoma Technology Inc. (STI), started evaluating 12 years of the PM2.5 data by asking:

—	What are current levels of bias?

—	Has bias been changing over time?

—	When did bias start trending down?

—	Does bias vary by type of separator, WINS versus Very Sharp Cut Cyclone (VSCC)?

—	Does bias vary by season?

—	Does bias vary by PM2.5 concentration?

—	Does bias vary by region of the country?

Current Levels of Bias

Table 1 presents the 2008-2010 bias estimates of the major methods used in the PM2.5 network. Although we have some continuous federal equivalent methods in the network, we do not have enough PEP data to make any definitive statements of bias. Table 1 illustrates that all bias estimates are negative, meaning concentrations from samplers operated by primary quality assurance organizations (PQAOs) are lower than concentrations from PEP samplers, on average.
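For reference, the sign convention behind these estimates: for each collocated routine/PEP sample pair, a percent difference is computed, and a negative value means the routine sampler read lower than the PEP audit sampler. A minimal sketch follows (our illustration; the full Appendix A statistic also constructs the 90% confidence intervals shown in Table 1):

```python
# Sign convention behind Table 1 (illustrative; not the complete
# Appendix A statistic, which also builds confidence intervals).

def percent_differences(pairs):
    """pairs: iterable of (routine_conc, pep_conc) tuples in ug/m3."""
    return [100.0 * (routine - pep) / pep for routine, pep in pairs]

def mean_bias(pairs):
    d = percent_differences(pairs)
    return sum(d) / len(d)

# Negative result: routine samplers reading lower than the PEP samplers.
print(round(mean_bias([(13.1, 14.8), (7.2, 8.0)]), 1))   # -10.7
```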

Has bias been changing over time?

Yes, as illustrated in Figure 1. In the 2002-2004 and 2005-2007 time periods the bias was fairly close, and between methods there was no consistency as to which three-year period was higher or lower. However, for the 2008-2010 time period, all methods are reading more negative than in the other two 3-year periods.

When did bias start trending down?

It appears that the downward trend in bias, as illustrated in Figure 2, started somewhere around 2007.

Table 1. 2008-2010 bias estimates by method

Method    Maker      Single/      WINS/    Bias (%)   90%
Number               Sequential   VSCC                Confidence
116       BGI        Single       WINS     -7.8%      +/-3%
117       R&P        Single       WINS     -12.4%     +/-3%
118       R&P        Sequential   WINS     -11.8%     +/-1%
119       Andersen   Single       WINS     -10.7%     +/-7%
120       Andersen   Sequential   WINS     -8.0%      +/-1%
142       BGI        Single       VSCC     -2.0%      +/-3%
143       R&P        Single       VSCC     -6.0%      +/-4%
145       R&P        Sequential   VSCC     -5.9%      +/-1%
153       Thermo     Single       VSCC     -6.1%      +/-2%
155       Thermo     Sequential   VSCC     -3.7%      +/-7%

At this point EPA is unsure what might have been the cause of this trend. Only methods 118, 120, and 145 are shown in Figure 2 because these three methods are used most often by the monitoring organizations. One can see that in 2011 the downward trend may have stopped, but the data used in the assessment are preliminary and data validation is not complete.

Fig 2. Bias trend for methods 118, 120, and 145

Does bias vary by type of separator, WINS versus Very Sharp Cut Cyclone (VSCC)?

Yes. In general it appears, where enough information is present for both method designations (WINS and VSCC), that the WINS has a more negative bias than the VSCC when operated by the monitoring organizations. Figure 3 illustrates this for the R&P sequential, and Table 2 provides the information for the other major method designations.

Fig 3. R&P bias for WINS (118) and VSCC (145), 2001-2011

STI also ran a similar assessment of the PEP monitors, since the PEP also uses a combination of WINS and VSCC. However, the analysis did not find a significant difference between the two separators for the PEP samplers (the majority of the PEP uses the BGI audit sampler).

Table 2. Difference in separator by major method sampler type for SLT samplers

Sampler Type          Method       Difference in WINS and   Statistical Test of WINS
                      Numbers      VSCC Median Biases       Bias to VSCC Bias for SLT
BGI Single                                                  WINS bias < VSCC bias
R&P Single            117 vs 143   -9.2%                    WINS bias < VSCC bias
R&P Sequential                                              WINS bias < VSCC bias
Andersen Single                                             Not significantly different.
Andersen Sequential                                         Not significantly different.
                                                            Too few observations for 155.


PM2.5 Bias Estimates (continued)

Does bias vary by season?

Maybe. The trend is still downward for all seasons, but in some cases it appears that the summer produces a more negative bias than other seasons. Figures 4 and 5 represent the annual mean concentrations of the data used in the bias calculations.

Fig. 4 Mean concentrations for winter, ug/m3, of data used in bias calculations

Does bias vary by PM2.5 concentration?

Concentrations above 12 ug/m3 did show a less negative bias, as the table below illustrates.

PEP Conc Median (ug/m3)    Percent Bias
0-3                        -10.2
3-6                        -12.6
6-9                        -10.7
9-12                       -11.5
>12                        -6.4

Does bias vary by region of the country?

Not in any pattern. Bias estimates were generally trending down across the nation without a clear regional pattern.



So What's the Answer?

We are not sure. We can't answer the question about why we see a trend, but we do know that there appears to be more bias, in general, in the summer, and the WINS appears to produce more negative results compared to the VSCC when the monitoring organization operates instruments with both types of separators. Some things come to mind that were also brought up at the conference.

1)	Filter removal - The PEP program generally removes the filter the next morning, while the routine sample can remain in the instrument for a maximum of 177 hours. The longer retrieval time allows for greater volatilization, which is suggested by greater differences in summer months. However, it is not clear how this protocol could cause the downward trend in bias.

2)	WINS cleaning - Since the VSCC requires less maintenance, there is a possibility that monitoring organizations are following the VSCC cleaning schedule, which might mean the WINS are not cleaned as frequently (every 5 events). Some suggest that a dirty WINS might have an effect.

3)	Cleaning of down tubes - Some have recently suggested that there may be an accumulation of particles on the down tube that results in a loss of particles getting to the filters over time.

4)	Aging of instruments and cassettes - A commenter at the conference wondered whether we are seeing the aging of the instruments. He also commented that there is a possibility we are getting leaks around the cassettes, which may cause a change in concentration (since volume is a big part of the estimate).

Next Steps

There are a number of questions we may try to answer. They include:

—	Do ambient temperatures play a role? As temperature increases, does bias become more negative?

—	Do changes in speciation of PM2.5 play a role? As PM2.5 concentrations come down, is the volatile fraction of PM2.5 increasing?

—	Do filters retrieved within 10 hours of the end of sampling have smaller bias than those retrieved after filters experience the heat of the day?

—	What is the impact on bias of the length of time between the last WINS cleaning and sample collection? Do longer times mean lower concentrations (compared to the PEP)?

Answers to some of these questions may require more in-depth studies of data from monitoring organizations, since some of this information is not reported to the Air Quality System (AQS).



When Routine Data is Invalid Some QC Data Should Go

The intent of the QC data that are reported to AQS is to provide an estimate of precision and bias of the routine data collected during a particular time period. For example, the 1-point QC check is performed minimally every two weeks for the gaseous pollutants, and so the data from the check demonstrate that the monitor was within acceptance specifications for that time period. Upon failure of the QC checks and subsequent invalidation of the data (should that occur), it is expected that null value codes would replace the routine data and the QC check would not be reported to AQS. Since the routine data would not be available, it would not be appropriate to provide a QC value that would be used in an overall estimate of precision and bias for that site. The estimate of precision and bias for that site should represent the valid routine data being reported for the site.

It is suggested that only those QC checks that are performed on each monitor/sampler are subject to removal, and only for the checks within the same time period in which the routine data were invalidated. As an example, if the annual PE for ozone was performed in April 2012 and the ozone data for December 2012 were invalidated, the April 2012 PE could remain in AQS and only the 1-point QC checks for December would be removed. Not all App A checks fit nicely into the paradigm. For example:

Collocated data - Since these data represent a PQAO and not an individual site, it becomes more of a dilemma. However, if routine data from a collocated site were invalidated due to a finding based on imprecision of the collocated data, then one would not want to have these data represent the other sites in the PQAO.

NPAP and PEP data - Similar to the collocated data, these data represent the PQAO and are not often used to invalidate data. However, there are cases where NPAP data have been used to invalidate routine data, and in that case it would not be appropriate to report the NPAP results to AQS.

Other concerns might arise in connection with the Annual PEs, or audits, mentioned above. Consistent with many agencies' Quality Assurance Project Plans (QAPPs), data will not be invalidated on the basis of an audit alone. Many agencies will verify, such as by independent tests, the results of a "failed" audit. It might not be practical in all cases to verify an audit result, immediately recalibrate the "failed" channel, and schedule a second audit following the recalibration. Accordingly, excluding the audit result that discovered a problem in the first place could cause the responsible agency either to incur additional audit costs or, alternatively, be "penalized" for appearing to fail to meet the required number of audits. Many agencies would be concerned about having a less than complete audit count appear in the AMP255 at the time of annual data certification.

As suggested above, monitoring agencies should keep in mind the objective of reporting the results of QA and QC checks to AQS: the results of the reported QA/QC checks should represent the precision and bias of the reported raw data. The analysts who report these data should be mindful that precision and bias calculations can apply at the monitor level or at the PQAO level. Often, a result that falls outside criteria indicates an out-of-control situation that is subsequently corrected, such as by invalidating data and recalibrating. Under other circumstances, after-the-fact review of QC checks with poor, but "passing," results might reveal a trend consistent with a problem that was only discovered by some other means.

Because of concerns such as these, it is important to consider these recommendations in the context of corrective action. It is recommended that QAPPs include wording that addresses when to retain and when to exclude QA and QC data from AQS and when to conduct replacement QA/QC checks. However, it is impossible to foresee every circumstance that might lead to a poor QA/QC result, and in some cases it might not be obvious whether to report or exclude a result. In these cases decisions may fall to the responsible QA officers or managers. Discussions between the EPA Region and monitoring organizations might also need to occur to determine the best course of action.


Ambient Air Protocol Gas Verification Program - 2nd Report Published

A second full year of implementation of the Ambient Air Protocol Gas Verification Program (AA-PGVP) wrapped up in December 2011. EPA provided the specialty gas producers an opportunity to review the last quarter of verification data, take any corrective action needed, and review the report prior to publication; the report was posted on AMTIC in April 2012.

Figure: PQAO/RO specialty gas producer use (82 responses, 104 selections). Producers named: AirGas, Air Liquide, American Gas Group, IWS Gas and Supply, Linde, Liquid Technology, Matheson Tri-Gas, Praxair, Red Ball, and Scott Marrin.

In order to determine which specialty gas producers were being used by monitoring organizations, EPA asked each monitoring organization to complete a web-based survey. For the 2011 AA-PGVP, EPA received surveys from 82 of a possible 122 monitoring organizations, which is about a 67% response rate. This was lower than the response received in 2010, which was around 75%. The figure above illustrates producer use based upon the responses received.

Of the 82 respondents, 33 either did not want to participate or were not receiving a cylinder during the year. This narrowed the participants down to 49. Of the possible participants, 15 monitoring organizations sent cylinders to EPA. EPA did not have a monitoring organization volunteer submit a cylinder from Linde, IWS, Red Ball, or Liquid Technology, so EPA invited those producers to send a cylinder directly to EPA. In addition, although the monitoring organization surveys did not list Global, Coastal, or ILMO as producers currently being used, those producers inquired about the program and submitted cylinders for verification. Some of these cylinders contained multiple pollutants, so although 37 cylinders were sent to the RAVLs, 65 verifications were performed.
The Results

As required in 40 CFR Part 75 Appendix A, EPA Protocol Gases must have a certified uncertainty (95 percent confidence interval) that must not be greater than plus or minus (±) 2.0 percent of the certified concentration (tag value) of the gas mixture. This acceptance criterion is for the Acid Rain Program. The AA-PGVP adopted the criterion as its data quality objective and developed a quality system to allow the RAVLs to determine whether or not an individual protocol gas standard concentration was within ±2% of the certified value. The Ambient Air Program has never identified an acceptance criterion for the protocol gases. Since the AA-PGVP has not been established to provide a statistically rigorous assessment of any specialty gas producer, the RAVLs report all valid results as analyzed, but it is suggested that any difference greater than 4-5% is cause for concern.

In general, the AA-PGVP 2011 verifications have been successful. The quality system, standard operating procedures, analytical equipment, and standards maintained the data quality of the program. Results show that of the 65 verifications, 64 were within the ±4-5% AA-PGVP criteria, and 58 (89%) were within the ±2% Acid Rain Program criterion.
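The verification comparison itself is simple arithmetic; the sketch below (ours, with illustrative names) applies the percent-difference check against the two criteria just discussed:

```python
# Illustrative sketch of the AA-PGVP comparison: percent difference of the
# RAVL-assayed concentration from the producer's certified tag value.

def verify_cylinder(assayed, tag_value):
    diff = 100.0 * (assayed - tag_value) / tag_value
    return {
        "percent_diff": diff,
        "within_2pct_acid_rain": abs(diff) <= 2.0,   # Part 75 criterion
        "cause_for_concern": abs(diff) > 4.0,        # text suggests >4-5%
    }

print(verify_cylinder(assayed=0.492, tag_value=0.500))
# percent_diff ~ -1.6: within the 2% criterion, no cause for concern
```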

Survey Improvement

In 2010, EPA had difficulties with monitoring organizations naming production facilities. Sometimes names were misspelled or locations misrepresented. For example, a number of distribution facilities were identified that were not actually producing standards. In 2011, EPA implemented a web-based survey that allowed monitoring organizations to select (based on final 2010 data) the producers they were purchasing standards from. If the survey list did not have a producer, the monitoring organization could supply a new name and location. The contractor who maintains the survey would provide the new producer information to EPA, and if it was determined to be a legitimate producer, the contractor would update the software so that the new producer would be included on the selection list. The new system has reduced entry errors considerably.

Program Issues - Participation. EPA Needs Your Help!

Since the program is voluntary, EPA cannot force participation. Due to budget/resource issues, many monitoring organizations are more resource constrained, and since the AA-PGVP is optional, it is treated as a lower priority. Since the only added expense to monitoring organizations is shipping cylinders to the RAVL, in 2011 EPA started helping monitoring organizations pay for the shipping cost. The first 2 quarters of 2012 show very light monitoring organization participation, which may force EPA to invite the specialty gas producers to send cylinders directly from their facilities; this defeats the objective of a blind verification. Twenty-five percent of last year's cylinders came directly from producers.

We are grateful to the following organizations that participated in last year's survey, and we hope that more will consider participating in 2012.

-	Arizona DEQ

-	City of Philadelphia

-	Mecklenburg County, NC

-	Maricopa County Air Quality Dept

-	Minnesota Pollution Control Agency

-	New Jersey DEP

-	North Carolina DNR

-	Ohio EPA (Portsmouth)

-	Southern Ute Indian Tribe

-	State of Delaware

-	State of Florida

-	Texas Commission on Environmental Quality

-	University of Iowa State Hygienic Lab

-	Virginia DEQ

-	West Virginia DEP




Trace Gas NPAP Issues and Notes from the National Meeting

Mark Shanis has been working on a number of issues associated with NPAP activities at NCore sites. The following are some brief updates and helpful hints from his presentations at the National Ambient Air Meeting in Denver.

Zero Air Generator (ZAG) Agreement Issues

The API 701H seems to do a better job than the API 701, but there is a need to control heat from the convertor, especially if you use a case-based audit system or monitor with tightly packed instrument racks. If you are auditing or monitoring trace level CO and have a hydrocarbon (HC) convertor in your ZAG, especially if it is a 701 or 701H, you may need to have a CO convertor after the HC convertor. The HC convertor will convert some, but not all, HC that goes through it into CO. If you have to add the CO convertor that API offers, it may mean more heat output from your ZAG. You may wish to check your ZAG against a good Ultra Pure Air cylinder, as the NPAP program does, to be able to independently check the performance of your ZAG.

Trace Level (TL) Calibrators

Generate the TL lower audit levels either with:

1.	a lower ratio of CO/NO/SO2 in your blended audit gas generation cylinder; or

2.	adding a third, lower flow rate pollutant mass flow controller (MFC), such as a 20 or 10 cc/min MFC, to your 100 cc/min MFC; or by increasing your dilution MFC from 10 lpm to 20 lpm, or from 20 lpm to 30 lpm (but you can only do that if you have a ZAG that is designed to and can safely go that high).

Some combination of 1 and 2 may also work; a sketch of the dilution arithmetic follows below. The NPAP audit trailer at RTP has a 20 cc/min, 100 cc/min, and 30 lpm MFC combination. We use approximately 675 ppm CO, 60 ppm NO, and 30 ppm SO2, and we have a second blend for lower ppm levels of CO/NO/SO2.

Region 2 has a 10 cc/min, 100 cc/min, and 20 lpm MFC combination, and has just proposed using a 450 ppm CO, 30 ppm NO, and 15 ppm SO2 blend for generating audit levels.
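The audit concentration these combinations produce is just a flow-weighted dilution; a minimal sketch (our illustration) using the RTP trailer numbers above:

```python
# Sketch of the dilution arithmetic behind these MFC combinations
# (illustrative; flows and blend concentrations are from the text above).

def diluted_ppm(cyl_ppm, pollutant_flow_ccm, dilution_flow_lpm):
    """Output concentration when a pollutant MFC (cc/min) feeds into a
    dilution MFC (lpm) of zero air."""
    pollutant_lpm = pollutant_flow_ccm / 1000.0
    return cyl_ppm * pollutant_lpm / (pollutant_lpm + dilution_flow_lpm)

# 675 ppm CO through the 100 cc/min MFC into 30 lpm of zero air:
print(round(diluted_ppm(675, 100, 30), 2))   # 2.24 ppm CO
# The 20 cc/min MFC yields a lower audit level from the same blend:
print(round(diluted_ppm(675, 20, 30), 2))    # 0.45 ppm CO
```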

TL calibrators' ozone generators have been observed in the past 2 years to have problems generating the lowest 1 or 2 ozone levels for GPT for NOy (or TL NO2), due to software and/or hardware design features of the generator and/or to the ability of the auditor (or station manager) to keep a stable temperature in the correct range, especially in the summer. However, newer or serviced models of at least the API and Environics calibrators' ozone generators have been shown to provide 1-10 ppb O3 with stability sufficient for an audit of NOy or GPT.

CO analyzer issues

Zero and 3-4 hour drift characteristics may not be as good as specified by the manufacturer. One factor causing the observed problem has been temperature variability due to limitations in AC control of the station or the audit vehicle/container (trailer, truck, or case-based systems). SO2 and NO analyzers seem less affected by temperature effects. Temperature control is a more important investment for trace level monitoring and auditing. A zero of +/-15 ppb for a 0-5 ppm full scale TL CO analyzer is probably as good as you may get.

NOy analyzer issues

The audit gas will have to be delivered to the sampling station's NOy convertor inlet, on top of a 10 meter tower. However, if you carry an audit NOy analyzer, the convertor does not have to be on a 10 meter tower for a representative audit. However, some NOy analyzer convertors' components may have a wiring problem, resulting in a convertor temperature lower than is required to do the NOy-to-NO conversion correctly, so be sure to see the API notice about this issue.

Since convertor efficiency is an issue for a device based on conversion of non-NO species to NO, and since the NOy technology is still relatively new, the convertor efficiency (CE) must be checked periodically, and audited. To do this, NO2 is not really sufficient; NPN or IPN is available and must be used. NO2 only tests conversion efficiency for NO2, not NOy. Most of what will be measured will be NO2, not NOy, but if you don't check, how will you know which is present and in what proportions?

Regular range NO2 audits of NO-NOx analyzers have been shown to be accurate and reliable when based on measurement of the diluted CO from a cylinder of a gas blend of CO, SO2, and NO, followed by reaction of the diluted blend with ozone from the diluting calibration device. That is, we measure the diluted CO on our audit CO analyzer, which we calibrate at the site just before the audit, and assume the concentration of the diluted NO based on the blend ratio (CO & NO). We use the audit station's NO-NOx response to test and calculate the sampling station's NO-NOx convertor efficiency for NO2. We call this an NO2 audit based on CO & GPT. The SOP for this procedure is in the NPAP TTP Draft Operators field SOP (7/28/2011) on AMTIC: http://www.epa.gov/ttn/amtic/npapsop.html
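As a sketch of the CE arithmetic (one common formulation, shown for illustration; the NPAP SOP linked above defines the actual procedure):

```python
# One common formulation of the GPT converter-efficiency calculation
# (illustrative). Adding ozone titrates NO to NO2: the drop on the NO
# channel is the NO2 created, and a perfect convertor returns all of it
# on the NOx/NOy channel.

def converter_efficiency(no_before, no_after, nox_after):
    """All readings in ppb from the analyzer's NO and NOx/NOy channels."""
    no2_created = no_before - no_after     # lost from the NO channel
    no2_recovered = nox_after - no_after   # returned via the convertor
    return 100.0 * no2_recovered / no2_created

print(converter_efficiency(no_before=450, no_after=250, nox_after=446))
# 98.0 -> the convertor recovers nearly all of the titrated NO2
```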

Reliance on this procedure for trace level NOy analyzers depends on the performance of the CO analyzer and on control of temperature and pressure. While we see that this SOP variation works, questions have been raised about challenging the lowest audit levels.

At RTP we have developed an alternate NOy calibration and audit method, based on joint calibration of the CO & NOy analyzers with a set of NPN or IPN spans and, if needed, reliance on the NPN-calibrated NOy analyzer for an audit. Initial tests appear promising. Confirmation tests are underway at RTP. If successful, the new and/or older method will be tried out in Region 4. Due to the added time for doing NOy audits by GPT, an alternate shortcut method has been proposed and will be tested around the Regions and discussed for future use.




PM10 Filters Serving Dual Purposes and a Reprieve for Filter Archiving

We have received questions on the use of low volume PM10 filters that can provide multiple measurements of PM10, specifically the PM10 half of the PM10-2.5 measurement, with the filter subsequently used in the analysis of PM10-Pb.

Using the same filters for both PM10 and Pb analysis reduces the number of samplers required at the monitoring sites and creates other efficiencies. EPA has encouraged this practice with guidance to perform PM10 mass measurements prior to performing Pb analysis on the same filter (QA EYE Issue 12). Now that there is an approved FEM Pb ICP-MS technique which will destroy the PM10 Teflon filter during sample extraction (the XRF FRM technique is non-destructive, and therefore the filter can still be archived), monitoring organizations have asked whether using the PM10 filter for multiple uses with a destructive ICP-MS Pb FEM method is in conflict with the following 40 CFR Part 58.16 requirement:

"The State, or where applicable, local agency shall archive all PM2.5, PM10, and PM10-2.5 filters from manual low-volume samplers (samplers having flow rates less than 200 liters/minute) from all SLAMS sites for a minimum period of 1 year after collection."

The requirement goes on to state that the archived filters would be made available to EPA or other federal agencies, during the 1-year archive period, for supplemental analysis. Therefore, the archive requirement is to ensure that the filters are available, in a viable condition, for beneficial supplemental uses. Since the approved Pb FEM technique is available for use and EPA has encouraged multiple use of filters in order to reduce capital and resource costs, the agency feels this is an appropriate use for the filters and therefore a legitimate reason for not archiving filters that fall into this category of use.

EPA Making Progress on New QA Transactions



The Ambient Air Monitoring Group and the National Air Data Group (the keepers of the AQS system) have formed a workgroup, with a number of EPA Regional and monitoring organization volunteers, to review the reporting requirements for the required as well as non-required QA data that are reported to AQS. For many years we have tried to "fit" all our QA data into a precision (RP) or accuracy (RA) transaction that, although functional, was not always a great fit. With the possibility of building more automated assessments and the need for additional QA data reported to AQS, this workgroup has met three times to review the transactions and address issues arising from the review. The process was discussed during the QA session at the May National Ambient Air Meeting in Denver and will be a topic at the August AQS meeting.

The workgroup has one or two meetings remaining, which should get us to a stage for external review by a larger audience and then programming.

Authors Contributing to the QA EYE- Have You Got Anything to Say?

We appreciate all those authors contributing to this issue. They include:

Bill Frietsche for his work on the Pb analysis audit article (page 2);

Dennis Mikel, who authored the National Toxics summary on page 3;

Shelly Eberly (Geometric Tools) and Mike McCarthy (STI), who provided the evaluations for the PM2.5 bias estimate piece on pages 4 and 5;

Joe Delwiche (R8) and Chris Hall for contributions to the data validation article on page 6; and

Mark Shanis, who authored the Trace Gas NPAP article on page 8.

We are always looking for interesting articles for the QA EYE. Please take a few moments out of a day to write up something you feel would help the QA community.




EPA-OAQPS
C304-02
RTP, NC 27711

E-mail: papp.michael@epa.gov

The Office of Air Quality Planning and Standards is dedicated to developing a quality system to ensure that the Nation's ambient air data are of appropriate quality for informed decision making. We realize that it is only through the efforts of our EPA partners and the monitoring organizations that this data quality goal will be met. This newsletter is intended to provide up-to-date communications on changes or improvements to our quality system. Please pass a copy of this along to your peers and e-mail us with any issues you'd like discussed.

Mike Papp

Important People and Websites

Since 1998, the OAQPS QA Team has been working with the Office of Radiation and Indoor Air (ORIA) in Montgomery and Las Vegas and with ORD in order to accomplish its QA mission. The following personnel are listed by the major programs they implement. Since all are EPA employees, their e-mail addresses follow the form lastname.firstname@epa.gov.

The EPA Regions are the primary contacts for the monitoring organizations and should always be informed of QA issues.

Program                                           Person            Affiliation
STN/IMPROVE Lab Performance Evaluations           Eric Bozwell      ORIA-Montgomery
Tribal Air Monitoring                             Emilio Braganza   ORIA-LV
Statistics, DQOs, DQA, precision and bias         Rhonda Thompson   OAQPS
Speciation Trends Network QA Lead                 Dennis Crumpler   OAQPS
OAQPS QA Manager                                  Joe Elkins        OAQPS
Standard Reference Photometer Lead                Scott Moore       ORD-APPCD
Speciation Trends Network/IMPROVE Field Audits    Jeff Lantz        ORIA-LV
National Air Toxics Trend Sites QA Lead           Dennis Mikel      OAQPS
Criteria Pollutant QA Lead                        Mike Papp         OAQPS
NPAP Lead                                         Mark Shanis       OAQPS
PM2.5 and Pb PEP Lead                             Dennis Crumpler   OAQPS
STN/IMPROVE Lab PE/TSA/Special Studies            Jewell Smiley     ORIA-Montgomery
STN/IMPROVE Lab PE/TSA/Special Studies            Steve Taylor      ORIA-Montgomery

Websites

Website             URL                                          Description
EPA Quality Staff   EPA Quality System                           Overall EPA QA policy and guidance
AMTIC               http://www.epa.gov/ttn/amtic/                Ambient air monitoring and QA
AMTIC QA Page       http://www.epa.gov/ttn/amtic/quality.html    Direct access to QA programs

