The QA EYE
ISSUE 17
DECEMBER, 2014
EPA OFFICE OF AIR
QUALITY PLANNING AND
STANDARDS
SPECIAL POINTS
OF INTEREST:
• August 2014 National
Ambient Air Monitoring
Conference
• RP transactions going
away for collocated data
in 2015
• RA and RP transactions
going away March 2015
INSIDE THIS
ISSUE:
National Monitoring Conference 1
Assessment Highlight 1
PM2.5 Bias Assessment 2
2015 Guidance Documents for Development 6
Direct NO2 Measurements 7
AQS Corner 8
Review of Units and Decimal Places Reported for Criteria Pollutants 9
Region 4 Training 11
Data Quality Issues in PM2.5 Labs 11
NATTS QA Update 12
Final Thoughts 12
National Ambient Air Monitoring Conference Update
The National Ambient Air Monitoring Conference held August 11-14 in Atlanta had the largest attendance on record at 598. This year we paired up with the AQS conference, which was helpful since monitoring, quality assurance, and data reporting cross lines in many areas, and it was good to have AQS support in training sessions. The OAQPS quality assurance staff was quite busy throughout the week, with a full day of QA training on Monday and a half-day presentation session on Wednesday. Presentations and training materials for the conference are posted on AMTIC at http://www.epa.gov/ttnamti1/2014present.html.

Based on comments from the 2012 conference in Denver, we expanded the QA training course to a full day, and even with that we could not get into too much detail. Over 150 personnel attended the QA training session, and responses to the session were positive. The OAQPS QA staff are contemplating more focused training on specific facets of the QA program for the next conference, really getting into the weeds. OAQPS also tried something new for this conference by setting up an afternoon of round-table questions and answers. Twelve tables were set up, with technical experts manning each table, and attendees floated among the tables to ask questions of the experts. It was quite a lively session and we answered a lot of questions, but there were some comments on ways to improve this activity for the next conference. Many of the articles in this issue are derived from conversations at the conference or e-mails sent in since the last issue.

The conference would not have been a success without the help of the EPA Regions and State/Local/Tribal monitoring organizations. On behalf of the QA Team, we'd like to thank Melinda Ronca-Battista (Northern Arizona University, representing the TAMS Center) and Stephanie McCarthy (EPA R4), who helped out on the training session, and Yousaf Hameed (Clark County, NV) and Susan Kilmer (Michigan Dept. of Environmental Quality) for their help facilitating the presentation sessions. In talking to many people, I was amazed at how many individuals drove from as far away as Chicago and Michigan to cut down on travel expenses. With that kind of effort, we know we need to keep up our own efforts to provide the best conference we can.
Assessments: A Highlight During the Air Monitoring Conference
During the plenary discussion provided by Janet McCabe, Chet Wayland, and Lew Weinstock, some emphasis was placed on the importance of our quality systems. Over the last year or so there have been a number of cases where significant amounts of data have been invalidated due to exceedances of the acceptance criteria described in our QA regulations or methods. In some cases this has affected our ability to make NAAQS decisions. The initial costs of collecting these data, and the additional costs of review and evaluation borne not only by the monitoring agencies but also by EPA regional and headquarters staff, make it very clear that identifying quality issues as soon as possible and taking immediate corrective action is highly beneficial, and cost effective.

Continued on Page 7
-------
PM2.5 Bias Update at Atlanta Ambient Air Conference
The following are selected sections from a report written by Mike McCarthy (Sonoma Technology) and Shelley Eberly (Geometric Tools). The full report will be posted on AMTIC in December. Since about 2007, EPA has noticed an increasing negative bias trend for many of the PM2.5 monitors. At the May 2012 Ambient Air Conference, Mike and Shelley presented findings of their evaluation of the bias issue, which was published in Issue 13 of The QA EYE. EPA asked Sonoma to follow up on feedback received at the 2012 Denver meeting; specifically, to investigate whether sampler cleaning, precision data, and chemical composition data could provide insight into bias trends. The results were presented at the 2014 conference in Atlanta and are summarized below.

Bias and precision were calculated by and compared across method designations. For these analyses, PM2.5 federal reference method (FRM) and federal equivalent method (FEM) designations were considered; these are listed in Table 1. For some analyses, only designations with sufficient sample size were included. These designations were typically 118, 120, 145, and 170.
Table 1. PM2.5 FRM and FEM method designations

  PM2.5 Sampler Manufacturer and Type of Sampler     With WINS   With VSCC
  BGI PQ200/PQ200A Single Channel                       116         142
  R&P 2000 Single Channel                               117         143
  R&P 2025 Sequential Sampler                           118         145
  Andersen/Thermo RAAS2.5-100 Single Channel            119         153
  Andersen/Thermo RAAS2.5-300 Sequential Sampler        120          --
  BAM 1020                                               --         170

Many of the calculations were performed at the seasonal level. Seasons were defined according to the months:

Spring: March-May
Summer: June-August
Fall: September-November
Winter: December-February

The bias database was developed for 2011-2013 by combining PEP data acquired directly from the U.S. EPA and SLT data acquired from the Air Quality System (AQS). The precision database is identical to the one developed for the 2011-2013 PM2.5 Quality Assurance Report, and its development is documented in that report.

What are Current Levels of Bias?

For 2011-2013, bias continues to be negative for most of the FRM methods, as shown in Figure 1. A negative bias means that the SLT concentrations are less than the PEP concentrations. It is noted that bias for the WINS (Well Impactor Ninety-Six) size-selective inlet is more negative than bias for VSCC (Very Sharp Cut Cyclone) inlets.

[Figure 1. 2011-13 bias estimates by method designation: means and 95% confidence intervals of the 2011-2013 bias estimates.]

How Has Bias Been Changing Over Time?

Prior to 2006, annual bias of the most frequently used methods (118, 120, and 145) wiggled between -5% and +5% with no obvious trend, as shown in Figure 2. From 2006 to 2009, annual bias declined. Since 2009, bias for these methods has stayed in a range between -15% and -5%, as shown in Figure 3. For these figures, bias is adjusted for PQAO and season, excludes pairs with concentrations less than or equal to 3 µg/m³, and excludes pairs with bias greater than 50% or more negative than -50%.

[Figure 2. Annual bias for 1999-2006 for three major method designations.]
[Figure 3. Annual bias for 2006-2013 for three major method designations.]

(Continued on page 3)
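Readers who want to reproduce the screening steps described above can start from something like the sketch below. It is a minimal Python illustration, assuming a pandas DataFrame with hypothetical column names (date, pqao, pep, slt); the report's actual adjustment of bias for PQAO and season is a statistical model that this sketch only approximates with a simple grouped summary.

```python
import pandas as pd

def bias_summary(pairs: pd.DataFrame) -> pd.DataFrame:
    """Summarize PEP-vs-SLT bias by PQAO and season.

    `pairs` is assumed to hold one row per collocated PEP/SLT event with
    hypothetical columns: 'date' (datetime), 'pqao' (str), and 'pep' and
    'slt' concentrations in ug/m3.
    """
    d = pairs.copy()
    # Percent difference of the SLT measurement relative to the PEP audit;
    # negative values mean the SLT sampler reads low.
    d["pct_diff"] = 100.0 * (d["slt"] - d["pep"]) / d["pep"]
    # Screening stated in the article: drop pairs with concentrations at or
    # below 3 ug/m3 (applied here to both members of the pair) and pairs
    # with percent differences beyond +/-50%.
    d = d[(d["pep"] > 3.0) & (d["slt"] > 3.0) & (d["pct_diff"].abs() <= 50.0)]
    # Season assignment using the month definitions above.
    month_to_season = {12: "winter", 1: "winter", 2: "winter",
                       3: "spring", 4: "spring", 5: "spring",
                       6: "summer", 7: "summer", 8: "summer",
                       9: "fall", 10: "fall", 11: "fall"}
    d["season"] = d["date"].dt.month.map(month_to_season)
    # Simple grouped means; the published analysis fits a model that
    # adjusts bias for PQAO and season rather than just grouping.
    return d.groupby(["pqao", "season"])["pct_diff"].agg(["mean", "count"])
```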
-------
PM2.5 Bias Assessment Update (Continued from Page 2)
Does Sampler Cleaning Impact Bias?
Why is bias for WINS different than bias for VSCC for SLT but not for PEP? One idea is that bias varies by the elapsed time since the WINS was last cleaned. PEP cleans the WINS and VSCC after each sampling event, whereas SLT cleans the WINS every five sampling events and the VSCC every month. The bias potentially became more negative as the WINS became dirtier, because the cut point became smaller. Data on the elapsed time since the WINS was last cleaned were not readily available, so a surrogate of sampling frequency was used. If we assume that sampling frequency is a proxy for sampler cleaning frequency, then we can hypothesize that daily samplers would have dirtier WINS (i.e., more negative bias) than every-third-day samplers, which would have dirtier WINS than every-sixth-day samplers, due to more frequent sampling. If this is true, then bias from daily samplers should be most negative, bias from every-sixth-day samplers least negative, and bias from every-third-day samplers in the middle.

However, this hypothesis was not consistent with the observed bias grouped by sampling frequency. No consistent relationship between bias and sampling frequency was found for the sampler types with sufficient data.
Are Changes in PM2.5 Composition Contributing to the Bias Trend?
The trend in negative bias in PM2.5 concentrations was compared to trends in PM2.5 composition by analyzing SANDWICH (Sulfate, Adjusted Nitrate, Derived Water, Inferred Carbon Hybrid) data (Frank, 2006) from the Chemical Speciation Network (CSN). The PM2.5 components estimated through the SANDWICH technique include sulfate mass, nitrate mass, crustal mass, organic carbon mass, elemental carbon, and passive mass (filter contamination).

We hypothesized that changes in the relative contributions of PM components (specifically sulfate, nitrate, and organic carbon) may result in changes in PM volatility and may cause decreased retention of PM mass on filters. In this analysis, we selected CSN sites near PQAOs with large negative bias (more negative than -10%) and moderate bias (magnitude between 1% and 10%), and looked for differences in PM2.5 composition and trends.
To test this hypothesis, CSN data from 2006 to 2012 were acquired, and precision and bias data for PQAOs from 2008 to 2010 were referenced to identify representative CSN sites. Twenty CSN sites were selected in or near PQAOs that also had acceptable precision (coefficient of variation less than 7%). Among these sites, 11 were associated with large negative bias, which exceeds the data quality objective (DQO), and 9 were associated with moderate bias, which meets the DQO.
The composition of the PM2.5 SANDWICH data was analyzed for sites with large negative bias and sites with moderate bias. The trends of the mass and of the ratio to PM2.5 of the individual components were examined, and summary statistics were calculated for the ratios of components to PM2.5 mass.

The results indicated that PM2.5 mass shows a declining trend from 2006 to 2012 in both groups of sites (Figure 4, top row). However, variations are seen in the relative contribution of different components of PM2.5 (Figure 4, bottom row) between bias groups.

The top three components of PM2.5 by ratio are sulfate, nitrate, and organic carbon mass. The average, median, and 90th and 10th percentiles as a fraction of PM2.5 mass are displayed in Figure 5. Both the large negative and moderate bias groups of sites show sulfate mass decreasing from an average of 40% of PM2.5 mass in 2006 to about 30% of PM2.5 mass in 2012. In contrast, the ratio of nitrate mass to PM2.5 is higher at sites with greater bias (~12% vs. ~7%). Continued on page 4
[Figure 4. SANDWICHed PM2.5 components, 2006-2012, at the 11 CSN sites with large negative bias (bias < -10%, exceeds DQO) and the 9 CSN sites with moderate bias (bias > -10%, meets DQO): component masses (top row) and ratios to PM2.5 mass (bottom row).]
-------
PM2.5 Bias Assessment Update (continued from page 3)
[Figure 5. Fraction of PM2.5 mass of the sulfate (top), nitrate (middle), and organic carbon (bottom) mass components at CSN sites with large negative bias (left, exceeds DQO) and moderate bias (right, meets DQO), showing average, median, and 90th/10th percentile lines.]
Note that the fraction of nitrate mass peaked in 2009 and 2010, which corresponds to the most negative bias in PM2.5. Finally, there is a large increase in organic carbon mass, from 30-35% in 2006 to 40-45% in 2012. The combined change in PM volatility from declining sulfate, increasing organic carbon, and interannual variability in nitrate has resulted in a more volatile PM2.5 mixture over the 2006 to 2012 time period. The peak in nitrate composition is associated with the most negative bias in PM2.5 concentrations. In addition, the differences in the relative contribution of nitrate and organic carbon mass between the groups of sites coincide with the PM2.5 bias trends.

The findings from this analysis are consistent with the hypothesis that the trend in bias may be associated with changes in the volatile fraction of PM2.5 mass. Note that this does not prove that the trend in PM2.5 mass is associated with the changing composition of PM; additional study is needed to see if these compositional changes are consistent across other monitoring sites and over additional years.
Can Precision Data Give Insight into Bias Trends?

PM2.5 precision data might give insight into bias and trends in bias. Data from collocated monitors that show a consistent relative difference indicate a bias is present in one or both samplers. The slope of the relative differences over time indicates trends in bias in one or both samplers.

Figure 6 shows relative differences from two sites. The site on the left shows ideal precision data: relative differences are tightly clustered between -10% and +10%; the median relative difference is zero, which indicates both samplers have similar bias; and the regression line through the relative differences has a small slope of just -0.3 per year. The precision data for the site on the right show more variability but, more importantly, a median relative difference of -3.8%, suggesting that one or both of the samplers are biased.

[Figure 6. Time-series graphs of relative precision differences at two collocated sites; trends in precision can be indicative of bias. Left, ideal: tight clustering, slope near 0, median near 0. Right, biased: slope near 0, median near -4%. The block shows the median relative difference (%).]
For each collocated site, the median relative difference was calculated for 2011-2013. The distribution of these median relative differences is summarized in Table 2. Calculations exclude pairs with low concentrations (less than or equal to 3 µg/m³) and include site-years with at least 30 pairs. The median was used to reduce the impact of outliers.
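A minimal sketch of this site-level precision summary, assuming hypothetical column names (date, primary, collocated); the exclusions mirror the ones stated above.

```python
import numpy as np
import pandas as pd

def site_precision_summary(site: pd.DataFrame):
    """Median relative difference and trend for one collocated site.

    `site` is assumed to hold hypothetical columns 'date' (datetime) and
    'primary' and 'collocated' concentrations in ug/m3.
    """
    d = site.copy()
    avg = (d["primary"] + d["collocated"]) / 2.0
    # Relative difference of each pair, as a percent of the pair average.
    d["rel_diff"] = 100.0 * (d["primary"] - d["collocated"]) / avg
    # Exclusions stated above: low-concentration pairs, and site-years
    # with fewer than 30 valid pairs.
    d = d[avg > 3.0]
    if len(d) < 30:
        return None
    # Median rather than mean, to reduce the impact of outliers.
    median = d["rel_diff"].median()
    # Trend: least-squares slope of the relative differences, in
    # percentage points per year.
    years = (d["date"] - d["date"].min()).dt.days / 365.25
    slope = np.polyfit(years, d["rel_diff"], 1)[0]
    return {"median_rel_diff_pct": median, "slope_pct_per_year": slope}
```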
Continued on Page 5
-------
PM2.5 Bias Assessment Update (Continued from page 4)
The median of all the site-level median percent relative differences is close to zero for methods 117, 118, 120, and 145, and the 5th and 95th percentiles of the site-level medians are approximately -5% and 5%, respectively. Sites with medians larger than 5% or more negative than -5% should be investigated, because the precision data are showing consistent differences between the two samplers. For method 170 (compared to other method 170 instruments), the precision data suggest large biases. This is based on a limited number of sites, but the data indicate strong biases (e.g., +33% or -20%).
Table 2. Distribution of site-level median relative percent differences.

[Table 2 lists, for collocated method pairs in each year 2011-2013, the number of pairs, the number of sites, and the 5th percentile, median, and 95th percentile of the site-level median relative differences.]
Graphing relative differences as time series may provide information beyond precision or bias. Figure 7 is for a site with a continuous monitor collocated with an FRM monitor. The graph clearly shows the seasonal differences in PM2.5 collected by the two methods. The regression line suggests a large upward slope in the relative differences of 8 per year, meaning that the average relative difference is growing by 8 percentage points each year.

[Figure 7. Relative differences from precision data for a site with a continuous primary monitor collocated with an FRM monitor.]

Conclusions

Analyses to date have not found one clear cause for the downward trend in bias seen between 2007 and 2010. The decrease in PM2.5 concentrations and its impact on the bias statistic may contribute somewhat. Trends in PM2.5 speciation, namely nitrate, sulfate, and organic carbon, also may contribute. But these together do not seem sufficient to explain a drop of nearly 10% in bias. Several attendees at the 2014 conference offered other suggestions for contributors to the trend. These suggestions are summarized in the next section.
Feedback at the 2014 National Ambient Air Monitoring Conference
The first set of comments is directly related to bias and the
second set to broader quality assurance issues.
Does the method for cleaning filter cassettes impact bias? One person described calling several labs and finding that there is no consistency in the cleaning of filter cassettes. Changes in cleaning protocols show up in trip blanks. If these blanks start to drift up, contact the lab to ask for better or different cleaning methods, then assess whether trip blanks drop after making this change.
Are monitors properly maintained? For example, are the PM10 heads properly maintained, especially the O-rings? When doing flow rate checks, is the PM2.5 head removed incorrectly? As a corollary, did cuts to staffing and operational budgets in the 2007-2010 recession result in improper maintenance? At first, even with cuts, monitor and lab personnel did what they had to do to maintain the instruments and process filters. As the cuts continued and deepened, the operators could not continue to cover all bases, so things began to be maintained less thoroughly. Around 2010, a new norm for field maintenance and lab operation was found, at which time bias began to stabilize.
Does filter retrieval time impact bias? PEP filters are retrieved the morning after sampling; SLT filters are retrieved within a week. One suggestion was to collocate two PEP samplers with an SLT sampler: collect one PEP filter according to PEP protocol and collect the other PEP filter when the SLT filter is collected, with both filters going through the PEP lab. Another suggestion was the same collocation, except that the SLT collects the second PEP filter when the SLT filter is collected, and that filter goes through the SLT lab. The first approach addresses filter retrieval time only. The second approach combines more possible causes for differences: filter retrieval time, laboratories, and filter transportation, to name a few. Some people suggested that it would be possible and reasonable to acquire dates of sampling, retrieval, and weighing for some PQAOs and/or states.
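If those dates were acquired, summarizing the elapsed times would be straightforward. A minimal sketch, with hypothetical column names and made-up dates:

```python
import pandas as pd

# Hypothetical filter-handling dates for one PQAO, one row per filter.
filters = pd.DataFrame({
    "sampled":   pd.to_datetime(["2014-01-02", "2014-01-05"]),
    "retrieved": pd.to_datetime(["2014-01-03", "2014-01-09"]),
    "weighed":   pd.to_datetime(["2014-01-20", "2014-02-01"]),
})
# Elapsed times that the suggested data collection would support
# comparing against bias at the same sites.
filters["retrieval_days"] = (filters["retrieved"] - filters["sampled"]).dt.days
filters["weighing_days"] = (filters["weighed"] - filters["sampled"]).dt.days
print(filters[["retrieval_days", "weighing_days"]].describe())
```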
Continued on page 6
-------
PM2.5 Bias Assessment Update (Continued from page 5)
Does filter equilibration time in the lab affect bias? Regulations require filters to equilibrate in the analytical lab for at least 24 hours, but there is no upper limit on equilibration time. Do weights change as filters sit longer?
In addition, there were some comments not specifically related to bias but that likely impact bias:

Regarding the use of primary monitors. Comparing PEP or precision collocation results only for monitors identified as primary is too limited, because another monitor's data may be used when the primary monitor does not record data. Maybe PEP should be used to evaluate all monitors at the site, possibly separating results for primary monitors and all others. The same is true for precision.
Regarding method codes. There are cases of monitors running WINS in one season and VSCC in another, yet all data are reported under just one method code. There are other cases where the field changes the separator, but the lab does not know this and reports the data to AQS under the incorrect method code. It is believed that the incorrect coding goes both ways: VSCC monitoring data are reported as WINS, and WINS monitoring data are reported as VSCC.
Regarding the cassette changer. Some have indicated that the new Thermo cassette changer is not rotating properly. This is impacting completeness, and data from other monitors are substituted more often than one might expect. Completeness less than 70% is not uncommon.

Regarding the mean-adjusted coefficient of variation (CV). In the original PM2.5 rule, the CV was not mean-adjusted. Later, the CFR was changed so that the CV is now mean-adjusted and more accurately reflects variability only, not systematic differences. The mean adjustment is useful information, and it was suggested that the mean be stated on AQS reports so that drifts in precision can be more easily identified. Showing the mean would also allow people to aggregate CV over other time intervals or other spatial areas; without the mean, there is not enough information on the AQS reports to complete aggregations.
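For readers unfamiliar with the statistic, the sketch below shows the mean-adjusted CV as we read the current Appendix A formula: percent differences with the mean removed, the variability apportioned between the two samplers, and a chi-squared factor giving a 90% upper confidence bound. Treat it as illustrative, not as the regulatory text.

```python
import numpy as np
from scipy import stats

def cv_upper_bound(x: np.ndarray, y: np.ndarray) -> float:
    """Mean-adjusted coefficient of variation for collocated pairs x, y.

    Sketch of the Appendix A statistic as we read it: the mean of the
    percent differences is removed, so the result reflects variability
    only, not systematic differences between the samplers.
    """
    d = 100.0 * (x - y) / ((x + y) / 2.0)   # percent difference per pair
    n = len(d)
    # Mean-adjusted variance of d; the factor of 2 apportions the
    # variability between the two samplers.
    var = (n * np.sum(d**2) - np.sum(d)**2) / (2.0 * n * (n - 1))
    # 90% upper confidence bound via the 10th percentile of chi-squared.
    return float(np.sqrt(var) * np.sqrt((n - 1) / stats.chi2.ppf(0.1, n - 1)))
```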
References

Eberly S. and McCarthy M.C. (2012) Bias in PM2.5 filter-based methods. Presented at the National Air Quality Conference: Ambient Monitoring, Denver, Colorado, May 14-17, by Geometric Tools, LLC, WA, and Sonoma Technology, Inc., Petaluma, CA. STI-910320-5366.

Eberly S., McCarthy M., and Huang S. (2014) Bias in PM2.5 filter-based methods. Presented at the National Ambient Air Monitoring Conference, Atlanta, Georgia, August, by Geometric Tools and Sonoma Technology, Inc. STI-910515-6031.

Eberly S.I. and McCarthy M.C. (2013) 3-year quality assurance report: calendar years 2008, 2009, and 2010, PM2.5 ambient air monitoring program. Draft report prepared for the U.S. Environmental Protection Agency, Research Triangle Park, NC, by Sonoma Technology, Inc., Petaluma, CA, STI-910407-5712-DR, July.

Frank N.H. (2006) Retained nitrate, hydrated sulfates, and carbonaceous mass in federal reference method PM2.5 for six eastern U.S. cities. J. Air Waste Manage. Assoc., 56(4), 500-511, April.
2015 Guidance Documents for Development
Over a number of years, EPA has been asked to pursue a number of guidance documents that we just have not had time to complete. The following are what we hope to have drafted by this time next year:

QA Handbook Vol II. The draft we completed in 2013 will be revised, with an anticipated January 2016 publish date. We plan to have a Handbook Review Workgroup conference call in January 2015.

Flow Rate Transfer Standard Guidance. Similar to the ozone transfer standard document, we plan to develop guidance for flow rates so that monitoring organizations can certify a Level 2 primary standard in their labs that would be verified/certified annually, and then use their Level 2 standard to verify the transfer standards used in the field. The goal would be to have a draft completed by mid-year for review and a published version by the start of 2016. Some work has already been done on this guidance.

Electronic Notebooks. EPA does not have a policy in place for electronic notebooks. Monitoring organizations have asked about this, and a few organizations have developed procedures for their internal use. OAQPS planned to hold a webinar in November 2014 to illustrate one potential technique, but it was postponed until the new year. Organizations who have been pursuing these techniques or are interested in the webinar should contact Mike Papp (see back page).
-------
Data Quality Assessments (continued from Page 1)
Region 4 has taken a proactive role in the data quality arena by offering four three-day training sessions on ambient monitoring for the State/Local/Tribal monitoring agencies in the Region. Although the training provides good coverage of many facets of air monitoring (grants, siting, network design, etc.), a good portion of the time is spent on quality assurance activities, including QMP/QAPP/SOP development, quality control implementation, data reporting, and assessment. See the article on page 11 for additional information about this training.
Technical systems audits have come up on the radar screen, and OAQPS and the Regions will be working together this year to talk about the various ways the Regions perform the audits, identify some innovative ways to assess data quality, and develop some consistency in auditing across the Regions. Greg Noah will be coordinating with the Regions on this activity.
In reviewing some of the data quality issues that have occurred over the last year, it is also apparent that some QAPPs lack the detail necessary to defend the operations in place at some monitoring organizations. As one goes through an audit of data quality or performs a technical systems audit, issues related to data quality will be identified. As we review this information, EPA steps through the following process:

• What does CFR say?
• What does the QA Handbook say?
• What does the monitoring organization's QAPP say?

CFR and the QA Handbook Vol II will always (should always) say the same thing with regard to a regulation. The Handbook does add additional checks and acceptance criteria that represent the best guidance available at the time of publication. If a monitoring organization's QAPP is in conflict with a regulation, this should be caught and corrected, unless it is intentional and there is some documentation that EPA has accepted the alternative. Handbook guidance can be modified for a QAPP, but changes to the consensus-established acceptance criteria in the Handbook should be identified to QAPP reviewers, along with some rationale for why the criteria have been changed. In most cases it's not the information in the QAPP that's a problem; it's the lack of information. In preparation for technical systems audits, EPA reviews data from various AQS reports. When issues related to data quality are found in those reports, the auditors will go back to the monitoring organization's QAPPs/SOPs to find the process they use to validate data. In some cases there is not enough detail in these QAPPs/SOPs to determine what corrective action procedures are taken.
Revisions to Our Handbook Guidance on QAPP Preparation

We are currently in the process of revising the 2013 QA Handbook. Over the last year or so we have found a few errors that need fixing. In addition, we will be incorporating validation templates for the NCore network. We are also going to add language to the Handbook, as well as issue a technical memo to the EPA Air Directors, that will require any monitoring agency that plans to deviate from any CFR requirement (that does not have an EPA technical memo allowing for the deviation) to work with the EPA Regions on approval of this deviation prior to QAPP implementation. If the deviation is approved, it will be formally documented in the EPA approval section of the QAPP. If QAPP approval has been delegated to the monitoring organization, EPA expects a memo to the EPA Regions on any CFR deviations.
Direct NO2... GPT... IPN... NPN... Help!!!
With the approval of the Teledyne Advanced Pollution Instrumentation Model T500U CAPS (API CAPS) and the Environnement S.A. Model AS32M CAPS nitrogen dioxide analyzers as federal equivalent methods (FEMs), questions have come up regarding the manner in which the instruments can be calibrated and checked. These instruments provide a direct reading of NO2 and can be calibrated with an NO2 standard rather than using gas phase titration (GPT), which is what is called for in the reference method (40 CFR Part 50, Appendix F). With the calibration requirement in place for GPT, monitoring organizations are wondering what position EPA has on the newer direct-reading technology. The API CAPS operations manual covers both NO2 cylinder and GPT calibration techniques. However, it would certainly defeat the purpose of purchasing the monitor if one needed to purchase additional equipment for GPT. Russell Long, Research Chemist in ORD's National Exposure Research Laboratory, is aware of these questions. ORD has a number of CAPS analyzers, in addition to other direct-measuring NO2 instruments, and will be evaluating calibration and challenge procedures involving NO2 cylinders and GPT over the next few months. If the results from testing using both techniques are comparable, EPA could issue a technical memo allowing the use of NO2 gas standards for these direct methods.

Similarly, ORD is also currently investigating calibration/challenge issues associated with NOy analyzers. In particular, ORD is looking into the use of IPN and NPN gases during the required biweekly 1-point QC check for NOy, versus using NO2 via cylinder or GPT (see QA EYE Issue 12, pg. 8).
-------
AQS Corner
March 2015: No More RP and RA Transactions
For the past few years we have been advocating the use of the QA transactions that have been developed and in use for a few years now. Starting in March 2015, the QA transactions will be the only procedure available for reporting quality assurance data to AQS. The AQS team held a number of training sessions in 2013 and 2014 to train AQS users on the new transactions. In addition, we will be advertising additional training sessions in January and February and posting those on AMTIC and the AQS website.
Data Certification Updates

We have been through two years of the annual data certification/concurrence process, and each year the system gets more refined. We thank the monitoring agencies and EPA Regions for identifying issues; we do our best to resolve them as soon as possible. A few things that we have refined from last year include:

M-Flag. We had a request that the M-flag, which identifies data that have been modified, be placed on data that have been initially certified by the certifying agency but then changed before being evaluated by the EPA Regions. We recognize that if the monitoring organization recertifies the modified data prior to the EPA evaluation, the M-flag will be removed. In fact, EPA encourages recertification of modified data.

U-Flag. The U-flag identifies data that should be certified but have not been. Starting with the 2014 certification (due May 1, 2015), AQS will apply a U-flag to any data uncertified after July 1.
QA Collocation Data Will Be Reported as Raw Data in 2015

Since 2006 (see QA EYE Issue 2, page 5), EPA has been advocating the use of primary monitors, the identification of the CFR-required QA collocated monitor in the collocations table, and the submission of the collocated data as raw data. This eliminates the need for monitoring organizations to submit a precision transaction (RP) for this information. Now that the new QA transactions are complete, use of the RP transaction for collocated data will be eliminated in December 2014.

In order to implement this reporting procedure, the primary monitor and the collocated monitor must be identified in the "Monitors Collocation Period" using the "MJ" transaction for the primary and collocated monitor. Contact the AQS helpline for further information and help setting this up.
QMPs Now Posted and TSAs Open for Business
The National Air Data Group (they run AQS) has posted the quality management plan (QMP) dates from the last Excel report that was posted on AMTIC in 2011. The hard part of the initial posting is now completed, so our expectation is that the QMPs, like the QAPPs, can be kept up to date by either the monitoring organizations or the Regional QA staff.

As mentioned in the last QA EYE (Issue 16), technical systems audit data can be posted to AQS. This is optional for internal monitoring organization audits, but we expect the Regional TSAs to be posted. We would hope to have 2014 TSAs posted, but minimally this will start in 2015.
-------
A Review of Units and Decimal Places Reported for Criteria Pollutants
A review of the units and decimal places used for reporting criteria pollutants and related parameters has been performed. This was done because different monitoring objectives (e.g., NAAQS, AQI, trends, modeling) may necessitate submitting more decimal places than might otherwise be expected for a pollutant. For example, CO calculations for the AQI use units of ppm truncated to one decimal place; however, for stations using CO trace gas analyzers, the method can provide statistically significant data down to the ppb level. Having CO data available to the ppb level allows for better use of the information in model evaluation and other data uses. Therefore, we encourage monitoring agencies to submit data to the units and decimal place where the method can provide statistically significant data, and to allow the data systems (AQS and AIRNow) to perform the appropriate computations.
Background

Table 14.1 of the QA Handbook Volume II provides a summary of the expected standard units and number of decimal places to report. We have replicated that table here (Table 1) with some additional notes to accommodate methods that may vary from what is normally expected.

AQS Reporting Notes:

Data may be reported in standard or other available units. For example, CO can be reported as raw data in units of ppm or ppb; however, all raw data are converted and saved in standard units (e.g., CO standard units are ppm).
• Data are saved both as raw data and again in standard units.
• Historically, up to 5 digits to the right of the decimal place could be loaded and stored in AQS; however, AQS was recently modified to allow more digits as necessary.
• Summary scale. This AQS field provides the number of decimal places available in standard units. For conventional CO methods, the summary scale is one, with standard units of ppm (e.g., a value of 0.6 ppm), while trace gas methods have a summary scale of 3 in units of ppm (e.g., 0.226 ppm). For all other parameters, the summary scale is the same whether it's a conventional or trace gas method (i.e., one for all SO2 and NO/NO2 methods in units of ppb).
• It's perfectly acceptable to report more decimal places than expected for a pollutant, as AQS will appropriately handle all computations; however, never report fewer decimal places than expected. Note that the CFR appendices quoted in Table 1 call for truncation, not rounding; a short sketch of the difference follows this list.
• For NO/NOy, use the same units and decimal places as NO2 (i.e., ppb to one decimal place).
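Because the appendices call for additional digits to be truncated with no further rounding, ordinary rounding is not a substitute. A one-function sketch of the difference (the helper name is ours):

```python
import math

def truncate(value: float, places: int) -> float:
    """Truncate (not round) a non-negative concentration to a fixed
    number of decimal places, as the CFR appendices quoted above require."""
    factor = 10 ** places
    return math.trunc(value * factor) / factor

# Example: ozone reported in ppm to the third decimal place.
print(truncate(0.10879, 3))  # 0.108 (truncated)
print(round(0.10879, 3))     # 0.109 (rounded; not what the CFR asks for)
```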
Continued on page 10
Table 1. QA Handbook Table 14.1 with additional notes. For each pollutant: standard units, expected decimal places, an example value, the minimum reporting requirement as described in 40 CFR Part 50, and notes providing additional information not described in Table 14.1.

PM2.5 (units: µg/m³; decimal places: 1; example: 10.2). Shall be reported to AQS in micrograms per cubic meter (µg/m³) to one decimal place, with additional digits to the right being truncated (App. N). Note: the Met One BAM 1020 provides hourly data in whole numbers (i.e., there are no decimal places when using hourly µg/m³); therefore, hourly concentration data from this method should be reported as whole numbers.

PM10 (units: µg/m³; decimal places: 1; example: 26.2). No description found.

Lead (Pb) (units: µg/m³; decimal places: 3; example: 1.525). Pb-TSP and Pb-PM10 measurement data are reported to AQS in units of micrograms per cubic meter (µg/m³) at local conditions (local temperature and pressure, LC) to three decimal places; any additional digits to the right of the third decimal place are truncated (App. R).

O3 (units: ppm; decimal places: 3; example: 0.108). Hourly average concentrations shall be reported in parts per million (ppm) to the third decimal place, with additional digits to the right of the third decimal place truncated (App. P).

SO2 (units: ppb; decimal places: 1; example: 35.1). Reported to AQS in units of parts per billion (ppb), to at most one place after the decimal, with additional digits to the right being truncated with no further rounding (App. T).

NO2 (units: ppb; decimal places: 1; example: 53.2). Reported to AQS in units of parts per billion (ppb), to at most one place after the decimal, with additional digits to the right being truncated with no further rounding (App. S).

CO (units: ppm; decimal places: 1; example: 2.5). No description found. Note: we recommend reporting trace gas CO analyzers to three decimal places (in ppm) to take advantage of the higher sensitivity of these methods.

PM10-2.5 (decimal places: 1; example: 10.2). No description found; follow the PM2.5 requirements.
-------
Decimal Place Reporting for Criteria Pollutants (continued from page 9)
For data submitted to AQS, how many decimal places are being included?

We reviewed reporting of criteria and related pollutants, and in almost all cases we are seeing either the appropriate number of decimal places or more than expected. However, for reporting of CO with trace gas methods, we see a number of sites with fewer decimal places than expected (i.e., fewer than 3). Figure 1 is a histogram of 2013 CO hourly data for values below 1 ppm. Note: the large spike at 0.25 is due to use of the ½ MDL substitution policy with conventional methods. The MDL of conventional CO FRMs is 0.5 ppm; the MDL of trace gas methods is 0.020 ppm. The mean of all hourly data collected in 2013 is 0.321 ppm. The mean of hourly data from methods having an MDL of 0.020 ppm or better is 0.247 ppm.

[Figure 1. Histogram of 2013 hourly CO sample measurements below 1 ppm.]
Figure 2 is a histogram of hourly data from methods having an MDL of 0.020 ppm or better (trace gas methods) and concentrations of 1 ppm or less. The figure illustrates how values reported with fewer decimal places than expected (1 or 2 instead of 3) result in data being grouped at the tenths or hundredths of a ppm. If all trace gas data for CO were reported with 3 decimal places, we would expect a smoother histogram than this.
Data submitted for CO trace gas monitors and number of decimal places

This review was generated by looking at CO monitors with an MDL of 0.020 ppm or better. Monitoring data were screened for at least a thousand hours of data between -0.2 and 2.0 ppm reported in ppm to either 1 (tenths of a ppm), 2 (hundredths of a ppm), or 3 (thousandths of a ppm) decimal places. A total of 99 trace gas CO monitors were available in AQS to support this review. All of these monitors should report to 3 decimal places.

• Reporting to 1 decimal place: 9 monitors
• Reporting to 2 decimal places: 28 monitors
• Reporting to 3 decimal places (as expected): 62 monitors

Regions will be notified of the 37 trace gas CO monitors where we do not see ppm reported with the expected 3 decimal places.
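A sketch of how such a screen might be automated, assuming the reported values are available as text so that trailing decimal places are preserved (column names are hypothetical):

```python
from decimal import Decimal
import pandas as pd

def decimal_places(text: str) -> int:
    """Digits after the decimal point in a value as reported ('0.226' -> 3)."""
    exponent = Decimal(text).as_tuple().exponent
    return -exponent if exponent < 0 else 0

# Hypothetical hourly submissions: the value string as reported, in ppm.
co = pd.DataFrame({
    "monitor_id": ["A", "A", "B", "B"],
    "raw_value":  ["0.2", "0.3", "0.226", "0.204"],
})
co["places"] = co["raw_value"].map(decimal_places)
# Trace gas CO monitors are expected to report 3 decimal places in ppm.
too_coarse = co.groupby("monitor_id")["places"].max() < 3
print(too_coarse[too_coarse].index.tolist())  # ['A']
```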
[Figure 2. Histogram of hourly CO sample measurements from trace gas methods (MDL 0.020 ppm or better) at or below 1 ppm.]
-------
Region 4 on the Training Trail
Over the last 7 months, Region 4 monitoring staff in Atlanta and QA staff in Athens have teamed up to put on an impressive three-day air monitoring and QA training course that they have taken on the road. The first session kicked off in Nashville, TN (June 24-26), and the course has since been to Athens, GA (Oct 7-9) and Orlando (Nov 4-6). A future session is scheduled for Montgomery, AL (Feb 3-5), and Florida has asked for a repeat engagement. The training is primarily focused on the technical monitoring staff, with the goal of demonstrating how important their day-to-day work is in ensuring that data are acceptable for NAAQS attainment decisions. The training went through many facets of the monitoring program, from grants, to network design, implementation of the monitoring networks, attainment demonstrations, and our major regulations in 40 CFR Parts 50, 53, and 58, to data quality and data quality assessment. Each phase was covered in enough detail to identify the important aspects and to have interactive conversations on its importance to the ambient air program.

The Team did a great job describing the technical systems audit process, how they prepare for a TSA, and what they look for. The Team found many data quality related issues right in AQS reports. They went through these reports with the participants and asked them what they saw. Through dialogue at the training session, the participants were able to identify the data quality issues that should have been corrected prior to data submittal. Using this process, the Team was able to train the participants on what to pay attention to in their day-to-day operations so that data quality issues can be resolved before they become larger problems. A lot of effort has been put into this training program. Kudos to:

The Atlanta staff: Todd Rinck, Darren Palmer, Dan Garver, and Ryan Brown.
The Athens staff: Laura Ackerman, Richard Guillot, Doug Jager, Mike Crowe, and Stephanie McCarthy.
Data Quality Issues in PM 2.5 Labs
Over this last year, serious data quality issues of concern to EPA have been identified in a number of PM2.5 weighing laboratories. Some of these issues have resulted in the invalidation of enough data to defer NAAQS decisions. There is a similar thread to many of these data quality issues: they derive from the laboratory conditioning requirements.

Temperature and Humidity Requirements

40 CFR Part 50, Appendix L describes the PM2.5 field and laboratory methods. Requirements include:

Mean temperature: 20-23 °C
Temperature control: ±2 °C over 24 hours
Mean relative humidity: 30-40%
Relative humidity control: ±5% over 24 hours

Filters must be conditioned at the same conditions (relative humidity within ±5 percent) before both the pre- and post-sampling weighings.
Some exceedances of these requirements were identified while performing technical systems audits and reviewing laboratory data. Some labs did not have data recording devices to document adherence to mean conditions or to demonstrate control over 24-hour periods. The monitoring guidance calls for the temperature (±2 °C over 24 hours) and humidity (±5% over 24 hours) control requirements to be evaluated as a 24-hour standard deviation of the temperature or humidity values. In order to evaluate this, data recordings of these conditions (5-minute values are suggested) are required to confirm the mean and a reasonable standard deviation over this time period. Some labs weighed and reported data within the required temperature and humidity conditions but could not document that conditions were in control. There were also cases where the labs could not document that the relative humidities at pre-weighing (prior to sampling) and post-weighing (after sampling) were within ±5 percent.
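A sketch of the kind of check a lab could run on its own logger records, using the mean and 24-hour standard deviation reading of the control requirements described above (the file and column names are hypothetical):

```python
import pandas as pd

# Hypothetical 5-minute logger file with columns: timestamp, temp_c, rh_pct.
log = pd.read_csv("weighlab_log.csv", parse_dates=["timestamp"],
                  index_col="timestamp").sort_index()

# Rolling 24-hour mean and standard deviation of the 5-minute values.
roll = log.rolling("24h")
checks = pd.DataFrame({
    "temp_mean_ok": roll["temp_c"].mean().between(20.0, 23.0),
    "temp_sd_ok":   roll["temp_c"].std() <= 2.0,
    "rh_mean_ok":   roll["rh_pct"].mean().between(30.0, 40.0),
    "rh_sd_ok":     roll["rh_pct"].std() <= 5.0,
})
# Any False flags mark 24-hour windows that could not be defended
# during an audit of the conditioning environment.
print(checks.all(axis=1).value_counts())
```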
EPA is aware that samples need to be weighed within 10-30 days of the time they are sampled, and this requirement puts considerable pressure on laboratories to weigh these filters quickly. However, our best guidance is to not weigh filters outside the requirements in 40 CFR Part 50, Appendix L. Weighing labs should consider alternate facilities when conditions cannot be achieved in their laboratories. EPA has a national weighing contract that can be quickly implemented to cover weighing activities when necessary. Contact your EPA Region for more information on this contract.
-------
NATTS Update
Dave Shelow and Greg Noah have instituted some changes to the NATTS proficiency test (PT) program, which were discussed at the Atlanta Ambient Air Conference. The PT results will be presented to monitoring organizations in three different comparisons (a simple numeric illustration follows this list):

• Against the spiked value reported by the QA contract lab;
• Against the average of the concentrations reported by three referee laboratories; and
• Against the average of the concentrations reported by the NATTS labs.
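As an illustration of the three comparisons, with made-up numbers:

```python
# Made-up PT results for one analyte at one lab (ug/m3).
spiked = 1.00        # spiked value reported by the QA contract lab
referee_mean = 0.97  # average of the three referee laboratories
natts_mean = 0.95    # average of the participating NATTS labs
reported = 0.91      # this lab's reported concentration

for name, ref in [("spiked value", spiked),
                  ("referee average", referee_mean),
                  ("NATTS average", natts_mean)]:
    print(f"vs {name}: {100.0 * (reported - ref) / ref:+.1f}%")
```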
The labs seem to be on board with reporting the data by these three approaches. In the past we have had only one referee lab; we feel having three labs will provide more confidence in the PT values.

In addition, EPA will continue to review the data by the Youden technique described in QA EYE Issue 16 to determine whether, within the population of NATTS laboratories, there are any labs that are statistically different or that consistently report lower or higher results than the other NATTS labs.
NATTS technical systems audits will be scheduled for 2015; QA EYE Issue 16 (page 9) provided some discussion on this. We are committed to working with the EPA Regions to schedule visits so the Regions can participate. In addition, we have made a change in how we plan to report the results of these TSAs: we will report findings, observations, and comments, having replaced the term recommendations with the term comments. We are also committed to following up on findings to ensure corrective action is addressed.
The NATTS 2011-2012 QA Annual Report (QAAR) has been posted to AMTIC at http://www.epa.gov/ttnamti1/airtoxqa.html. Recommendations from the QAAR include:

• Require the reporting of MDLs to AQS;
• Include fields in AQS to specify the meaning of various POCs, and require the population of these fields;
• Include fields in AQS to capture the results of ongoing flow audits performed by the monitoring agencies, and require the population of these fields;
• Standardize the units of concentration used in AQS, and require that results be uploaded in these units only.
Final Thoughts
The National Ambient Air Monitoring Conference was a wonderful opportunity to discuss many QA related issues with monitoring organizations. We received a lot of questions and are trying to pursue as many as we can get to.

We've had questions about the low-level annual performance evaluations and whether the low-level audits are meeting the acceptance criteria. Similarly, our new Appendix A proposal lowers the 1-point QC check concentration and requires selecting the 1-point QC concentration at the mean or median concentration of data within the PQAO. We will be doing a major assessment of routine concentration data, as well as of how well the monitoring organizations attempting these lower concentrations are achieving the low-level audits.
We had some questions about siting criteria and how far one takes the requirements about obstacles. The monitoring regulations require that the distance from an obstacle to the probe, inlet, or monitoring path be at least twice the height that the obstacle protrudes above the probe, inlet, or monitoring path; for example, an obstacle rising 5 meters above the inlet must be at least 10 meters away. But if you monitor in a valley or at the base of a hill, can a mountain range or hill count as an obstacle? How far does one need to take the siting requirements? We plan to address these and other questions this year and include them in our next Handbook revision in 2016.
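The arithmetic of the 2:1 criterion itself is simple; a minimal sketch (the function and argument names are ours):

```python
def obstacle_siting_ok(distance_m: float, protrusion_m: float) -> bool:
    """Siting check described above: the distance from the obstacle to the
    probe, inlet, or monitoring path must be at least twice the height the
    obstacle protrudes above it."""
    return protrusion_m <= 0 or distance_m >= 2.0 * protrusion_m

print(obstacle_siting_ok(distance_m=12.0, protrusion_m=5.0))  # True
print(obstacle_siting_ok(distance_m=8.0, protrusion_m=5.0))   # False
```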
-------
EPA-OAQPS
C304-02
RTP, NC 27711
The Office of Air Quality Planning and Standards is dedicated to developing a quality system to ensure that the Nation's ambient air data is of appropriate quality for informed decision making. We realize that it is only through the efforts of our EPA partners and the monitoring organizations that this data quality goal will be met. This newsletter is intended to provide up-to-date communications on changes or improvements to our quality system. Please pass a copy along to your peers and e-mail us with any issues you'd like discussed.

E-mail: papp.michael@epa.gov

Mike Papp
Key People and Websites
Since 1998, the OAQPS QA Team has been working with the Office of Radiation and Indoor Air (ORIA) in Montgomery and Las Vegas and with ORD in order to accomplish its QA mission. The following personnel are listed by the major programs they implement. Since all are EPA employees, their e-mail address is: lastname.firstname@epa.gov. The EPA Regions are the primary contacts for the monitoring organizations and should always be informed of QA issues.

Program                                          Person            Affiliation
STN/IMPROVE Lab Performance Evaluations          Eric Bozwell      ORIA-Montgomery
Tribal Air Monitoring                            Emilio Braganza   ORIA-LV
Speciation Trends Network QA Lead                Dennis Crumpler   OAQPS
OAQPS QA Manager                                 Joe Elkins        OAQPS
Standard Reference Photometer Lead               Scott Moore       ORD-APPCD
National Air Toxics Trend Sites QA Lead          Greg Noah         OAQPS
Criteria Pollutant QA Lead                       Mike Papp         OAQPS
NPAP Lead                                        Mark Shanis       OAQPS
PM2.5 PEP Lead                                   Dennis Crumpler   OAQPS
Pb PEP Lead                                      Greg Noah         OAQPS
Ambient Air Protocol Gas Verification Program    Solomon Ricks     OAQPS
STN/IMPROVE Lab PE/TSA/Special Studies           Jewell Smiley     ORIA-Montgomery
STN/IMPROVE Lab PE/TSA/Special Studies           Steve Taylor      ORIA-Montgomery
Websites

Website             URL                                         Description
EPA Quality Staff   EPA Quality System website                  Overall EPA QA policy and guidance
AMTIC               http://www.epa.gov/ttn/amtic/               Ambient air monitoring and QA
AMTIC QA Page       http://www.epa.gov/ttn/amtic/quality.html   Direct access to QA programs
------- |