EPA OFFICE  OF AIR
QUALITY  PLANNING
AND STANDARDS
SPECIAL
POINTS OF
INTEREST:

•  AA-PGVP produces 1st Annual Report. Results acceptable.

•  New concepts for gaseous QC samples.

INSIDE THIS ISSUE:

PM2.5 Continuous Monitoring Data    1
AA-PGVP Annual Report               1
Air Toxics Meeting                  3
PEP/NPAP Training                   5
Method 2.12 Correction              5
One Man's Opinion                   6
Pb Audit Strips                     7
8x10 Filters... Do Sides Matter?    7
Sunset Carbon Evaluation            8
NOy Audit Progress                  8
TAMS Center QA                      9
Jigs for URGs                      10
Web Entry for Pb PEP Collocated
Samples                            10
The QA EYE
ISSUE 11
                                                                         MAY, 2011
2008-2010 PM2.5 Continuous Monitors... How Do They Compare to the FRMs?
OAQPS and the monitoring
community have been very
interested in the pursuit of
continuous PM2.5 methods in
order to alleviate the filter
preparation/analysis burdens
and the slower reporting that
comes with filter based meth-
odologies.  In 2008, the first
continuous method was ap-
proved as an FEM.  Since then,
five more continuous methods
have received approval.

With approval and  deployment
there has been some concern
about performance. Perform-
ance, in this case, is defined as
how well the continuous 24-
hour value compares to a fed-
eral reference method collo-
cated at the same location. In
some areas/sites, they perform
very well; in other areas they
do not.  At this point we are
unsure whether these differ-
ences are related to instrument
malfunction, operator related
issues, or environmental issues
(constituents of PM, temp, hu-
midity, etc.).  It is  likely that all
three play some role.

Based on observations made at
the 2010 NACAA Steering
Committee Meeting, OAQPS
started looking at sites that had
collocated continuous and FRM
data.  Data for this evaluation
was pulled from AQS for the
years 2008-2010.  Any sites
with collocated samplers were
evaluated (not just "required"
collocated sites for QA). Every
sampler (method designation/
POC) was paired with every
other PM2.5 sampler/monitor at
the site. Data from a site was
included if the site had > 15
sample pairs in order to have a
reasonably representative num-
ber of data points. The following are some of our preliminary evaluations. The statistics used to estimate precision and bias are those specified in 40 CFR Part 58 Appendix A.

Precision  Evaluation
Summary

The coefficient of variation (CV) was estimated for each site from its sample pairs.
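For readers who want to reproduce this screening, a minimal sketch of the per-site statistic is shown below. It assumes the collocated-pair coefficient-of-variation form (paired relative percent differences, with the paired variability split between the two instruments) and omits the upper confidence bound factor that Appendix A applies; the function and data are illustrative, not the AQS implementation.

    from math import sqrt

    def site_cv(pairs):
        """CV (%) for one site's collocated 24-hr pairs; assumes the
        evaluation's cut of more than 15 valid pairs per site."""
        # Relative percent difference of each pair, in percent.
        d = [100.0 * (x - y) / ((x + y) / 2.0) for x, y in pairs]
        n = len(d)
        mean_d = sum(d) / n
        # Sample variance of the differences; dividing by 2 apportions
        # the paired variability to a single instrument.
        var_d = sum((di - mean_d) ** 2 for di in d) / (n - 1)
        return sqrt(var_d / 2.0)

    # Illustrative data: 16 pairs from two instruments tracking closely.
    pairs = [(10.2, 10.5), (8.1, 7.9), (12.4, 12.0), (9.8, 10.1)] * 4
    print(round(site_cv(pairs), 2))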

Continued on Page 2
                                Ambient Air Protocol Gas Verification Completes Year 1
                                                      Annual Report Posted
                         The first year of the Ambient Air
                         Protocol Gas Verification Program
                         (AA-PGVP) has been completed
and an annual report produced,
which is posted on the AA-PGVP
AMTIC webpage at
http://www.epa.gov/ttn/amtic/aapgvp.html.
EPA would like to
                         thank the State/Local/Tribal com-
                         munity for their participation.
                         These organizations are acknowl-
                         edged in the report. In addition,
                         we thank EPA Region 2 (Avi Teitz,
                         Mustafa Mustafa) and Region 7
                         (Thien Bui) and their management
                         for supporting this endeavor and
                         making it a success. Some impor-
                               tant facts from the Report in-
                               clude:

                               For the 2010 AA-PGVP,  EPA
                               received surveys from 88 of a
                               possible 118 primary quality
                               assurance organizations/reporting
                               organizations (PQAO/ROs),
                               which is about a 75% response
                               rate.

                               Out of the 88 survey respon-
                               dents, EPA received 109 re-
                               sponses for specialty gas produc-
                               ers since some surveys listed
                               multiple specialty gas producers.
Figure 1 identifies, as a percent-
                                age of the total responses, how
                                often the PQAO/ROs listed a
                                particular specialty gas producer.
                                As mentioned above, only about
                                75% of the PQAO/ROs re-
                                sponded so this cannot be con-
                                sidered a complete survey.

                                Ten specialty gas producers
                                were identified in the survey.
                                EPA provided verifications to all
                                but two specialty gas producers.

                                Continued on Page 4

-------
PAGE 2
                                        2008-2010 PM2.5 Continuous Monitors (Continued from Page 1)
Table 1. Precision estimates (CV, %) by method designation pair: annual and 3-year averages, 2008-2010.

Data from the sites were then aggregated by method designation pairs and the per-site CVs averaged. Table 1 provides annual and 3-year estimates of each method code combination.
Figure 1. 2008-2010 PM2.5 Precision Estimates.
In Table 1, the "method" rows indicate each pair of methods being compared by AQS method codes; the method codes, concatenated with the sampler names, are listed below. Figure 1 was generated to show 3-year precision estimates (sorted by increasing imprecision) when the routine and collocated samplers had the same method code (Like MD), manual FRMs with unlike method designations (Unlike FRM), and manual FRMs compared to continuous instruments (FRM/FEM). Thus, there is a marker in Figure 1 for each pair of methods listed in Table 1. As the precision data indicate, most collocations with the same method codes and the unlike FRMs meet the 10% precision data quality objective (DQO) goal. The estimate with the highest imprecision (13.3%) of like method codes is a site with two collocated Met One BAM instruments. One site provided these data in 2008 and 2009 and two sites provided data in
2010. All FRM to FEM collocated comparisons were greater than the 10% DQO goal.

Bias Evaluation Summary

Data selection and preparation were the same as the procedure used for precision. The percent difference (PD) calculation was used: each percent difference measured at a site is averaged to calculate the site PD, and each site's average PD is then averaged for all sites/PQAOs with a particular method code combination. Table 2 provides annual and 3-year estimates of bias for each method code combination. Figure 2 was generated to show 3-year bias estimates for routine and collocated samplers with the same method code (Like MD), manual FRMs compared to unlike method designations (Unlike FRM), and manual FRMs compared to continuous instruments (FRM/FEM). All collocations of same method designations and collocations with unlike FRMs met the ± 10% DQO goal. All FEM/FRM collocations produced a positive bias (FEM high compared to FRM), with about 35% of the comparisons meeting the ± 10% DQO goal. Figure 3 (on page 3) is a repetition of the FEM/FRM bias estimates (green line) in Figure 2 with each FRM/FEM pair labeled for easier identification. A sketch of this aggregation is shown below.
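A minimal sketch of the aggregation just described, with the FRM treated as the reference in the percent difference; the names and data are illustrative, not the AQS routine.

    def percent_difference(frm, fem):
        # Signed percent difference of one collocated pair, FRM as reference.
        return 100.0 * (fem - frm) / frm

    def method_pair_bias(sites):
        # sites: one list of (frm, fem) 24-hr pairs per site.
        site_pds = []
        for pairs in sites:
            pds = [percent_difference(frm, fem) for frm, fem in pairs]
            site_pds.append(sum(pds) / len(pds))   # site-average PD
        return sum(site_pds) / len(site_pds)       # average across sites

    # A positive result means the FEM reads high relative to the FRM.
    print(method_pair_bias([[(10.0, 11.5), (8.0, 9.0)], [(12.0, 13.8)]]))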

Additional evaluation work has been performed by Tim Hanley of the Ambient Air Monitoring Group. Tim has evaluated the continuous FEMs against collocated FRMs run by the same monitoring agency using the performance criteria for acceptance of Class III FEMs. The Class III performance criteria are defined in 40 CFR Part 53 and are a different set of statistics than those used in Appendix A. Tim provided an assessment that was included in the PM2.5 docket and included the following remarks.

The lack of acceptable performance data from some FEMs as compared to collocated FRMs, on a 24-hour basis, calls into question the use of these continuous FEMs. For the PM2.5 primary standard, monitoring agencies have the option of continuing to use FRMs, or where applicable, using a well performing continuous FEM. The annual monitoring network plan (described in §58.10),

Continued on Page 3
Method                                             Method Des.
BGI PQ200/200A                                     116
BGI PQ200-VSCC or PQ200A-VSCC                      116
R & P Partisol-FRM 2000 PM-2.5                     117
R & P Partisol-FRM 2000 PM-2.5 [FEM]               117
Thermo Scientific Partisol 2000-FRM                117
R & P Partisol-Plus 2025 PM-2.5 Seq.               118
R & P Partisol-Plus 2025 PM-2.5 [FEM] Seq.         118
Thermo Partisol-Plus 2025 Sequential               118
Graseby Andersen RAAS2.5-100                       119
Thermo Electron RAAS2.5-100 FEM                    119
Graseby Andersen RAAS2.5-300                       120
Thermo Electron RAAS2.5-300 FEM                    120
BGI PQ200-VSCC or PQ200A-VSCC                      142
R & P Partisol-FRM 2000 PM-2.5 [FEM]               143
Thermo Scientific Partisol 2000-FRM                143
R & P Partisol 2000 PM-2.5 FEM Audit               144
R & P Partisol-Plus 2025 PM-2.5 [FEM] Seq.         145
Thermo Partisol-Plus 2025 Sequential               145
Thermo Electron RAAS2.5-100 FEM                    153
Thermo Electron RAAS2.5-300 FEM                    155
Met One BAM-1020 PM-2.5 [FEM]                      170
Thermo TEOM® 1400a with Series 8500C FDMS®         181
Thermo Scientific Model 5020 SHARP                 184
Table 2. Bias estimates (average percent difference, %) by method designation pair: annual and 3-year averages, 2008-2010.
Figure 2. 2008-2010 PM2.5 Bias Estimates, by method designation pair.

-------
ISSUE 11
                                                                                                         PAGE  3
2008-2010 PM2.5 Continuous Monitors (Continued from Page 1)
due to the applicable EPA Regional Office by July 1 of each year, is the appropriate place for monitoring agencies to identify the methods and sampling frequencies it will operate in its network.
In cases where a PM2.5 continuous FEM is not meeting the Part 53 performance criteria, we recommend keeping the PM2.5 FRM as the primary monitor while working towards improvements in FEM data quality. For those agencies with well performing PM2.5 continuous FEMs, we support the use of these instruments in the agencies' networks.

Tim's memo is available at: http://www.epa.gov/ttn/naaqs/standards/pm/data/HanleyandReff040711.pdf.
Tim has also engaged the vendors in conference calls in order to determine if there are any helpful hints to improve the monitors' operations, any additional checks that can help identify malfunctioning instruments, and services to help monitoring organizations troubleshoot their instruments. In the meantime, please pay particular attention to your FEMs. If you have had success with your instruments and you've done something not currently in an operations manual that has improved your system, let Tim Hanley know (hanley.tim@epa.gov).

Special thanks goes out to Rhonda Thompson and Adam Reff from the Air Quality Analysis Group for their help in this and Tim Hanley's evaluations.

Figure 3. 2008-2010 Bias Estimates of FRM to FEM, with each FRM/FEM pair labeled.
                      Air  Toxics  Meeting Has  Good Turnout
   EPA / OAQPS held an Air Toxics
   Monitoring and Data Analysis
Workshop, April 4-7, 2011, at the US EPA Region 6 Headquarters in Dallas, TX. The workshop agenda
   and presentations are available at
   http://www.epa.gov/ttn/amtic/
toxmeet.html. Nearly 100 air quality professionals from US EPA Program and Regional Offices; State, Local, Tribal, and nonprofit agencies; and academia participated in the workshop, providing a refreshing breadth of perspectives. In addition
   to some very informative presenta-
   tions on a variety of monitoring and
   data analysis topics, there were
panel discussions on several topics, including EPA's response to air
   quality concerns associated with the
   BP oil spill, fugitive emissions from
   oil and gas field operations, and
   ambient Hg monitoring.  However,
the aspect of this workshop that enhanced the communication and productivity beyond "the norm" was the brainstorming sessions to discuss what's worked or hasn't worked in air toxics, what have we been doing that's no longer needed, what haven't we been doing that is needed, what are the
       program strengths and weaknesses,
       obstacles, suggested improvements,
       etc.  To begin the process, a plenary
       brainstorming session was held as the
       last session on Tuesday, April 5th.
       Following this plenary discussion, a
       handful of EPA folks worked to identify
       up to four themes into which these
       specific suggestions were segregated;
       among the topic areas were technical
       concerns, working with community
       groups, and programmatic issues.
       Breakout sessions were held during the
       last session on Wednesday, April 6th,
       during which  time the themes and spe-
       cific topics were discussed. Recom-
       mendations were delineated by each
 group and presented by a spokesper-
 son from each of the breakout
 groups on the morning of Thursday,
 April 7th.  During the workshop, a
 presentation was made on the QA
 aspects of the National Air Toxics
 Trends Stations (NATTS).  The pre-
 senter outlined the Data Quality
 Objectives of the program and illus-
 trated the QA data (precision, bias,
 completeness and detectability) of
 the data from the inception of the
 program in 2004 through 2009.
 Questions can be directed to Dennis
 Mikel at mikel.dennisk@epa.gov.
 There were several "projects" that
 were promising in terms of both
 benefit and feasibility to complete
 within a reasonable timeframe (i.e.,
 one year). Further detail regarding
 outcomes from these discussions will
 likewise be posted, as available, at
 the AMTIC website identified above.

-------
    PAGE  4
                        Ambient Air Protocol Gas (Continued from Page  1)
Figure 1. PQAO/RO specialty gas producer use (109 responses). Producers listed: Air Gas, Air Liquide, Matheson Tri-Gas, American Gas Group, Red Ball, Scott-Marrin, Praxair, Linde, Liquid Technology, and Specialty Air Technologies.
The two gas producers, Red Ball and Linde, that were not verified were only providing standards to one PQAO/RO survey respondent each. They did submit cylinders in the first quarter of 2011.

Table 1 provides the final tally for the verifications occurring each quarter. Some cylinders were multi-pollutant, which is why the pollutant total is different from the cylinder total.

As indicated in 40 CFR Part 75 Appendix A, EPA Protocol Gases must have a certified uncertainty (95 percent confidence interval) that must not be greater than plus or minus (±) 2.0 percent of the certified concentration (tag value) of the gas mixture. However, this acceptance criterion is for the Acid Rain Program. The AA-PGVP adopted the criterion as its data quality objective and developed a quality system to allow the RAVLs to determine whether or not an individual protocol gas standard concentration was within ± 2% of the certified value. The Ambient Air Program has never identified an acceptance criterion for the protocol gases.
Table 1. Cylinders and Pollutants Analyzed by RAVL by Quarter, CY2010.

Region   Quarter 2              Quarter 3              Quarter 4              Total CY2010
         Cylinders  Pollutants  Cylinders  Pollutants  Cylinders  Pollutants  Cylinders  Pollutants
2        4          6           6          12          0          0           10         18
7        6          10          4          5           4          9           14         24
Since the AA-PGVP has not been established to provide a statistically rigorous assessment of any specialty gas producer, the RAVLs report all valid results as analyzed, but it is suggested that any difference greater than 4-5% is cause for concern.
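As a sketch, the check that each RAVL result implies can be written as below; the function name and example values are illustrative, not part of the program's quality system.

    def verify_cylinder(tag_ppm, assayed_ppm):
        # Percent difference of the assayed value from the tag value.
        diff_pct = 100.0 * (assayed_ppm - tag_ppm) / tag_ppm
        if abs(diff_pct) <= 2.0:
            status = "within the 2% Acid Rain criterion"
        elif abs(diff_pct) <= 5.0:
            status = "within the 4-5% concern level"
        else:
            status = "cause for concern"
        return diff_pct, status

    print(verify_cylinder(tag_ppm=45.0, assayed_ppm=45.6))  # about +1.3%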

Results show that of the 42 standards that were verified, 41 were within the ± 4-5%  AA-PGVP criteria, and 39 (92%) were within
the ± 2% Acid  Rain Program criteria. One result did not meet our criteria.

Survey Improvement

We did not get 100% completeness on surveys. In order to correct this, EPA developed a web-based survey. This survey has a point of contact email address for all 118 PQAO/ROs. The survey lists the ten 2010 specialty gas producers along with their multiple production facilities. The point of contact must select one of those facilities (or multiples) from the pick list or add a new production facility. If a new facility is added, EPA will ensure it is a legitimate production facility (not a distributor) and will add it to the pick list for other points of contact to use. Every two weeks, EPA will determine which points of contact have not completed the survey and send a reminder email indicating that the survey has not been completed. EPA hopes this will inspire all PQAO/ROs to complete the survey. We need the monitoring organizations' help in completing this survey.

Participation Improvement

Since the program is voluntary, EPA cannot require participation. We hope that the PQAO/ROs will see the benefit of an independent verification of their cylinders and that we will get at least 10 cylinders per RAVL per quarter. PQAO/ROs did have difficulties with some shippers (in particular UPS) in the transport of these cylinders to the RAVL. In some cases cylinders were never shipped due to these difficulties. EPA has worked with UPS to develop a set of shipping instructions that may help the PQAO/ROs in the future.

Verification of Each Production Facility

Since the intent of the AA-PGVP is to be a blind verification, meaning the gas standard used for the verification is unknown to the producer, we rely on the PQAO/ROs for participation. However, with some specialty gas producers being used by only a few PQAO/ROs, EPA will inform those specialty gas producers earlier in the year that they may want to provide the RAVL with a gas standard. At a minimum, EPA will make sure there is capacity in the last verification quarter for those production facilities to send the RAVL a gas standard when a standard representing that producer has not been sent by a PQAO/RO.

-------
            PAGE 5
Training personnel involved in NPAP through-the-probe audits.
                    Performance Evaluation Training Completed for Another Year
With all the uncertainty surrounding government shutdowns, OAQPS managed to get the National Performance Audit Program (NPAP) and PM2.5 and Pb Performance Evaluation Program (PEP) training and certifications accomplished the week of April 18.

Dennis Crumpler started out the week with PM2.5 and Pb PEP training. Dennis had provided three webinars for PM2.5 and two for Pb prior to the actual hands-on training. These webinars have been very cost-effective in providing training on the areas that do not require sampler set-up, verifications, and sample retrieval. The webinars save close to a day and a half of travel from each pollutant training course and allowed us to complete all three program training/certification/recertifications in one week. Dennis had about 40 personnel at the PM2.5 and Pb sessions. Most were being recertified, but we did have about 5 new auditors that included State, Local and Tribal monitoring personnel.

Mark Shanis conducted NPAP
through-the-probe training/
certification/recertifications on
Thursday and Friday of the week.
Mark had two regular-range case-based systems (one seen in the picture), two regular-range truck-based NPAP TTP systems, one trailer-based system and one trace-level case-based system on hand.
Similar to the PEP, Mark had three webinar sessions, prior to the training session, to go over the program details that did not require hands-on implementation.

Chris St. Germain (EPA Region 1) checking out a case-based NPAP system.

Training would not have been
successful without the help of
those in the EPA Regions who
assisted with the implementa-
tion. Thanks goes out to Greg
Noah from Region 4,  Thien Bui,
James Regehr and Lorenzo Sena
from Region 7, and Chris St. Germain from Region 1. Thanks
also to Solomon Ricks from
OAQPS who helped out during
 the week and to  RTI's Jeff
 Nichols and Jenney Lloyd for
 assisting in the training activi-
 ties.

 Over the past year we have
 been getting inquiries from
 contractors needing to be
 trained in these performance
 evaluations to implement them
 at PSD sites. OAQPS will be
 providing more detailed infor-
 mation in the future in order to
inform contractors early enough
to attend both  seminars and
hands-on certification.
Corrections to PM2.5 Method 2.12

A set of very sharp eyes caught a discrepancy in Method 2.12. The error is in Section 13.2, the flow rate audit. The current version of the method lists the equation for "AD (%)" using the sampler's indicated flow rate. The formula should be:

    AD (%) = 100 x (Qaudit - 16.67) / 16.67

where Qaudit is the sampler's flow rate as measured by the audit device. This change means that the flow rate of the sampler measured by the audit device must be within 5% of the 16.67 L/min design flow rate. The text within the section was correct, but the incorrect equation used the sampler's indicated flow rate versus what would be considered the true flow rate from the audit device.
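A quick numeric check of the corrected statistic; the variable names and the 17.1 L/min reading are illustrative.

    DESIGN_FLOW = 16.67  # L/min, PM2.5 sampler design flow rate

    def audit_difference(audit_flow_lpm):
        # AD (%) relative to the design flow, per the corrected equation.
        return 100.0 * (audit_flow_lpm - DESIGN_FLOW) / DESIGN_FLOW

    ad = audit_difference(17.1)
    print(f"AD = {ad:+.1f}% -> {'pass' if abs(ad) <= 5.0 else 'fail'}")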

-------
ISSUE 11
                                                                                                              PAGE  6
Operating Ranges, Calibration Ranges, Zero, Span, Precision Checks and Flexibility... One Man's Opinion
Recent QA EYE articles (see News-
letter # 10) and technical memos
have provided for the expansion of
the gaseous criteria pollutant per-
formance evaluation audit levels
from 5 to  10 and the allowance of a
new statistic to evaluate the lower
two levels. Our recent guidance has
been based on the objective of hav-
ing the estimates of precision and
bias reflect the precision and bias of
the routine concentrations.  Much of
the data we see for both the per-
formance evaluations and the one point
precision checks are at much higher
concentrations than the routine data.
This could provide a false sense of the
precision and bias of the  routine data in
AQS. Why has this occurred and what
can we do to change it?
When the ambient air QA
regulations and guidance
were initially promulgated
we had higher routine con-
centrations, different meth-
ods, different and less sensi-
tive monitoring and calibra-
tion technologies and a dif-
ferent quality of gas stan-
dards. All of the technologi-
cal change has been for the
better and should allow us
to be precise and unbiased at lower
concentration ranges. In addition, older
guidance may have suggested that moni-
tors had to be operated and calibrated
at one of the ranges for which they
were approved. Our current thinking
here in OAQPS is that this is not the
case. Figure 1 represents how many monitoring organizations conceive of the QC requirements for gaseous monitoring. The data shown are 3 years
of ozone data for a PQAO. The moni-
toring organization has selected the 0-
500 ppb operating range. They calibrate
using 4 upscale points with the highest point at 450 ppb. The span check, based on guidance of 80-90% of the operating range, is at 400 ppb. The one-point QC check, based on the requirement of 10-100 ppb, is at 90 ppb, and the routine data 3-year average is about 40 ppb.

Figure 1. Ozone example: routine data, 1-point QC data, span check, and operating range.

During the 2008 revision of the QA Handbook we had comments from monitoring organizations to change guidance related to the span check. Based on those recommendations, we included the following language in Section 7.

Figure 2. Alternative quality control procedure. Before: routine data, 1-point QC data, span check, operating range. After: routine data, 1-point QC data, span check (120 ppb), measurement range (150 ppb).
A span check concentration should be selected that is more beneficial to the quality control of the routine data at the site, and EPA suggests: 1) the selection of an appropriate measurement range, and 2) selecting a span that, at a minimum, is above 120% of the highest NAAQS (for sites used for designation purposes) and above the 99th percentile of the routine data over a 3-year period. The multi-point
verification/calibrations that are performed at
a minimum annually can be used to chal-
lenge the instrument and confirm linearity
and calibration slope of the selected operat-
ing range.

This guidance provides the concept of
selecting both an appropriate measure-
ment range (other than the operating
range) for calibrating the instrument and
developing the appropriate quality control
procedures. Figure 2 might be considered
a new approach where a measurement
range of 150 ppb is established and cali-
bration points selected within that meas-
urement range. Then both the span and the 1-point QC check concentration can be lowered.

Some monitoring organizations have been hesitant to lower the 1-point QC, suggesting the higher concentration can be used to reflect the quality of the data around the NAAQS. This is a legitimate rationale, but the span check in the alternate procedure can be used to that effect, and the 1-point QC check can then be used to represent the precision and bias of the routine concentration. So
in summary, we think that monitoring organizations have flexibility to choose the appropriate instrument measurement range, calibration points, span check and 1-point QC relative to the concentrations
they measure at their sites. Should the
high end of the calibration range be above
the NAAQS levels?  Yes, in order to be
protective of the NAAQS and any natural
or man-made pollution events that might
occur.  However, there is no reason to
base the QC concentrations on the FRM/
FEM designated operating ranges of the
instrument.

Continued on page 7

-------
ISSUE 11
                                                                                                                   PAGE  7
           Operating Ranges.... One Man's  Opinion (Continued from page 6)
As an example, the monitoring organization in Figure 1 might use the procedure in Figure 3 to select the appropriate QC ranges. The 1-point QC concentration could be selected at the same concentration as their average routine value. From the 2/17/2011 assessment report that was posted on AMTIC, the 1-point QC concentrations at the low end are currently being achieved by monitoring organizations. The QA Handbook that's currently undergoing revision can be expanded to reflect this guidance. Just one man's opinion.
Figure 3. Possible new QA Handbook guidance: a new procedure for selecting operating ranges and QC checks.

1.  Take the 5-year 8-hour or 1-hour max value.
2.  Multiply the value by 1.5; that's the measurement range.
3.  If the calculation in step 2 is below the NAAQS, use 1.5x the NAAQS (if the site is used for regulatory purposes).
4.  Take 80% of the new operating range; that's your span check value. The span check can now serve as a check around the NAAQS.
5.  Use the current CFR and routine data to select the 1-point QC check concentration.
Figure 1 case: assume 101 ppb is the 5-year, 8-hour max. The procedure yields a measurement range of about 150 ppb and a span check of about 120 ppb, with the routine data well below both. A short sketch of this arithmetic follows.
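The sketch below walks the Figure 3 steps for this case; the 75 ppb ozone NAAQS used for the floor in step 3 is our assumption for the example, and the names are illustrative.

    def select_qc_ranges(max_5yr_ppb, naaqs_ppb=75, regulatory=True):
        meas_range = 1.5 * max_5yr_ppb              # step 2
        if regulatory and meas_range < naaqs_ppb:   # step 3: NAAQS floor
            meas_range = 1.5 * naaqs_ppb
        span = 0.80 * meas_range                    # step 4: span check
        return meas_range, span

    meas, span = select_qc_ranges(101)
    # 151.5 and 121.2 ppb, i.e. roughly the 150/120 ppb of the figure;
    # step 5 then picks the 1-point QC from the CFR window (10-100 ppb)
    # near the routine 3-year average (about 40 ppb in this example).
    print(meas, span)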
From the 2/17/2011 assessment report: average absolute percent difference by 1-point QC audit level (concentration bins from 4.0-5.9 ppb up to 190.0-259 ppb). Audits at 20-39 ppb are achieving DQOs.
NOTE (A Regional Perspective)
While the above article may represent "one man's opinion," at least one of the EPA Regional Offices has been separately thinking about the importance of the 1-point QC check and setting proper instrument ranges on the various pieces of monitoring equipment. One would think that there would be little value in calibrating a police speed radar gun up over 500 mph when they'd never "see speeds" that high. And if you are routinely seeing low concentrations of a pollutant, a QC check should be used to ensure the data you are collecting has meaning and represents the true value of what you are seeing in the environment. As monitoring measurement equipment improves, and ambient concentrations are reduced, we should continue to challenge the equipment appropriately.
Pb Audit Strips Developed for 17 Pb Labs in 2011 - Call Will Go Out in July for 2012 Strips
  In October 2010, EPA contacted the
  monitoring organizations sampling for Pb
  and asked whether the organizations
  wanted EPA to develop Pb analysis audits
  (Pb audit strips) for the upcoming year.
  Pb analysis audit strips are required in 40
  CFR Part 58 Appendix A Section 3.3.4.2.
  We received orders from 17 monitoring
organizations (about 45% of those polled) for the audits needed for 2011.
  RTI, EPA's QA contractor, completed
  development of the audit strips in
  December and sent out three sets to
  the referee labs: Region 9 Pb PEP Lab,
  the Region 7 Air Monitoring Lab, and the
Office of Radiation and Indoor Air. The labs tested 7 strips at each concentration level and had to be within +/- 5 percent relative standard deviation of each lab's average determined value, and the average concentration for each range had to be within 7% of the contractor's (RTI) established concentration. All three referee laboratories' results met the acceptance criteria. Since some monitoring organizations' laboratory management are withholding concentrations of the audits from lab staff, OAQPS will publish the results of the referee analysis after the 2011 sampling/analysis season.
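A short sketch of the referee acceptance test just described; the function name and strip data are illustrative.

    from statistics import mean, stdev

    def referee_ok(strip_results, rti_value):
        # 7 strips per level: RSD within 5%, lab mean within 7% of RTI value.
        rsd = 100.0 * stdev(strip_results) / mean(strip_results)
        bias = 100.0 * abs(mean(strip_results) - rti_value) / rti_value
        return rsd <= 5.0 and bias <= 7.0

    strips = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7]  # one concentration level
    print(referee_ok(strips, rti_value=10.0))          # True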
We have had interest from laboratories in the development of these strips next year. We plan on developing a memo or including this information in the next NPAP/PEP self-implementation decision memo that is usually distributed in July each year. STAG funds would be required for the development of these strips. Based on last year's cost, we estimate the cost to be about $300 for a year's set of 24 strips (12 per concentration).
                    When  it Comes to  8x10 Filters, Do  Sides  Matter?
We received a question about whether there was a correct side for the 8x10 high-volume PM10 filters. In reviewing Method 2.11 (1997), Section 3.3.1 did distinguish an "up" side as a side with slightly rougher texture. However, more recent discussions with Whatman (now part of GE Healthcare) described the filters as "bi-directional"; the unique ID numbers could be stamped on either side during manufacture. We suggest placing the filter ID side down during sampling so when the filter is folded for transport the ID number can be seen.

-------
             PAGE  8
                             Sunset  Carbon Instrument Under Evaluation

Beth performing calibrations on the Sunset.

Elizabeth (Beth) Oswald is the Ambient Air Monitoring Group's most recent hire. She comes to OAQPS by way of EPA Region 4, where she was in a rotational intern program that had her eventually gravitate towards ambient air monitoring. In November, she came to OAQPS and we've since loaded her down with a number of important projects. Welcome Beth!

One such project Beth is leading (with the help of Joann Rice and Dave Shelow) is being conducted out at our Ambient Air Innovation and Research Site (AIRS) in Research Triangle Park (RTP), North Carolina. One of the major areas of interest in air monitoring is continuous monitoring technologies that could potentially lead to a reduction in filter based technologies. In the CSN alone, over 180 sites are collecting 24-hr filter based samples that are analyzed for mass, trace elements, major ions, and organic carbon/elemental carbon (OC/EC). OC/EC samples are collected on quartz filters every third or sixth day and shipped to Research Triangle Institute (RTI) for analysis. The cost of sample preparation, shipping and analysis for the carbon network is approximately $2M per year. In an effort to move towards continuous, higher time resolution sampling and reduce the need for expensive, time consuming, filter based sampling, AAMG purchased eight Sunset Semi-Continuous OC/EC instruments for future deployment to monitoring agencies and is evaluating two of them at our research site.

The key or critical parameters to
be collected are thermal OC/EC,
optical EC, and optical BC. The
following equipment will be in-
stalled and operated at the AIRS
Monitoring Site:
• Two (2) Semi-Continuous OC/EC Instruments (Sunset Model 4) - thermal OC/EC and optical EC;
• One (1) Sequential Particulate Speciation Sampler (URG 3000N) - thermal OC/EC; and
• One (1) Aethalometer (Magee Scientific AE-21) - optical BC.

The primary study objectives are
to gain an understanding of how
the Sunset instrument works
(routine operation and mainte-
nance); determine how to optimize
operation through various experi-
ments; develop a SOP for the in-
strument; establish precision and
detection limits; and determine
how well the Sunset compares
with the URG 3000N and the
Aethalometer.

As a secondary objective, the in-
formation from the study will be
used, to the extent possible, to
inform and gain insight regarding
the questions below.
• What are the space considera-
  tions for operating the Sunset
  analyzer in a shelter?
• Do any special considerations
  need to be made for control of
  shelter temperature and rela-
  tive humidity?
• What interferences exist that
  may be problematic for imple-
  mentation?
• What important parameters
  should be tracked or docu-
  mented for QC purposes (e.g.
  laser correction value, oven
  temperature, pressures, etc.)?
• How should the data be vali-
  dated?
• What is the Sunset instrument
  data capture rate?
• What is the ideal sample col-
  lection period for a rural sam-
  pling location similar to AIRS?
• How often should sucrose
  standard injections be per-
  formed?
• What is the typical value of
  nightly blanks?
• What type of denuder (parallel
  plate or carbon monolith) re-
  moves organic vapors more
  efficiently and is more practical
  to use?
• What are the capital costs,
  including additional equipment,
  for operating each analyzer?
• What level of effort and train-
  ing are necessary for routine
  operation?
                         Work continues on NOy NPAP Audits.. Success at Low Levels Being Achieved
                        Mark Shanis, OAQPS NPAP Lead, has
                        been working for the past year or
                        two on the low level NPAP audits
                        needed for the NCore network. He
                        has been working with the contractor
                        Keith Kronmiller at our Ambient Air
                                   Innovation and Research Site (AIRS)
                                   to develop the TTP systems that can
                                   provide reliable low level concentra-
                                   tions for these audits. Region 4 and
                                   now Region 3 NPAP TTP auditors
                                   have used the RTP CO and SO2
Trace Level audit system and procedures. Region 4 will take the NOy equipment and procedures into the field starting this summer.
                                  Continued on Page 9

-------
ISSUE 11
                                                                                                                  PAGE  9
Work continues on NOy NPAP Audits..  (Continued from Page  8)

Recent work with NOy has successfully achieved audit levels 1 and 2 with the new 1.5 ppb difference statistic. Our tests have used an Environics 9100 calibrator with a third, lower mass flow controller (20 cc/min), a higher dilution mass flow controller (30 lpm), and an improved ozone generation system that accommodates the low ozone levels needed for GPT at the lowest NOy or NO2 audit levels. We have obtained an improved API calibrator with a lower MFC and an ozone generator that now accommodates the lower ozone concentrations needed for low level GPT. OAQPS also got an improved zero air generator that has been recently redesigned to have a 30 lpm maximum flow capacity, improved water vapor removal capacity, water vapor measurement capacity, and adsorbent regeneration cycles.
To accomplish the lower 5 levels of the 10-level audit ranges and the ability to generate all 3 trace-level NCore gases that are generated from compressed gas cylinders, we have found it necessary to use 2 blended gas (BG) cylinders. Agency operators may not need to do all the levels in the table, may not need to be able to do all 3 of the trace-level, non-ozone gases at the same time, and may not have the MFC and zero air capacity that we have here in RTP. We have them so we can demonstrate what can be done. It is up to the agencies and their Regions to decide what they need to do.

At this point, we are going to go in the field with one or both of the following BG cylinders: 680 ppm CO, 60 ppm NO, and 16 ppm SO2 (high BG cylinder); and/or 24 ppm CO, 1 ppm NO, and 1 ppm SO2 (low BG cylinder). As the TTP audit program (see our SOP on AMTIC) relies on calibrated analyzers in the field, and because we use CO as the most stable gas to do calibrations with, especially at the normal SLAMS ranges, we also bring a high and a low span CO cylinder for our audits. We are using a little under 5 ppm as our high span, for typical 0-5 ppm full scale range analyzers, and 0.50 ppm CO for our low span.
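For reference, the dilution arithmetic behind these audit levels is straightforward. The sketch below uses the low BG cylinder concentration from the article, but the flow settings are illustrative, not the NPAP values.

    def diluted_ppb(cyl_ppm, gas_sccm, dilution_slpm):
        # Concentration after blending cylinder gas into zero air.
        total_sccm = gas_sccm + dilution_slpm * 1000.0
        return cyl_ppm * 1000.0 * gas_sccm / total_sccm

    # 1 ppm NO through a 20 cc/min MFC into 20 L/min of zero air
    # gives roughly a 1 ppb NO audit concentration.
    print(round(diluted_ppb(cyl_ppm=1.0, gas_sccm=20.0, dilution_slpm=20.0), 3))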

In order to address concerns that could arise about the accuracy and reliability of GPT for generating NO2 or NOy at low concentrations, we have some additional cylinders for independent checks on our GPT values at the low levels. For this we use a 5 ppm NO2 cylinder and a special blend of 200 ppm CO and 1 ppm NPN. NPN is used for checking the NOy converter efficiency, as it is more stable than NO or NO2 and easier to deal with than nitric acid (an NOy species of concern); it is also used to produce our lowest audit concentrations and to check MDLs. The CO addition allows us to independently ensure that our calibrator is still working as expected by indicating that certain CO points have not changed.

Mark provided some information on the earlier work at RTP on NOy in QA EYE Newsletter #10. He presented a summary of the latest RTP NOy work and answered many questions about that work at the recent NPAP training/certification/recertification on April 21st. We are currently writing up a standard operating procedure for the NOy TTP performance audit method for the NPAP operators. After an internal review and testing, we expect to have it completed and distributed on AMTIC by July.
                TAMS  Center  Providing  7-Session QA  101  Training
The Tribal Air Monitoring Support Center (TAMS) was created in 2001 with a mission to provide technical and data management support to the tribal air community. Over the 10 years of the TAMS' existence, they have done a superb job developing QA training activities and QA tools for the tribes. In addition to training, they provide technical assistance, equipment loans (including the school air toxics equipment), filter weighing services for PM2.5 filters and onsite support. They have developed tools like the Tribal Data Toolbox and Turbo-QAPP (see QA EYE Article #1). For those not familiar with TAMS, you can find out much more about their mission and programs on their website at: http://www4.nau.edu/tams/

Just recently, Melinda Ronca-Battista had the idea for a 7-week, 7-session webinar, starting April 7 through May 19th, to cover the quality assurance activities of the criteria pollutants. The course starts off explaining the reasons behind a quality assurance program, why they are needed, and the EPA requirements for them. It then proceeds into general explanations of the 40 CFR Appendix A requirements and ends up discussing more specific details of the QA/QC of the criteria pollutants. Melinda has kept the sessions lively with her own presentations but has invited experts from the tribal ambient air monitoring community as guest speakers to provide their wisdom and experience on these topics. Although specifically geared for the tribal monitoring community, attendance has included State and Local participants and EPA Regions. Completion certificates are available for those who complete short homework assignments for each session. So far the webinars have attracted about 50 participants per session, and the number seems to climb as more become aware of the webinars. Melinda is attempting to record these sessions so they can be made available to anyone missing the course or a particular session. PowerPoint files of the sessions are also posted on their website. Great job Melinda!
Melinda Ronca-Battista on a recent family vacation to Trieste, Italy. Instead of touring Italian architecture, she became intrigued with CO monitoring stations.

-------
ISSUE 11
                                                                                                               PAGE  10
   New Jigs  for Agencies to Regenerate Their URG Audit and  Verification  Cartridges
 We are pleased to announce the arrival
 and availability of a "jig" that agencies
 and auditors may use to load URG
 3000N performance verification and
 audit cartridges with new filters.
 Through the efforts of Jeff Lantz at EPA
 ORIA-Las Vegas, EPA had these manu-
 factured at no cost to the monitoring
 agencies.  The first distribution will be
 made to every agency that operates a
 URG 3000N sampler, unless they al-
 ready have a jig that they are satisfied
 with. Since these instruments will see
 infrequent use we could not justify the
 purchase of one for every site, and in
 fact hope that each "jig" will serve all
 operators and auditors  of the agency to
 which it is deployed.  Please contact
Solomon Ricks (ricks.solomon@epa.gov; 919-541-5242) if your agency has a special need for two "jigs." The supply of extra "jigs" is limited.
We have also purchased Pallflex quartz filters from RTI International (out of the current production batch) to distribute with the "jigs." Typically, cassette position
1 will be the only one used for a verification or audit. The first distribution should
allow  for changing the filter every quarter
for both verifications and audits. The op-
erator or auditor should inspect the filter
prior  to every use; if it is dirty, torn or
dislodged from the cassette, it should  be
replaced. Send  an e-mail to Solomon
Ricks  if additional filters are ever needed.
Otherwise we will annually ship-out a set
for each verification and  audit cartridge in
use by each agency.
If you would like a "jig" please send an e-
mail to Solomon Ricks and courtesy copy
crumpler.dennis@epa.gov. Put "Send
Cassette Jig" in  the subject line.  We need
a shipping address for your agency, a con-
tact person, an e-mail address and a tele-
phone number.
Electronic Entry of State, Local or Tribal Collocated  Pb Sampling COC/FDS Forms
State, Local and Tribal agencies that monitor for ambient lead (Pb) are required to collect either 4 or 6 collocated filter samples, depending on whether their network consists of ≤ 5 or > 5 Pb monitoring sites, respectively. The filters from the collocated samplers are to be submitted to EPA's Region 9 Analytical Support Laboratory in Richmond, California, for Pb analysis. The migration of data from the Region 9 lab to AQS has been slower than we would like.

A hard-copy, combined chain-of-custody and field data sheet (COC/FDS) is initiated for each collocated filter by its sponsoring agency. The COC/FDS accompanies the filter at all activity steps along its lifespan. A scan of the COC/FDS along with the mass per filter data is transferred to RTI International, who provides support to EPA's Performance Evaluation Program for the national Pb monitoring network, for validation and reporting to AQS. The data on the FDS is critical to an accurate determination of the ambient concentration measured by the collocated sampler during the appointed sampling event. Unfortunately, the data from scanned forms must be entered by hand at RTI, which is costly, is subject to errors in translation, and can increase the need to follow up with agency personnel.
To expedite the data transfer, reduce errors, and thereby reduce the cost of the data management process, RTI was commissioned to create a website with an online COC/FDS that can be filled out by the agency operator. We are requesting that anyone who collects the collocated Pb filter samples and completes the hard-copy COC/FDS register on the website and subsequently complete the COC/FDS on-line. To register, go to https://airqa.rti.org/. It may take 24 hours to have your account activated. Send an e-mail to Ed Rickman at RTI, eer@rti.org, and identify yourself as a Pb collocated sample provider. To electronically report your COC/FDS:

1.  Go to the website
2.  Log on
3.  Click on the "Pb-Performance Evaluation Program"
4.  Click on "5) State Collocated Chain-of-Custody Form and Field Data Sheet"
5.  Fill out all the sections that are highlighted in red.
6.  If you have any problems or questions, click on "Contact Us" or e-mail Jennifer Lloyd: jml@rti.org

-------
EPA Office of Air Quality
Planning and Standards

 EPA-OAQPS
 C304-02
RTP, NC 27711

E-mail: papp.michael@epa.gov
                The Office of Air Quality Planning and Standards is
                dedicated to developing a quality system to ensure that
                the Nation's ambient air data is of appropriate quality
                for informed decision making. We realize that it is only
                through the efforts of our EPA partners and the moni-
                toring organizations that this data quality goal will be
                met. This newsletter is intended to provide up-to-date
                communications on changes or improvements to our
                quality system.  Please pass a copy of this along to your
                peers and e-mail us with any issues you'd like discussed.

                Mike Papp
   Important  People and Websites
Since 1998, the OAQPS QA Team has been working with the Office of Radiation and Indoor Air in Montgomery and Las Vegas and ORD in order to accomplish its QA mission. The following personnel are listed by the major programs they implement. Since all are EPA employees, their e-mail address is: lastname.firstname@epa.gov.


    The EPA Regions are the  pri-
    mary contacts for the monitoring
    organizations and  should always
    be informed of QA issues.
Program                                              Person             Affiliation
STN/IMPROVE Lab Performance Evaluations              Eric Bozwell       ORIA-Montgomery
Tribal Air Monitoring                                Emilio Braganza    ORIA-LV
Statistics, DQOs, DQA, precision and bias            Rhonda Thompson    OAQPS
Speciation Trends Network QA Lead                    Dennis Crumpler    OAQPS
OAQPS QA Manager                                     Joe Elkins         OAQPS
PAMS & NATTS Cylinder Recertifications               Suzanne Beimer     ORIA-LV
Standard Reference Photometer Lead                   Scott Moore        ORD-APPCD
Speciation Trends Network/IMPROVE Field Audits       Jeff Lantz         ORIA-LV
National Air Toxics Trend Sites QA Lead              Dennis Mikel       OAQPS
Criteria Pollutant QA Lead                           Mike Papp          OAQPS
NPAP Lead                                            Mark Shanis        OAQPS
PM2.5 PEP Lead                                       Dennis Crumpler    OAQPS
STN/IMPROVE Lab PE/TSA/Special Studies               Jewell Smiley      ORIA-Montgomery
STN/IMPROVE Lab PE/TSA/Special Studies               Steve Taylor       ORIA-Montgomery
Websites

Website             URL                                                 Description
EPA Quality Staff   http://www.epa.gov/quality/                         Overall EPA QA policy and guidance
AMTIC               http://www.epa.gov/ttn/amtic/                       Ambient air monitoring and QA
AMTIC QA Page       http://www.epa.gov/ttn/amtic/quality.html           Direct access to QA programs
Contacts            http://www.epa.gov/ttn/amtic/amtic contacts.html    Headquarters and Regional contacts

-------