The QA EYE

ISSUE 18

JULY, 2015

OFFICE OF AIR
QUALITY PLANNING
AND STANDARDS

SPECIAL POINTS
OF INTEREST:

•	RA and RP transactions gone; QA transactions are now the requirement

•	Guidance being revised for the QA Handbook and PM2.5 Method 2.12, and developed for TSAs and Electronic Logbooks

INSIDE THIS
ISSUE:

TSA Guidance Development	1

QA Handbook and Method 2.12 Revisions	1

Village Green Monitoring Platforms	2

One-Point QC Check Proposal Causes Comments and Response	4

Automated Data Certification Completed for 2014	7

NPAP and PEP get "LEAN"	8

Author Acknowledgments	9

Annual Box and Whisker Plots coming in July	10

Electronic Logbook Guidance Coming	10

AQS QA Transactions and Enhancements	10

NATTS Collocated, Duplicate, Replicate QC	11

PM2.5 Collocation Requirements	12

NOy Update	13

Joe Delwiche Remembered	14

EPA Regions Working Together to Develop Technical Systems Audit Guidance

A Technical Systems Audit (TSA) is an on-site review and inspection of a monitoring organization's ambient air monitoring program to assess its compliance with established regulations governing the collection, analysis, validation, and reporting of ambient air quality data.

During the revision of the QA Handbook Volume II in 2008, the EPA Regions got together and revised the TSA Checklist that can be found in Appendix H. It was not revised in the 2013 Handbook revision. EPA has received some comments from monitoring organizations that the Regions' approaches to conducting TSAs, both what they assess and what they consider findings, are not consistent. Recent TSA data quality findings have affected NAAQS decisions. In light of this, the Regions have begun sharing their TSA reports, as well as their auditing practices and assessment techniques. This dialogue has led to the Regions and OAQPS working together to develop a TSA guidance document.

The intent of this new document is to assist auditors in understanding the TSA requirements and to provide guidance and tools to aid in conducting TSAs of ambient air monitoring programs. While the document is geared primarily toward federal auditors conducting TSAs of monitoring programs required by the CFR, the principles and tools it provides could serve as a framework for any auditor performing a TSA of any ambient air monitoring network. The guidance document is intended to present "best practices" that, if implemented and followed, would result in the best assessment of a monitoring organization's ambient air monitoring program.

The document will not replace the TSA Checklist, but rather supplement it with other techniques and guidance. The Workgroup formed in March 2015 and has been having calls every three weeks. The goal is to complete the new guidance by winter of 2015.

QA Handbook Vol II and PM2.5 Method 2.12 being Reviewed and Revised

OAQPS is in the process of reviewing and revising both the QA Handbook for Air Pollution Measurement Systems Vol II and the Quality Assurance Guidance Document 2.12, Monitoring PM2.5 in Ambient Air Using Designated Reference or Class I Equivalent Methods.

QA Handbook

Since the last revision in 2013, OAQPS has been collecting comments for revisions and additions to the Handbook. The QA Handbook Revision Workgroup has had a number of meetings to discuss these comments. Since many monitoring organizations utilize the validation templates, which have a tendency to change more often than the Handbook, we've placed the Validation Templates on AMTIC at http://www.epa.gov/ttnamti1/qalist.html and included a table that tracks changes made to the template. The next version of the Handbook will include validation templates for NCore, NOy and direct NO2 monitoring and newer PM2.5 continuous methods.

Method 2.12

The PM2.5 method has not been revised since the original version published in 1998.

(Continued on page 3)


Village Green Monitoring Stations Are Popping Up Across the Country

Through recent near-source air monitoring field studies, such as field campaigns monitoring air pollution adjacent to highways, it is understood that air pollutants can significantly vary with time and space in an urban environment. Current regulatory monitoring stations provide information on regional air pollution levels. However, they are usually insufficient in number to address research questions on local-scale air pollutant trends.

Researchers wanting to evaluate local-scale air quality trends currently balance the desire for spatial information with the desire for temporal information, with cost as a practical limiting factor. A key technology gap is lower-cost air pollution monitoring systems that would allow for longer-term sampling at a greater number of locations. Presently, the cost and complexity of implementing multiple traditional-style air monitoring stations often leads researchers to utilize mobile methods. In addition to the high cost of traditional-style air monitoring stations, siting in public environments is often quite challenging due to the large physical footprint, poor aesthetics, and lack of public engagement in the research.

This research conundrum has led ORD to develop the Village Green air pollution station (henceforth called the "VG station") to address this technology gap. The VG station was designed with the goals of providing real-time pollutant data for several measurements of interest (ozone and fine particles), being self-powered, having a smaller physical footprint, and providing aesthetics and public engagement elements that would expand siting options and augment EPA outreach efforts. In addition, a key goal was an overall lower total cost compared to a traditional monitoring station, by nominally an order of magnitude.

The first VG station was set up in June 2013 outside of a public library in Durham, North Carolina. The system provided good information and was compared against local monitoring stations operating federal reference and equivalent methods.

Monitoring agencies are expressing interest in implementing Village Green monitoring stations to help inform the public on ambient air quality in their community. Village Green provides a solar-powered air monitoring system that takes continuous readings of several air pollutants and weather conditions. The measurements are then streamed to the Village Green and AIRNow websites.

Through an E-Enterprise initiative and joint leadership between OECA, ORD, and OAR, an opportunity was created for state agencies interested in piloting the stations to propose to join as participants. Twenty-two proposals were received and reviewed by the Village Green Project team, led by Esteban Herrera (OECA). The selected participants were the DC Department of the Environment, City of Philadelphia's Air Management Services, Kansas Department of Health and the Environment, Oklahoma Department of Environmental Quality, and Connecticut Department of Energy and Environmental Protection.

The VGII station is the fully integrated system, which is designed to operate only on solar power and utilizes small real-time air monitoring instruments that are expected to require infrequent maintenance. The system also has an on-board microcontroller and cellular modem that provides real-time data streaming to an EPA-hosted AirNow database. To support public engagement, an accompanying website displays the real-time data. The station was designed to be integrated with a park bench; the smaller footprint, improved aesthetics, and public outreach associated with this system provide easier siting and an opportunity to engage with community members. The VG physical structure is made out of recycled materials and provides secure and weatherproof storage for the scientific instruments.

Measurements currently taken are ozone, PM2.5, wind speed, wind direction, ambient temperature and humidity. In addition, a low-cost sensor for nitrogen dioxide is being evaluated at the new stations. Over time, additional measurements may be taken at these sites.

At present we are looking at the VGII sites as a research project and are gathering data and comparing it against regulatory monitors in the vicinity of the sensors. A QA project plan was developed for the project, but the frequency and acceptance criteria for the checks are minimal compared to regulatory monitors. (continued on page 3)

Ozone and PM monitors behind the bench


Village Green Monitors (Continued from Page 2)


With this interest there has also been concern expressed by monitoring agencies about the potential use of this data for National Ambient Air Quality Standards (NAAQS) decisions. The Village Green monitors are not intended to be used for any NAAQS-related purposes. Although the monitors used in the projects are intended to be as accurate and precise as possible, and some monitors may have been approved as federal equivalent methods (FEMs), they will not be sited in the manner required for regulatory monitoring, nor will they implement the same quality control requirements necessary for use in regulatory decision making.

Data from these monitors are not required to be reported to AQS and not required to be certified on an annual basis. If monitoring agencies decide to report data to AQS for these monitors, EPA will work with AQS programmers to set up a specific Network Affiliation Code and the monitoring organizations will be instructed to use a NAAQS exclusion code on the monitor records. These reporting conditions will ensure data from the Village Green monitors will be excluded from any regulatory decision making. The Village Green monitoring system will not be considered a special purpose monitor (SPM) under the 40 CFR Part 58 requirements and therefore will not be required to become a regulatory monitor if operated for longer than two years.

For more information, please visit the Village Green Website at: http://www2.epa.gov/air-research/village-green-project

The station above is located in the children's farm area at the Smithsonian's National Zoological Park. With an average of two million visitors yearly, the Village Green Project at the National Zoo in DC increases visitors' awareness of air quality and local air quality conditions while they explore the zoo.

This station is located in Independence National Historical Park near the National Constitution Center. This site was chosen because of its proximity to vehicle and pedestrian traffic. The real-time data generated by the site will be used to educate visitors and residents about street-level pollution exposure.

QA Handbook and PM2.5 Method 2.12 Revision (Continued from Page 1)

Since that time there have been a number of changes in the PM2.5 method, including the development of the very sharp cut cyclone by more than one manufacturer. Filter weights have changed due to the award of a contract to a different filter manufacturer, which means a modification of the check weight guidance is required. In addition, due to recent findings during technical systems audits, there is a need for additional detail and clarification in the pre- and post-filter weighing laboratory sections. OAQPS sent a memo out after the August 2014 National Ambient Air Conference asking for comments on this document. Both the EPA Regions and monitoring organizations have provided about 10 pages of comments that we are currently wading through. Our goal is to have a draft of Method 2.12 completed in September 2015 and a draft of the QA Handbook in December 2015.



Proposed 1-Point QC Revision Causes Pushback... What Does the Data Say?

In the course of considering potential changes to quality assurance requirements as part of the proposed rule, Revisions to Ambient Monitoring Quality Assurance and Other Requirements (79 FR 54356), OAQPS received a number of comments related to the proposal to lower the concentrations of the one-point QC check and to require the selection of the check concentration based on the mean or median concentration of the measurements within the ambient air monitoring network. The comments that were received are currently under consideration. During our review of the comments, EPA performed some additional assessments of the monitoring data that were used to provide the rationale for our initial decision to propose the changes in the regulations. These additional details are provided in this article.

BACKGROUND

The EPA proposed to lower the audit concentrations (current section 3.2.1) of the one-point quality control (QC) checks to between 0.005 and 0.08 parts per million (ppm) for SO2, NO2, and O3 (currently 0.01 to 0.1 ppm), and to between 0.5 and 5 ppm for CO monitors (currently 1 and 10 ppm). With the development of more sensitive monitoring instruments with lower detection limits, technical improvements in calibrators, and lower ambient air concentrations in general, the EPA felt this revision would better reflect the precision and bias of the ambient air data being measured at the site.

The majority of the comments (19 of 26 responding to the quality assurance proposal) received on appendix A related to this proposed change. Most of the commenters expressed similar technical concerns, which can be categorized as follows:

•	The SLAMS network is in place mainly for decisions related to the NAAQS; therefore QC checks should be around NAAQS values.

•	Some of the federal reference methods (FRM) or federal equivalent methods (FEM) that are still in use may operate acceptably at concentrations around the NAAQS, but these older methods are not as sensitive at lower concentrations (i.e., mean or median concentrations), so QC checks at these lower levels are beyond the limits of the instrumentation.

•	The instrumentation necessary to challenge the monitors at the lower concentrations (calibrators with additional mass flow controllers or gas cylinders of lower concentrations) would be required to comply and therefore represents an added expense and burden.

•	The lower concentrations affect the percent difference statistic, so there is more chance that the QC check will fail the acceptance requirements and therefore invalidate data that the monitoring organization feels is of acceptable quality (illustrated in the sketch below).
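The last concern is worth unpacking: the percent difference statistic divides the same absolute analyzer error by a smaller audit concentration, so the statistic grows as the check level drops. The following minimal sketch illustrates the arithmetic (the 2 ppb error and the audit levels are illustrative values, not figures from the proposal):

    def percent_difference(measured, audit):
        # Statistic used for the gaseous 1-point QC checks:
        # 100 * (measured - audit) / audit
        return 100.0 * (measured - audit) / audit

    # A fixed 2 ppb analyzer error looks much larger, in percent terms,
    # at a low QC check concentration than at a high one.
    error_ppb = 2.0
    for audit_ppb in (80.0, 40.0, 10.0):
        pd = percent_difference(audit_ppb + error_ppb, audit_ppb)
        print(f"audit {audit_ppb:>5.1f} ppb -> {pd:+.1f}%")
    # audit  80.0 ppb -> +2.5%
    # audit  40.0 ppb -> +5.0%
    # audit  10.0 ppb -> +20.0%

This same arithmetic is behind EPA's willingness, noted on page 7, to consider the difference statistic rather than the percent difference at lower QC concentrations.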

In order to provide some context for the proposal, EPA extracted routine data and all one-point QC data for the 4 gaseous criteria pollutants for SLAMS sites for calendar year 2013. The following evaluation provides summary information about the routine data and the one-point QC checks reported by states. It must be noted that the one-point QC regulation, prior to the proposal, suggested that "the QC check gas concentration selected should be related to the routine concentrations normally measured at sites within the monitoring network in order to appropriately reflect the precision and bias at these routine concentration ranges". Based on the 2013 data, it is evident that many monitoring organizations did not follow this recommendation, which was the reason EPA formally proposed the revision to this requirement.

Nitrogen Dioxide - In Figure 1 (and similar figures for the other gaseous pollutants), the graph on the left represents the mean, 99th percentile and maximum value of 1-hour routine ambient air data for the 43 states and territories (x-axis) reporting NO2 data in 2013. The y-axis is concentration in ppb. The green line (80 ppb) represents the proposed upper range of the one-point QC check while the red line represents the current upper range (100 ppb) for the one-point QC check. Of the 43 states reporting hourly NO2 data, only three states had maximum hourly values (highest value for the year) above the NAAQS (100 ppb), and 99 percent of all the states' hourly values were below 60 ppb. The graph on the right is a frequency distribution of the one-point QC checks for 2013; 85 percent of the one-point QC checks reported for 2013 were above 99 percent of the routine data. (continued on page 5)

Figure 1. Left: 2013 mean, 99th percentile, and maximum NO2 concentrations (ppb) by state (overall mean 10.00 ppb), with the current (100 ppb) and proposed (80 ppb) one-point QC check upper limits. Right: frequency distribution of one-point QC checks; 85% of the QC data were above 99% of the data values in the country. Current range: 10-100 ppb; proposed: 5-80 ppb.


One-Point QC Checks (continued from Page 4)


Figure 2 (and similar figures for the other gaseous pollutants) provides an assessment of the one-point QC checks using box and whisker plots. In Figure 2, the graph on the left provides the difference between the measured value and the audit standard value; the graph on the right is the same data set but reports the percent difference, which is the statistic presently in use for the gaseous pollutants. The one-point QC data are segregated into 10 ppb segments (0-10 ppb, 10-20 ppb, etc.) in order to evaluate whether the lower QC concentrations have an effect on the precision and bias estimates. The green vertical line on the right-hand graph represents the proposed 80 ppb high end of the QC range, and the blue shaded area represents where the QC checks would be selected if monitoring organizations selected a QC concentration related to the mean or median routine air concentration. The red lines represent the current percent difference acceptance criteria (±15%). In this case, the NO2 variability, based on the 25th-75th percentile spread, does not appear to be significantly different at the lower QC concentrations than at the higher QC concentrations.

Ozone (Fig. 3) - Of the 51 states and territories reporting hourly ozone data, 48 states had maximum values over the NAAQS (75 ppb) but had 99 percent of their hourly values below the NAAQS. In addition, 90 percent of the one-point QC checks were above 99 percent of the routine data.

Similar to NO2, Figure 4 presents the differences (left graph) and percent differences (right graph) of the one-point QC checks segregated into the ten concentration ranges. The ozone acceptance criterion is tighter than NO2's (±7 percent difference), but the variability of the 25th-75th percentile spread at these lower ranges does not appear to be significantly different from the one-point QC check variability at higher concentration ranges. (continued on page 6)

Figure 2. Differences (left) and percent differences (right) of NO2 one-point QC checks, by assessment level (ppb).

Figure 3. 2013 mean, 99th percentile, and maximum O3 concentrations (ppb) by state, with the current and proposed one-point QC check upper limits. Current range: 10-100 ppb; proposed: 5-80 ppb.

Figure 4. Differences (left) and percent differences (right) of O3 one-point QC checks, by assessment level (ppb).



One-point QC Checks (continued from page 5)



Sulfur Dioxide (Fig. 5) - In order to provide a readable graph, the left-hand graph of Figure 5 does not include maximum values, since some states did measure SO2 values that were quite high. Of the 48 states and territories providing hourly sulfur dioxide data, 23 states had maximum values over the NAAQS (75 ppb) and 47 states had 99 percent of their values below 36 ppb. Seventy-five percent of the one-point QC checks were above 99 percent of the routine data.

Figure 6 presents the differences (left graph) and percent differences (right graph) of the one-point QC checks segregated into the ten concentration ranges. The SO2 acceptance criterion is ±10% (percent difference). The variability, based on the 25th-75th percentile spread, at these lower ranges does not appear to be significantly different from the one-point QC checks at higher concentration ranges.

Carbon Monoxide (Fig. 7) - Of the 51 states and territories reporting hourly carbon monoxide data, no state reported maximum values greater than the one-hour NAAQS (35 ppm), and only three states reported hourly maximum values above the eight-hour NAAQS (9 ppm). In addition, 99 percent of all states' hourly data was below 3.0 ppm. States were providing one-point QC values at lower ranges (graph on right), but about 60 percent of the one-point QC checks were being performed at concentrations greater than 99 percent of the routine ambient air data.

Similar to the other gaseous pollutants, the variability of percent differences, based on the 25th-75th percentile spread of the box and whisker plots, does not appear to be significantly different between the higher and the lower one-point QC concentration values. (continued on page 7)

Figure 5. 2013 mean and 99th percentile SO2 concentrations (ppb) by state (overall mean 1.52 ppb; maximum values omitted for readability), with the current and proposed one-point QC check upper limits; 75% of the QC data were above the 99th percentile values of almost all sites. Current range: 10-100 ppb; proposed: 5-80 ppb.

Figure 6. Differences (left) and percent differences (right) of SO2 one-point QC checks, by assessment level (ppb), with the range of state means indicated.

Figure 7. 2013 mean, 99th percentile, and maximum CO concentrations (ppm) by state (overall mean 0.33 ppm), with the current and proposed one-point QC check upper limits. Current range: 1-10 ppm; proposed: 0.5-5 ppm.

Figure 8. Differences (left) and percent differences (right) of CO one-point QC checks, by assessment level, with the range of state means indicated.


One-point QC Checks (Continued from Page 6)

As has been shown, monitoring agencies can test and achieve acceptable precision and bias results at lower concentration levels. Providing data users with estimates of precision and bias where the majority of our ambient air data are measured should be a programmatic goal, and monitoring organizations should be working with EPA Regional Offices to develop the budgets necessary for purchasing the updated equipment and revising related procedures. The EPA will continue to endorse this approach to make the QC checks more meaningful and will consider future revisions to Appendix A to either require QC checks at two concentration levels (i.e., one around the mean concentrations and one related to the NAAQS) or require the span check to be reported to AQS. In addition, to alleviate concerns about failing the acceptance criteria at lower QC concentrations, EPA will evaluate suggestions by monitoring organizations to raise the acceptance criteria or look at alternative acceptance criteria (e.g., difference instead of percent difference). Since the acceptance criteria are included in guidance, EPA will have the opportunity to perform these evaluations without affecting the regulation. In 2011, EPA developed similar guidance for lower concentration levels of the annual performance evaluation audits.

Automated Data Certification Activities Complete Another Year

This is the third year of using the AMP600 report for certification. It appears the process is taking hold, since we have received fewer questions about the process this year. The system still has a few issues we need to address.

Evaluation of PEP and NPAP Data Suspended for
CY2014 Certification.

OAQPS has had some key retirements in 2014 as well as turnover of data reporting to a new QA contractor. These changes have slowed and in some cases stopped the reporting of NPAP, PM2.5 PEP and Pb-PEP data to AQS. Therefore, the AMP600 will report completeness and bias data for any PEP values reported to AQS but will not perform any automated evaluations of that information.

1-point QC Check Completeness.

It was suggested that the evaluation of the 1-point QC check should be more detailed, since there were findings during technical systems audits that monitoring organizations were not performing checks every two weeks but were performing checks more frequently at the end of the year to "make up" for missed checks. The CY-13 AMP256 and AMP600 reports simply counted all the 1-point QC checks over the whole year and divided that number by 26. For CY-14 the 1-point QC completeness data were evaluated in the following manner:

•	Count the number of checks in each 14-day interval, starting with the Jan 1-14 interval. Within each 14-day interval, multiple checks only count as one.

•	Divide the count from the first step by 26.

For certification, a green Y is ≥75%. That means a monitoring organization could miss six 14-day intervals (meaning a check slipped past the 14-day interval) and still get a green Y. For a yellow flag, they could miss nine 14-day intervals and get a warning. Missing ten 14-day intervals will elicit an N flag, which seems very reasonable in light of the CFR requirement. We have received some suggestions to build the intervals around weekends rather than starting with January 1-14. For the 2015 data certification, we will review the current procedure to determine the most equitable evaluation of this data.
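For readers who want to reproduce the CY-14 interval approach on their own data, here is a minimal sketch of the calculation (it assumes the check dates are available as Python date objects; the example dates are hypothetical):

    from datetime import date

    def qc_completeness(check_dates, year):
        # Fraction of the 26 14-day intervals (starting with Jan 1-14)
        # that contain at least one 1-point QC check; multiple checks
        # within an interval count only once.
        start = date(year, 1, 1)
        covered = set()
        for d in check_dates:
            interval = (d - start).days // 14   # 0-based interval index
            if 0 <= interval < 26:
                covered.add(interval)
        return len(covered) / 26

    # Hypothetical example: checks that drift late leave intervals empty.
    checks = [date(2014, 1, 10), date(2014, 1, 24), date(2014, 2, 27)]
    print(f"{qc_completeness(checks, 2014):.0%}")   # a green Y requires >= 75%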

In previous certification periods there were a number of discrepancies between the AMP256 report and the AMP600. The following fixes have been made so both reports should provide the same information:

Collocation completeness for PM10 - The AMP256 and the AMP600 will only count sites where a manual sampler is the primary sampler. However, there may be times when a site had a manual sampler as primary for a period of time and switched to a continuous monitor. These sites will be included in the manual count if the manual sampler operated as the primary for any time during the year.

Collocation for PM2.5 - The appendix A regulation requires that a PQAO collocate 15% of the monitors of each method designation used as a primary monitor. The AMP256 has been revised to assess whether there is 15% collocation for each method designation of only the primary monitors and should therefore match the result in the AMP600 report. However, there may be cases where more than one method designation was used at a site as a primary monitor. Any method designation used as a primary at any time during the year will be counted towards the collocation evaluation. So if one ran a method 118 for 6 months and a method 143 for 6 months at the same site, the AMP600 will expect to see collocation for each method designation.
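As a rough illustration of the 15% test, the required number of QA collocated monitors can be tallied per method designation from the list of primary monitors. This is a simplified sketch (the method codes follow the example above; rounding up is shown for illustration, and the regulation's own rounding convention governs in practice):

    import math
    from collections import Counter

    def required_collocation(primary_method_designations):
        # 15% of the primary monitors of each method designation,
        # rounded up here for illustration.
        counts = Counter(primary_method_designations)
        return {m: math.ceil(0.15 * n) for m, n in counts.items()}

    # A PQAO running ten method-118 primaries and three method-143 primaries:
    print(required_collocation(["118"] * 10 + ["143"] * 3))
    # {'118': 2, '143': 1}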



NPAP and PEP Programs get "LEAN"

With a recent change in the OAQPS QA contract and the retirement of key individuals who uploaded NPAP data, OAQPS has had major delays in reporting both NPAP and PEP data to AQS. This has also caused us to delay some of the data flagging and has caused PEP and NPAP data to appear incomplete in the Annual Data Certification Report (AMP600) and Data Quality Indicator Report (AMP256).

Two groups in OAQPS, the Ambient Air Monitoring Group (AAMG) and the National Air Data Group (NADG), have been working together on the data reporting issues but have not been able to make as much progress as either group has wanted. The Group Leaders and Division Directors of these groups discussed the reporting issues and agreed that the implementation of the LEAN Six Sigma process might help bring both groups together to discuss a path forward on improving the reporting of the NPAP and PEP program data to AQS.

The LEAN Six Sigma Process

LEAN Six Sigma is a methodology that relies on a collaborative team effort to improve performance by systematically removing eight kinds of waste: defects, overproduction, waiting, non-utilized talent, transportation, inventory, motion, and extra-processing.

In January 2015, AAMG and NADG engaged in conversations to submit the NPAP/PEP program as a LEAN Six Sigma project. EPA has a number of trained LEAN Six Sigma facilitators to implement the process. OAQPS submitted a request and was accepted for implementation. In order to start the process, OAQPS and the EPA Regions expressed the following concerns about the NPAP/PEP data acquisition process:

•	The process for handling field audit data is cumbersome

•	Attempts to load audit data into AQS are often rejected

•	Reporting of audit data to agency and clients is delayed

•	QA assessments based on audits are outdated due to data loading delays

We then expressed a number of goals for a future NPAP/PEP program:

•	Reduce data reporting "lag time" from the current year-long delay to no more than 90 days

•	Simplify the handling of data

•	Remove obstacles to timely AQS reporting of QA data

•	Decrease the accompanying FTE commitment and contract costs

The LEAN Process Results

Four full days (March 30 to April 2) were set aside for group
leaders and key members of both AAMG and NADG to:

•	Review current NPAP/PEP implementation processes
and identify areas of inefficiencies or where errors might
occur

•	Identify a future process that was simpler, less error
prone, and would reduce report time and save time for
all those implementing the process

•	Identify products that would be required for the
"future" process

•	Identify leads from both OAQPS and NADG and partici-
pants from the Regions to help move the process along.

Current NPAP

Three areas of inefficiency were identified in the current process where lead time could be reduced:

1.	The development of the annual audit list and schedule. It was identified that using network plans or some form of AQS report to identify the universe of sites from which to select the year's audits (20% of sites within each PQAO) was inefficient.

2.	Upon completion of an audit, a printout of the audit results was prepared by the auditor and provided to the monitoring organization's site operator. Upon the NPAP auditor's return to home base, the audit could be reviewed by the ESAT contract's technical manager as well as the EPA contracting officer's representative. This review cycle could be quite lengthy, and there did not appear to be a time limit on this review.

3.	Past procedures required audits to be sent to OAQPS for entry. Entry would be attempted, but if it failed the audits would be sent back to the Regions (who might work with the states) for further review and correction. This process was also very lengthy, with no apparent time limit on corrective action. Due to the retirement of two key entry individuals at OAQPS, this process no longer occurs and it is either up to the EPA Regions or OAQPS to report the data.

NPAP Summary

The improvement in NPAP reporting will depend on NADG's development of an improved reporting feature to help scheduling, and the development of a new entry program that will capture the most recent calibration information in a data set that is automatically transferred into a new, empty workbook at the site so that block copying of information does not become a source of error.

(continued on Page 9)


NPAP and PEP LEAN Process (continued from page 8)

ESAT contractors and federal auditors will then be required to enter the appropriate information at the site and directly confirm this information with the site operator.

In order to reduce review time and error corrections, we will attempt to create a program and acquire technologies that in most cases will allow data to be uploaded to AQS at the site. All indications are that holding successful audits for review (without upload to AQS) just increases the chance that audits will never be reported. Therefore, it is proposed that all audits be immediately uploaded to AQS in a pre-production area (so that they are immediately stored in the best location), with a minimum amount of review time (7 days) for successful audits and a longer period (30 days) for those audits that may have had an exceedance.

Current PEP

Several areas of inefficiency were identified in the current process for PM2.5 PEP and Pb-PEP where lead time could be reduced.

1.	The development of the annual audit list and schedule. It was identified that using network plans or some form of AQS report to identify the universe of PEP sites from which to determine the year's audits was ineffective for all sites.

2.	The current weighing lab support database for the PM2.5 PEP has become too large for the software platform, which is creating problems. Problems resulting from the database size must currently be identified and corrected by the laboratory staff.

3.	Field data review of PM2.5 PEP data can be lengthy due to the travel schedule of the auditors who are responsible for the review. Field data must be reviewed to ensure the integrity of the hand-entered field data in the database.

4.	Past procedures required PM2.5 and Pb-PEP audit data to be sent to OAQPS for entry. Entry would be attempted, but if it failed the audits would be sent back to the lab manager and the Regions for further review and correction. This process was very lengthy and labor intensive, with no apparent time limit on corrective action. Valid audits may never be successfully uploaded into AQS because of unresolved coding issues.

5.	Before PM2.5 audit data can be uploaded to AQS, a valid state result must be present in AQS to pair with the data. If routine data has not been uploaded to AQS by the SLT, the audit data upload could be delayed or even overlooked during the next upload sequence.

6.	The Pb-PEP program requires the auditor to enter audit/run data into a website where the run data and laboratory data can be paired to generate a concentration. Many times this web entry was not completed, leaving valid laboratory data unpaired. To resolve this issue, scanned copies of the field data would need to be hand-entered by a third party, or the auditor would need to be notified to provide the input. In both cases, the process is very labor intensive, and the calculation of a concentration would be significantly delayed or the data would remain unpaired.

7.	All Pb-PEP data is required to be "approved" by the regional Pb-PEP contact. A significant amount of time can pass before these audits are approved due to regional priorities, staffing levels, or other problems.

PEP Summary

The improvement in the PEP programs will depend on NADG's development of a web-based application, similar to that for NPAP, to improve scheduling, and the development of a new entry program within this application that will provide for the capture of the field data information on site. This new application will also provide a much less labor-intensive process for uploading the audit data to AQS.

In order to reduce review time and error corrections, we will attempt to create a program and acquire technologies that in most cases will allow data to be uploaded to AQS upon post-weighing and validation. As with NPAP, all indications are that holding successful audits for review (without upload to AQS) just increases the chance that audits will never be reported. Therefore, it is proposed that all audits be immediately uploaded to a pre-production area (so that they are immediately stored in the best location), with a minimum amount of time provided for review of these audits.

Next Steps

Following the implementation of the new NPAP process, NADG and AAMG program leads will schedule a series of meetings to develop an implementation plan for the new program. AAMG will reach out to a few Regions for participation. Using the PEP implementation plan as a model, we will identify the key attributes of the new system and the new requirements for the auditor as well as the other key personnel.

In the interim, we will also look at some improvements to the current PEP system that will reduce errors and improve reporting times while the new system is being developed.

Author Acknowledgements for this Issue

OAQPS appreciates and acknowledges the folks that helped put this issue of the QA EYE together: Stephanie McCarthy (Region 4) for her contributions to the TSA Guidance Document article (pg. 1), Gayle Hagler (ORD) for assistance on the Village Green article (pg. 2), Greg Noah, Dave Shelow and Nick Mangus (OAQPS) on the NATTS QC article (pg. 11), Robert Coats (NADG) on the PM2.5 QA Collocation (pg. 12) and AQS (pg. 10) articles, and Nealson Watkins on the NOy update (pg. 13). We try to get a QA EYE issue out every 4-6 months, so provide us some feedback or an article you'd like posted to papp.michael@epa.gov.



Annual Box and Whisker Plots Coming Out in July

[Example box and whisker plot, annotated with the CV, bias, number of observations, and method designation for each POC]

Nick Mangus of the National Air Data Group (NADG) is preparing 2014 data for reporting the annual box and whisker (B&W) plots of the four gaseous pollutants. The report will be posted on AMTIC later in July at http://www.epa.gov/ttnamti1/qareport.html.

Our hope is that this is the last time Nick has to perform this activity. It is expected that QA data will be reported to the DataMart in July and that we and the ambient air monitoring community can start developing automated reports at that site. We'd like one of those reports to be the box and whisker plots. In Nick's review of the data, he has noticed a few things to be aware of.

Assessment values of zero. The assessment value for the one-point QC check is the value of the audit standard. This value should never be zero. Since the audit standard's value is in the divisor, and one can't divide by zero, any assessment values of zero are being eliminated from the assessment.

Data reported in the Assessment Number 2 field. A monitor had reported a single assessment on a day, but reported it in the assessment #2 field. The data should be reported in the assessment #1 field. The AMP256 does not recognize the data in field #2 (unless there are multiple values), so the B&W statistics may be slightly different than the AMP256 values. We plan to fix the AMP256 report to cover this in the future.

Electronic Logbook Guidance Being Developed

Monitoring organizations have been suggesting the use of electronic logbooks (e-logbooks) for ambient air monitoring programs for a number of years. OAQPS has organized a Workgroup with the EPA Regions and monitoring organizations to develop a guidance document that provides the minimum requirements for the use of e-logbooks to replace the traditional hardcopy logbooks used in our monitoring networks. Monthly conference calls started in April. During each call, monitoring organization participants have provided presentations of their e-logbook systems, and we have also had presentations from Sonoma Technologies and Agilaire.

The purpose of this guidance is to establish minimum requirements for documenting and maintaining e-logbook information for the Ambient Air Monitoring Program. This document is not intended to be inclusive of all electronic records initiatives presently being conducted in the Agency, but rather is seen as a starting point for an e-logbook structure to ensure some consistency across all the monitoring organizations utilizing e-logbooks for ambient air monitoring in accordance with 40 CFR Part 58. In addition, the traditional use of hardcopy logbooks is not being discouraged.

The goal of this document will be to ensure that the salient features of good logbook practices are presented so that this data is captured and maintained in a manner that is secure, tamper-proof and legally defensible. We hope to have the guidance completed by the end of the year.


Some More AQS Info: QA Transactions and Enhancements

Most QA transactions have a field for "Performing Agency". For most QA transactions, it is recorded by AQS as metadata but not used; it is explicitly not used for security access, i.e., for deciding whether to allow the user to submit data for a monitor. The exception is the following four "Transactions for Labs", where it is required:

1.	Pb Analysis Audit,

2.	Lab Proficiency Test,

3.	Ozone SRP,

4.	AA-PGVP.

Access control:

For all transactions except the four "Transactions for Labs", access is allowed by one of the Agency Roles (e.g., PQAO, Reporting, Analyzing, Audit, etc.) assigned to the monitor identified by the transaction.

For the four "Transactions for Labs", only users assigned to the Performing Agency or PQAO on the transaction are allowed to submit the data.

Enhancements:

We are working on an enhancement to allow one agency to be defined as the "child" of another agency. The typical case would be that a local or district agency would be the "child" of a state-level agency. When this enhancement is implemented, if a child has access to a monitor, then its parent would be allowed access as well. This should be available by the end of July 2015.
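In pseudocode terms, the planned enhancement simply extends a user's monitor access upward from child to parent. A minimal sketch (the agency names and the single-level child-to-parent mapping are assumptions for illustration, not AQS internals):

    parents = {"Local District A": "State Agency B"}   # child -> parent

    def may_submit(user_agency, monitor_role_agencies):
        # Allowed if the user's agency holds a role on the monitor, or
        # if an agency holding a role is a child of the user's agency.
        return user_agency in monitor_role_agencies or any(
            parents.get(a) == user_agency for a in monitor_role_agencies
        )

    print(may_submit("State Agency B", {"Local District A"}))   # True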



NATTS - Collocated, Duplicates and Replicates QC Data

Duplicate and replicate analyses and collocated data are reported to AQS in the NATTS program. The AQS "RA" and "RP" transactions have been retired and can no longer be used to report data; all QA data must now be submitted using the new QA transaction format. Each of these datasets (replicate, duplicate, collocated) now has its own format for uploading to AQS using the new QA transactions, as opposed to the more generic and confusing transactions of the past. Below are the definitions of collocated, duplicate and replicate according to the NATTS program, and some schematics to clearly illustrate the differences. The new QA transactions will clearly identify the QA data so it can be properly characterized, stored, and used. We are currently developing guidance for preparing and submitting the new QA transactions, which should be available in the coming weeks.

Collocated Samples

Collocated samples are samples collected simultaneously at the same location using two completely separate sampling systems. Assuming neighborhood scale, the recommended horizontal spacing for sampling inlets of collocated samplers is 1 to 4 meters for low-volume samplers and 2 to 4 meters for high-volume samplers. Recommended vertical spacing is within 1 meter.

[Schematic: a primary Monitor N and a collocated Monitor C, each collecting its own sample]

Duplicate Samples

Duplicate samples are samples collected simultaneously using one collection system and the same inlet; the samples are then analyzed and the results compared.

Replicate Analysis

Replicate assessments are the analysis of one discrete sample multiple times to yield multiple measurements from the same sample. These are also known as "split" sample analyses.

[Schematics: duplicate samples a, b and c collected through one inlet on Monitor N, and a single sample from Monitor N analyzed multiple times]

Combining Duplicates and Replicates

In some cases, replicates of duplicate samples may be conducted (not required). This is often referred to as a duplicate/replicate sample. In this case (see schematic below), there are two duplicate samples, "1" and "2". Duplicate Sample "1" has three replicates: "a", "b", and "c". Duplicate Sample "2" has three replicates: "d", "e", and "f".

[Schematics: duplicate/replicate samples from Monitor N, and collocated replicate samples from a primary Monitor N and a collocated Monitor C]

Combining Collocated and Replicate Samples

It is also possible (not required) to make replicate analyses of collocated samples. This is often referred to as collocated replicate samples. Assuming neighborhood scale, the recommended horizontal spacing for sampling inlets of collocated samplers is 1 to 4 meters for low-volume samplers and 2 to 4 meters for high-volume samplers. Recommended vertical spacing is within 1 meter.


PM2.5 QA Collocation Requirements... What's Official... What's Not?

When it comes to QA collocation requirements there is still a lot of confusion out there. Some of the confusion stems from the term "collocation" being used in the generic sense, meaning that sites exist where there is a primary PM2.5 monitor and other PM2.5 monitors are "collocated" at the site for purposes other than meeting the "QA collocation" requirements. In this article "QA collocation" refers to the primary/QA collocated monitors paired to meet the 40 CFR Appendix A QA requirements.

Since 2006 (see QA EYE Issue 2, page 5), EPA has been advocating the submission of the QA collocation data as raw data, eliminating the need for monitoring organizations to submit a precision transaction (RP) for this information. This requires the identification of the QA collocated monitor in the QA Collocation table on the Maintain Monitor form in AQS. In 2015, the RP transaction for QA collocated instruments was discontinued.

In addition, after 2012, the collocated monitor must be
paired with the NAAQS primary monitor. Prior to 2012, a
QA collocated monitor may have been paired with any
monitor at a site. AQS has not attempted to identify
these sites.

The AMP256 reports and the AMP600 report will not recognize any collocation where the QA collocated monitor is not paired with the NAAQS primary. They will also not report any monitors where the method codes are not appropriately paired as required in the CFR.

If the site has already been set up, you can check the ID of the NAAQS primary as follows:

Go to the Maintain Site form, enter the state/county/site ID (Fig. 1), and click on "Primary Monitor Periods". The primary monitors are listed on this record (see Fig. 2).

Now, since not all PM2.5 sites are required to have QA collocation, when the PQAO decides to use a site to meet its Appendix A requirements, both the NAAQS primary and QA collocated monitors must be identified in AQS.

1.	To identify the NAAQS primary, first retrieve the site in the Maintain Monitor form (Fig. 3), and click on the QA Collocation tab.

2.	Then, on the QA Collocation form (Fig. 4, page 13), enter the begin date on which the NAAQS primary monitor will start being used in a collocated pairing, enter 'Y' in the "Primary Sampler" column, then click the save icon. The monitor ID does not need to be filled in.

(Continued on page 13)

Fig. 1 Partial view of the Maintain Site form; click on "Primary Monitor Periods"

Fig. 2 Primary monitor period record, listing the primary monitors (POC 3 is the primary monitor in this example)

Fig. 3 Partial view of the Maintain Monitor form used to identify the NAAQS primary monitor (POC 3); click on the QA Collocation tab


PM2.5 Collocation Requirements (continued from Page 12)

Next, we identify the collocated QA monitor:

1.	Go to the Maintain Monitor form (Fig. 3, page 12), enter the state/county/site ID, parameter code and POC of the monitor you want to identify as the QA collocated monitor, and click on the QA Collocation tab. In this case (see Fig. 5) the monitoring organization identified the POC 1 monitor as the QA collocated monitor.

2.	Enter the begin date for the pairing (which should be the same date used as the begin date for the NAAQS primary monitor), the "Distance from Primary Sampler", and "N" in the "Primary Sampler?" column to indicate that the QA collocated monitor is not the NAAQS primary sampler. Enter the monitor ID of the primary monitor.

Some other information about the system

One cannot change a primary unless the collocated monitor is changed first (if the site is a QA collocated site). So in the above example:

Fig. 4 Partial view of the QA collocation record for the NAAQS primary monitor (POC 3), showing the NAAQS primary monitor ID and POC

Fig. 5 Partial view of the QA collocation record for the collocated monitor (POC 1)

•	One would have to place an end date on the POC 1 QA collocated monitor record and add a new POC for the primary (Fig. 5).

•	Go to the Maintain Site form, click on Primary Monitor Periods, put an end date on POC 3 (Figures 1 and 2 in this example), and add the new NAAQS primary POC that was revised in the QA collocation table, along with a new begin date.

NOy Update

We last discussed NOy issues in Issue 12 (page 8) of the QA Eye. We wanted to take an opportunity in this issue to provide a few reminders and updates on NOy-related activities. NOy analyzers provide NO, NOy, and NOy-NO concentration data and are currently required at NCore multi-pollutant monitoring stations. Most models are simply chemiluminescence NOx analyzers that have had their plumbing modified to accommodate an external molybdenum converter. The external converter is necessary to allow sampling intake manifolds at the recommended 10 meter height, which allows for improved ability to measure higher-order oxidized nitrogen species, known as NOz, which includes species such as nitric acid and peroxyacetyl nitrate. In many cases, as part of the instrument modification process (at least with older units), the vendors may not have changed the analyzer operating software to reflect the fact that the analyzer had been modified to be a NOy analyzer. As a result, those units may actually display measurement outputs for NOx and/or NO2 instead of the appropriate NOy and NOy-NO metrics. OAQPS is aware of multiple instances where this has caused confusion, and a state thought they had NO2 data coming from a NOy instrument. The key point here is that NOy analyzers do not report NO2.

In the previous QA Eye article on NOy, we discussed reporting QC data from our NOy analyzers. As an update and correction to that article, we want to emphasize that the focus of 1-point QC checks for NOy analyzers is on the NOy channel. The preferred challenge agent is still NPN or IPN. As a secondary option, if NPN or IPN are not available for QC checks, NO2 is the next best challenge agent for NOy analyzers. NO2 can be generated via GPT or supplied via certified cylinders of NO2. The tertiary option is to perform the QC checks with NO. This prioritization has been agreed upon at EPA between the Office of Air Quality Planning and Standards and the Office of Research and Development's National Exposure Research Laboratory (NERL). We want to note that EPA-ORD-NERL is currently beginning to evaluate the merits of using NPN and IPN as challenge agents for NOy analyzers compared to using NO2 generated from GPT and certified cylinders. When their studies are concluded we will communicate the results and, if warranted, change our preferred QC challenge agent prioritization to reflect any new findings.


Joe in the Field

Joe Delwiche Remembered

When I started my career with OAQPS back in 1995, I did not have that much experience with ambient air monitoring, so I did a lot of listening on our monthly EPA calls with the Regions. At that time some dominant players like Norm Beloin (Region 1), Ted Erdman (Region 3) and Mary Kemp (Region 6) provided a lot of sound technical advice. There was one other person who usually spoke towards the end of conversations. He spoke quietly, eloquently and with a knowledge that seemed to make great sense.

That was Joe Delwiche.

Born December 31, 1950, Joe grew up in Berkeley and Davis, California, the third in a family of six boys. After graduating from Davis High School in 1969, Joe attended San Jose State University for a year, then joined the U.S. Coast Guard on June 8, 1970. In the Coast Guard he served aboard the USCG cutter Acushnet on the US Atlantic coast, and on Iceberg Patrol over the north Atlantic aboard C-130 aircraft. He was honorably discharged June 7, 1974. He promptly enrolled at Cornell University, in Ithaca, NY, and completed a Bachelor of Science degree in Meteorology in 1977.

After graduation, Joe pursued a career in air quality monitoring. Based initially in Southern California, Joe worked for Rockwell International and its antecedents and traveled extensively with a technician partner and a gas chromatograph, sampling for wellhead hydrocarbon emissions throughout the oil patch. He stayed with that activity and took employment in Denver with the US Environmental Protection Agency in 1991, where he became an Air Quality Monitoring Specialist. Joe traveled extensively for EPA, monitoring air at designated sites across the western US until he became ill in October of 2014.

Joe's passion for lifelong learning took him and his wife Diane Brunson to museums wherever they went. Before his death Joe completed a certification course in Paleontology, in preparation for working in the fossil lab at the Denver Museum of Nature & Science, a plan for Joe that was sadly not to be. Joe was active in Amateur Radio in the Denver area, and his particular interest was working radio contacts with portable equipment from remote mountain peaks (pictured below) in Colorado in the annual "Fourteener" event.

Joe did a lot of work with the QA community. He worked on and contributed to almost every QA regulation and guidance document we have distributed over the last 20 years. Most recently, Joe and Chris Hall from Region 10 helped revise the Prevention of Significant Deterioration (PSD) regulation that is currently undergoing review, and the PSD Technical Note posted on AMTIC in 2012. His editing skills and mastery of the English language brought clarity and readability to many of our guidance documents. In addition, my one-on-one conversations with Joe on QA issues revealed his passion for the work and dedication to the quality of our ambient air data. He objectively expressed the views of the monitoring organizations within Region 8 and helped me see the issues facing the organizations, which are often not as clear here in RTP. He represented Region 8 issues with objectivity, grace and tact.

Joe was soft-spoken, intelligent, understated, and caring. He was an avid reader with a keen eye for language. As we continue to have our regional conference calls, many of us will miss waiting for those final words of wisdom from Joe. Thanks, Joe, for sharing your wisdom and friendship.

Joseph (Joe) Delwiche passed
away Saturday, May 2, 2015, after
a seven-month battle with cancer.




EPA-OAQPS
C304-02
RTP, NC 27711


The Office of Air Quality Planning and Standards is dedicated to developing a quality system to ensure that the Nation's ambient air data is of appropriate quality for informed decision making. We realize that it is only through the efforts of our EPA partners and the monitoring organizations that this data quality goal will be met. This newsletter is intended to provide up-to-date communications on changes or improvements to our quality system. Please pass a copy of this along to your peers and e-mail us with any issues you'd like discussed.

E-mail: papp.michael@epa.gov

Mike Papp

Key People and Websites

Since 1998, the OAQPS QA Team has been working with the Office of Radiation and Indoor Air (ORIA) in Montgomery and Las Vegas and with ORD in order to accomplish its QA mission. The following personnel are listed by the major programs they implement. Since all are EPA employees, their e-mail address is: lastname.firstname@epa.gov.

The EPA Regions are the primary contacts for the monitoring organizations and should always be informed of QA issues.

Program	Person	Affiliation
STN/IMPROVE Lab Performance Evaluations	Eric Bozwell	ORIA-Montgomery
Tribal Air Monitoring	Emilio Braganza	ORIA-LV
Speciation Trends Network QA Lead	Dennis Crumpler	OAQPS
OAQPS QA Manager	Joe Elkins	OAQPS
Standard Reference Photometer Lead	Scott Moore	ORD-APPCD
National Air Toxics Trend Sites QA Lead	Greg Noah	OAQPS
Criteria Pollutant QA Lead	Mike Papp	OAQPS
NPAP Lead	Mark Shanis	OAQPS
PM2.5 PEP Lead	Dennis Crumpler	OAQPS
Pb PEP Lead	Greg Noah	OAQPS
Ambient Air Protocol Gas Verification Program	Solomon Ricks	OAQPS
STN/IMPROVE Lab PE/TSA/Special Studies	Jewell Smiley	ORIA-Montgomery
STN/IMPROVE Lab PE/TSA/Special Studies	Steve Taylor	ORIA-Montgomery

Websites

Website	URL	Description
EPA Quality System	http://www.epa.gov/quality	Overall EPA QA policy and guidance
AMTIC	http://www.epa.gov/ttn/amtic/	Ambient air monitoring and QA
AMTIC QA Page	http://www.epa.gov/ttn/amtic/quality.html	Direct access to QA programs

