The QA EYE


ISSUE 23

OCTOBER 2018

OFFICE OF AIR QUALITY PLANNING AND STANDARDS

SPECIAL POINTS OF INTEREST:

National Air Monitoring Conference (page 1)

Fair Warnings #1 and #2 (page 4)

PM2.5 Low Concentration acceptance criteria (page 12)

INSIDE THIS ISSUE:

2018 National Conference (page 1)
QAPP Session at National Conference (page 1)
A QAPP Writing Journey (page 3)
1-Point QC Concentration Ranges (page 4)
QAPP Evaluations in AMP600 (page 4)
CSN Mega PE Results (page 5)
Primary PM2.5 Monitors (page 7)
Update...what's in the Pipeline (page 10)
PEP/NPAP Training (page 10)
Jenia McBrian Detail (page 11)
Fond Farewell (page 13)
Xi Chen on board for NATTS (page 11)
PM2.5 DQO for Low Concentrations (page 12)
Key People and Websites (page 14)

2018 National Ambient Air Monitoring Conference

On August 13 through 16 in
Portland, Oregon, OAQPS
hosted the 2018 National Am-
bient Air Monitoring Confer-
ence. Judging from the feed-
back, it was a great success!
Quality Assurance had a large
presence once again, from the
revamped QA 101 training
session on Monday and the
QAPPs and QC Discoveries
technical session on Wednesday
to the TSA training session on Thurs-
day. A big thanks to all those
day. A big thanks to all those
involved in the planning and
delivery of these sessions and
to everyone who attended. A
huge amount of planning and
work goes into putting the
conference together and the
QA community stepped up to
make it a success once again.
The QA 101 training session
featured a new format using
parts of the newly revised APTI

470 course as a template for
the session. Mike Papp and
Stephanie McCarthy have
worked over the past year to
update this course and the QA
101 training sessions were de-
veloped from these modules. It
is our hope at OAQPS that the
APTI 470 course can provide a
solid QA foundation for new
and veteran QA staff and that it
builds consistency in QA
throughout the regions. Model-
ing the QA 101 training course
after the APTI 470 was a first
step in this direction. The ses-
sion began with the fundamen-
tals of the EPA Quality System,
moved on to pollutant-specific
quality requirements, and ended
with data verification/validation
and certification. Wednesday was
an afternoon of QAPPs and QC
Discoveries featuring speakers

from the regions and state and
local monitoring organizations.
QAPP writing guidance, experi-
ences with QAPPs, and low-
level auditing insight were the
highlights. To end a long week,
the TSA training session intro-
duced the new TSA Quality
Assurance Guidance Document,
explored the details of a tech-
nical systems audit, and tested
the audience with real-life TSA
findings in pictures. As a credit
to the presenters, all sessions
were very well attended with
good interaction throughout.

Three topics that have had, or
will have, a big impact on the
QA community caught my
interest during the conference:
PAMS, data visualiza-
tion, and low-cost sensors.
Continued on page 2

QAPP Session at the National Ambient Air Monitoring Conference

In recent months, many agencies have been
focusing efforts towards updating their monitor-
ing QAPPs, or developing new ones, which has
sparked a lot of interest and questions from the
monitoring community, especially for those
agencies with new QA staff or QAPP writers.
Common questions we have heard over the
months include, "Why are QAPPs important?",
"Why do I need a QAPP?", "Where do I start?"
and most frequently, "Is there any new guidance
available to help with this process?" The re-
sponsibility of writing a QAPP, especially for the
first time, can be really overwhelming! So, to
provide some assistance to QAPP writers, as

well as to answer some of these important
questions, a technical session was offered during
the national monitoring conference in Portland.
The technical session, QAPPs and QC Discoveries,
was a packed house! It was exciting to see so
many attendees - and there was a lot of good
discussion and interaction during the session.
Presenters discussed recent challenges and les-
sons learned during the QAPP writing process,
offering perspectives on the value of these doc-
uments. New tools to assist the QAPP writer
were also presented during the technical ses-
sion. Continued on page 2


-------
ISSUE 23	PAGE 2

2018 National Ambient Air Monitoring Conference (continued from Page 1)

PAMS is a program that will involve everyone in the QA
community in one form or another, and one we should all
be preparing for in the very near future. At OAQPS we have
been busy writing the quality documents for PAMS and
providing resources for the PAMS QA program, but as I
quickly learned, there is much more on the horizon.

There may be new sites to visit, new QAPPs to review,
and new technologies such as the auto-GCs and ceilome-
ters to learn.

The rapid advancement of data visualization tools is excit-
ing because of their ability to digest large datasets and
create representations that are easily understandable and
useful to users such as the QA community. The question
is how we make these tools available and relevant to
data users with different needs.
We've only scratched the surface of the capabilities of
data visualization, and I believe we will see much more
advancement very quickly.

One of the biggest interests of the conference attendees was
the explosion of the low-cost sensor world. With their ever-
increasing numbers and their variability in data quality, low-cost
sensors are devices that should be of interest to all in the QA
community. OAQPS has taken a leadership role in helping to
determine ways to assess these sensors and is also involved in
their application in numerous studies. These sensors will keep
creeping into our everyday work as QA professionals, and we
will need to be knowledgeable in their appropriate use and
necessary quality assurance. There is a buzz in the air regard-
ing sensors, and it's only going to get louder.

My final observation from the 2018 National Ambient Air Mon-
itoring Conference is the sheer amount of work that the QA
teams nationwide are responsible for. In every session, I would
see a place for QA, and in every case, I would see a familiar
face trying to build knowledge or asking a question. With that
said, I say thanks to all of you for your dedication to QA. We
at OAQPS recognize and appreciate your hard work in the
ambient air monitoring programs. - Greg Noah


QAPP Session at National Ambient Air Monitoring Conference (continued from Page 1)

The first tool introduced was the newly published Guide to
Writing QAPPs for Ambient Air Monitoring Networks (EPA-
454/B-18-006, August 2018). This document is a plain
language guide that concisely explains each of the required
elements in a QAPP using common air monitoring termi-
nology and examples. Additionally, the new guide offers
specific monitoring questions to QAPP writers to help
them brainstorm their air monitoring programs and quali-
ty systems, which in turn should help them craft language
that best reflects their specific projects. A few excerpts
from the QAPP guide were shown during the technical
session, and a sneak peek was provided of a new QAPP
review checklist that is also being developed to accompa-
ny the guide. Although the QAPP review checklist is
geared towards the EPA QAPP reviewer, it can also be
used as a tool to help the QAPP writer ensure that all the
major elements of an air monitoring QAPP have been
adequately addressed. The new guide, and its companion
checklist, are the result of efforts by an EPA workgroup
whose goal was to provide tools that would facilitate con-
sistency across EPA Regions in both air monitoring QAPP
content (for writers) and the approval process (for re-
viewers). Both the QAPP guide and the checklist can be
found on the AMTIC website under the Quality Assur-
ance Guidance Documents link (https://www3.epa.gov/

ttn/amtic/qalist.html). Please check it out! See the article
on page 3 for a firsthand account of using this guid-
ance.

Another tool discussed during the QAPP technical session in-
cluded an online QAPP-writing training course available
through the Institute of Tribal Environmental Professionals
(ITEP). ITEP's online training curriculum is available to any user
- just sign up for an account! The online QAPP-writing training
modules offer example text, provided element by element,
along with videos and quizzes to help students think about the
intended use of their data and the level of quality needed for
the specific project. To find this online QAPP-writing course,
visit ITEP's website at https://itep.scholarlms.com/courses/.
Thanks to everyone who attended and presented at the QAPPs
and QC Discoveries technical session! We hope you found it
beneficial. And, we hope these new tools will help you more
easily and quickly develop documents that reflect your unique
air monitoring programs! Please reach out to your EPA Re-
gional Office air monitoring QA contact with any questions or
concerns you may have when writing QAPPs. We're happy to
help! -Stephanie McCarthy


-------
PAGE 3

A QAPP Writing Journey

Being a person who loves the outdoors, hiking has naturally be-
come a pastime I truly enjoy. As much as anything, it has be-
come a great excuse to spend time in the woods, as you set a goal
and enjoy the sense of accomplishment that comes along with
pushing your limits. With that being said, I know my limitations
and rarely tackle some 30-mile trail labeled as "strenuous". I like
the shorter, easier trails that have some great reward at the end,
like an incredible vista from a mountain peak or a beautiful water-
fall tucked deep in the woods.

Five years ago I had a trail set in front of me that looked like it
was 100 miles straight uphill in the Himalayas. Being a new air
monitoring program manager, I learned my agency, like many
other agencies, needed to update all of our QA documents,
and in some cases, create them from scratch. This included
SOPs, QAPPs, and a QMP. Our SOPs were old, outdated, and
basically looked archaic. If you had questions about how to do
something, you just asked your co-workers or supervisor.
Well, everyone knows what that leads to: an unwritten and,
even worse, undocumented way of doing things. All the years
of doing the "right stuff," but not having documents detailing all
of our current procedures made our data defensibility appear
to be weak. And there it is, that 100-mile trail labeled as "very
strenuous". I was not intimidated by figuring out our current
process, because I knew that very well, but I never had any
experience writing technical documents, so how would I put
my knowledge down on paper in a way that would satisfy EPA's
expectations for proper QA documentation?

Looking at our old SOPs and QAPPs only made the trail look
longer and added an altitude sickness component to it, which I
was starting to literally feel. So, instead of looking ahead at the
entire trail path, I thought about just putting one foot in front
of the other to move forward. EPA Region 4 said they would
help, and I knew other agencies that could help as well (not to
mention the staff on-hand that also had a great deal of
"current process" knowledge). We started with our Ozone
SOP and tried our best to document our actual process. The
goal was to write the SOP so that anyone with entry-level air
monitoring experience could follow it. After a few revisions
and a lot of bumps along the way, we got our first SOP ap-
proval letter from EPA. Now we could use this approved SOP
as a guide for each specific pollutant and take one step at a
time to get them all updated.

The QAPP was next. The QAPP is a document required to be
agency / project-specific, which covers not just what you do,
but why you do it. This is where our requests to EPA Re-
gion 4 for guidance through the writing of the document
reached a new level. EPA was also aware that many agencies
submitting QAPPs for approval were simply struggling with
the concept of what a QAPP should be. So, the idea of
developing a "QAPP guide" as a tool for agencies gained
momentum. EPA Region 4 asked if I would be interest-
ed in helping in the development of a new air monitoring
QAPP guide by designing our agency's QAPP to be a model to

use in tandem with the guide. If it could help others improve
their own agency's data defensibility then yes, of course, I
would be interested.

Over the next few months, EPA and I worked over the
phone 1-2 times a week and through email to develop, cri-
tique, fine-tune, and finalize our QAPP and, at the same time,
the QAPP guide. So many of the ideas and so much of the
focus put into our QAPP started by just asking: what do we
do, and why do we do it? How does what we do fit into the
requirements for the QAPP? How does the QAPP reflect what we do? From site
descriptions to assessment types, to documents and records
management, each section states what we do — and not only
do we use it as a guide for staff, it serves as a resource for
each of us to use when questions arise. The more I worked
on the QAPP the more I realized how great a tool this could
be, not just for my agency, but for the entire region as well. I
have witnessed how many new faces are attending the EPA
regional annual workshops in Region 4 and knew this
guidance tool would be greatly appreciated as agencies
experience knowledge turnover. It will also keep all processes
more consistent and transparent at all levels.

The QA documentation developed by agencies is a tool for
staff on so many levels. The new hire can use it as a resource
for getting up to speed and improving their overall concept of
the monitoring program's goal. Operator level staff can use
it as a resource to make sure all procedures that they follow
are backed up by fully adopted and managed documents. QA
staff can use it as a true guidance and authority document to
cite any practices that may be deemed harmful to the data's
accuracy and defensibility. Management can also depend on
the documents to be legally binding and present a clear pro-
cess, objective, and basis for the monitoring program. Any-
one can request and inspect the QA documents and get a
clear understanding of how your monitoring program collects
data. This could be the EPA, interested citizens, media, re-
searchers, political groups, etc. No matter who reviews
them, the documents should match the actual procedures in
practice by your program and produce a high level of confi-
dence in the data collected.

And there it is... the beautiful view from a mountaintop that
you never thought you'd reach. All goals are reached by
taking that first step, and then the next, and while EPA was
pushing us to update everything and head up that strenuous
trail, they did not push us to move and then disappear. They
walked right alongside and helped answer questions along the
way to make the journey easier. Here at the Forsyth County
Environmental Assistance and Protection Office, we appreci-
ate all the hours on the phone and questions answered over
the last 5 years and would encourage everyone to extend a
hand or ask for a hand as we continue on this journey.

- Jason Bodenhamer, Forsyth County Environmental Assistance
and Protection Office



-------
ISSUE 23

PAGE 4

Fair Warning #1: 1-Point QC Check Concentration Ranges

40 CFR Part 58 Appendix A requires
the following ranges for the 1-point
QC checks for the continuous gase-
ous monitors:

•	0.005 to 0.08 parts per million
(ppm) for SO2, NO2, and O3 monitors, and

•	0.5 to 5 ppm for CO monitors.

A 1-point QC check that has the as-
sessment value (not monitor value)
concentration outside these ranges
can be reported, but it will not be used
in regulatory precision and bias statis-
tics and will also not be used to as-
sess whether checks were conducted
within the CFR-required frequency,
meaning the data will not show up on
AMP reports. An assessment value
that is within the range with the mon-
itor value outside the range will be
used (since one can't determine in
advance what the monitor will measure).

To assess in AQS whether checks
were conducted within the required
range, the assessment value, after it
has been converted to the standard
units for the parameter, is rounded
to the following number of digits
after the decimal:

•	CO: standard units ppm, rounded to
1 digit after the decimal

•	SO2: standard units ppb, rounded to
0 digits after the decimal

•	NO2: standard units ppb, rounded to
0 digits after the decimal

•	O3: standard units ppm, rounded to
3 digits after the decimal

For example: for SO2, the value 0.5 ppb
will be rounded to 1 ppb (which is below
the allowable range of 5 to 80 ppb), and
the value 0.4999 ppb will be rounded to
0 ppb.

Another way of looking at it is:

•	SO2 and NO2: 4.5 ppb to 80.4999
ppb is acceptable

•	O3: 0.0045 ppm to 0.08049 ppm is
acceptable

•	CO: 0.45 ppm to 5.4999 ppm is ac-
ceptable

This rounding is only used to determine
if the assessment value is within the cor-
rect range. For the statistical assess-
ments, the values reported in the QA
transaction will be used. For example, if
an assessment value for SO2 was 4.55
ppb and the monitor value was 4.42 ppb,
the assessment value will round to 5 ppb
and will be within the range for statistical
assessment. However, when the statisti-
cal assessment is performed, AQS will
use the 4.55 ppb assessment concentra-
tion and 4.42 ppb monitor concentra-
tion in the precision and bias calcula-
tion. As discussed in the QA Hand-
book (see section 14.3.1), AQS has
been revised to allow monitoring or-
ganizations to report data with up to 30
digits to the right of the decimal, and
it is suggested that monitoring organi-
zations take advantage of reporting to
more decimal places for both routine
and QC data.
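To make the two-step logic concrete, here is a minimal sketch in Python of the range screen described above. It is an illustration based on this article, not AQS code; the function names are hypothetical, and half-up rounding is assumed based on the SO2 example above (0.5 ppb rounds to 1 ppb).

from decimal import Decimal, ROUND_HALF_UP

# (digits after decimal, low, high) in the standard units for each parameter
RULES = {
    "CO":  (1, 0.5,   5.0),    # ppm
    "SO2": (0, 5.0,   80.0),   # ppb
    "NO2": (0, 5.0,   80.0),   # ppb
    "O3":  (3, 0.005, 0.080),  # ppm
}

def round_half_up(value, digits):
    """Round half-up (0.5 ppb -> 1 ppb), matching the SO2 example above."""
    quantum = Decimal(1).scaleb(-digits)
    return float(Decimal(str(value)).quantize(quantum, rounding=ROUND_HALF_UP))

def qc_check_in_range(parameter, assessment_value):
    """True if the rounded assessment value falls within the CFR range.
    The unrounded value would still be used in precision/bias statistics."""
    digits, low, high = RULES[parameter]
    return low <= round_half_up(assessment_value, digits) <= high

print(qc_check_in_range("SO2", 4.55))    # True: rounds to 5 ppb
print(qc_check_in_range("SO2", 4.4999))  # False: rounds to 4 ppb
print(qc_check_in_range("O3", 0.08049))  # True: rounds to 0.080 ppm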

When assessment value data are
reported to AQS outside of the range,
a warning will be provided to the
monitoring organization. As men-
tioned above, the data will be accept-
ed in AQS, but they will not be used in
any assessment statistics. The Nation-
al Air Data Group initially implement-
ed this change around July 2018.
OAQPS got a few complaints, and we
decided to wait until calendar
year 2019 for full implementation. It is
important to note that beginning on
Jan. 1, 2019, implementation could
affect not only regulatory precision
and bias statistics but also p-check
completeness, resulting in the
AMP600 recommending that certain
monitors not pass "certification eval-
uation" criteria.


Fair Warning #2: QAPP Evaluation is Changing in the AMP600 Report

In order to address a finding in
the last Inspector General audit
(see QA EYE issue 22), OAQPS
is revising the data certification
and concurrence report
(AMP600) to flag data with an
"N" when a PQAO's QAPP is
over five years old. Quality
Assurance regulations require
QAPPs to be revised on a 5-year
cycle, and starting with the May
1, 2019 data certification for the
2018 data, OAQPS will be institut-
ing this revision to the AMP600 re-
port.

Last year, Attachment 1 of the guid-
ance document "Ambient Air Moni-
toring Data Certification Q&A for
CY2017" posted on AMTIC alerted
the PQAOs to this revision as fol-
lows:

NOTE: For the 2018 data certifi-
cation process (due date May 1,
2019), any sites for PQAOs whose
QAPP approval date is greater than
5 years old will be given a Red "N"
flag. The tables below will be re-
vised for the 2018 Guidance Docu-
ment.
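As a minimal sketch of the evaluation being added, the check behind the "N" flag amounts to a simple date comparison. The Python fragment below is illustrative only (the function name is hypothetical), and it assumes the comparison is made against the certification due date:

from datetime import date

def qapp_flag(qapp_approval: date, certification: date) -> str:
    """Return "N" when the QAPP approval date is more than 5 years old."""
    # Naive year arithmetic; a Feb. 29 approval date would need special care.
    five_years_later = qapp_approval.replace(year=qapp_approval.year + 5)
    return "N" if certification > five_years_later else "Y"

print(qapp_flag(date(2013, 3, 15), date(2019, 5, 1)))  # "N": over 5 years old
print(qapp_flag(date(2015, 3, 15), date(2019, 5, 1)))  # "Y"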


-------
First Results of CSN Mega-Performance Evaluation

Figure 2. PE Sampling system consisting of 32 PM2.5 cyclones on
four sampling manifolds and one dedicated pump
(in fancy pump-box in the foreground)

Each laboratory received the following set of PE speciation
samples:

•	Anion and cation analysis by ion chromatography (IC)
	- Five Nylon® filter samples (all labs)
	- Six Teflon® filter samples (one lab)

•	Carbon by thermal optical analysis (TOA)
	- Five quartz filter samples
	- Four quartz filter samples (one lab)

•	Elemental analysis by X-ray fluorescence (XRF)
	- Five 47 mm Teflon® filter samples

OAQPS does not have its own laboratories and was unable to
successfully qualify external referee labs; therefore, it was not
possible to obtain reference values for the PE samples. Since
the lab results could not be evaluated against an assigned value
(a referee lab result), OAQPS evaluated each result against the
results of the other laboratories participating in the study
(an interlaboratory comparison). To analyze the data, results that
were reported as either "ND" or "<DL" were converted to zero
for the statistical analysis.
-------
ISSUE 23

CSN Mega PE Results (Continued from page 5)

PAGE 6

For this study, when |z| ≤ 2 the analytical result is satisfactory (95% of
the z-scores are expected to fall in this range for normally distributed
data); when 2 < |z| < 3 the analytical result is considered questionable
(it should be investigated by the laboratory); and when |z| ≥ 3 the analytical
result is unsatisfactory. Happily, none of the results had a z-score
greater than 3, and only two results were greater than 2. See the tables
below for a summary of all results. Note that these data may change
when the remaining results come in from one laboratory.
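For readers who want to reproduce the screen, here is a minimal sketch of an interlaboratory z-score check in Python. It assumes z is computed against the mean and standard deviation of the participating labs' results for each analyte; the study's exact convention (e.g., a robust estimator) may differ, and the lab results shown are hypothetical.

from statistics import mean, stdev

def z_scores(results):
    """Map each lab to its z-score for one analyte, relative to the group."""
    m, s = mean(results.values()), stdev(results.values())
    return {lab: (x - m) / s for lab, x in results.items()}

def rating(z):
    """Apply the satisfactory/questionable/unsatisfactory bands used above."""
    a = abs(z)
    if a <= 2:
        return "satisfactory"
    return "questionable" if a < 3 else "unsatisfactory"

# Hypothetical sulfur results (ug/filter) for one PE event
results = {"DRI": 10.2, "ODEQ": 9.6, "SCAQMD": 10.0, "UCD": 9.9}
for lab, z in sorted(z_scores(results).items()):
    print(f"{lab}: z = {z:+.2f} ({rating(z)})")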

For the elemental analysis by XRF, results for the top ten CSN-wide
average elemental concentrations from June 2016 through May 2018
were included in the interlaboratory comparison. In descending order
by average concentration, these elements, shown in Figure 4 below,
are sulfur (S), silicon (Si), iron (Fe), potassium (K), sodium (Na), calcium
(Ca), aluminum (Al), chlorine (Cl), magnesium (Mg), and zinc (Zn).

Figure 4. Top-ten CSN-wide average elemental concentrations, June 2016 through May 2018

As shown in Table 1, below, all z-scores for the selected elements by
XRF were below 2.

Table 1. Selected Elements by XRF: Interlaboratory z-score Results

Lab      Event  S     Si    Fe    K     Na    Ca    Al    Cl    Mg    Zn
DRI      1      0.82  1.14  0.85  0.50  1.48  0.68  0.70  1.19  0.77  1.26
ODEQ     1      1.43  0.52  1.44  1.14  0.70  1.44  1.46  0.19  0.77  1.19
SCAQMD   1      0.10  1.02  0.37  1.06  0.48  0.67  0.59  1.23  1.33  0.04
UCD      1      0.51  0.63  0.22  0.58  0.30  0.08  0.16  0.15  0.22  0.04
DRI      2      0.71  1.28  0.39  0.17  1.50  0.30  0.81  1.12  0.85  0.62
ODEQ     2      1.47  0.31  1.47  1.44  0.59  1.49  1.23  0.97  0.85  1.49
SCAQMD   2      0.22  0.91  0.77  0.85  0.46  0.63  0.81  0.70  1.11  0.43
UCD      2      0.53  0.67  0.31  0.41  0.45  0.55  0.39  0.55  0.58  0.45
DRI      3      0.90  0.78  0.45  0.58  1.50  0.39  1.50  0.61  0.87  0.77
ODEQ     3      1.40  0.85  1.49  1.49  0.53  1.49  0.55  1.50  0.87  1.47
SCAQMD   3      0.01  1.23  0.65  0.32  0.47  0.61  0.52  0.44  0.87  0.35
UCD      3      0.49  0.40  0.40  0.59  0.50  0.49  0.43  0.45  0.86  0.34
DRI      B1     0.00  1.22  0.50  0.45  1.50  1.12  0.06  0.58  0.02  0.87
ODEQ     B1     0.00  0.40  0.50  0.52  0.52  0.84  0.69  0.58  0.81  0.87
SCAQMD   B1     0.00  0.90  0.50  0.52  0.52  0.84  0.69  1.49  1.40  0.88
UCD      B1     0.00  0.73  1.50  1.50  0.46  0.57  1.43  0.33  0.61  0.85
DRI      B2     0.00  1.12  0.43  0.39  0.50  1.43  0.50  0.62  0.82  1.37
ODEQ     B2     0.00  0.57  0.54  0.55  0.50  0.88  0.50  0.62  0.82  0.74
SCAQMD   B2     0.00  0.83  0.54  0.55  0.50  0.42  0.50  1.48  1.21  0.74
UCD      B2     0.00  0.86  1.50  1.50  1.50  0.13  1.50  0.24  0.43  0.12

For cation and anion analysis by IC, filters were extracted and analyzed
for the cations sodium (Na+), ammonium (NH4+), and potassium (K+),
and the anions chloride (Cl-), nitrate (NO3-), and sulfate (SO42-). Note
that Cl- was added to the analyte list several years ago because there was an
interest in quantifying the impact of sea spray on PM2.5. Desert Re-
search Institute (DRI) has additional samples because they perform
these analyses on both nylon and Teflon filters, so both filter types
were included with their PE samples. Additionally, the DRI B2
Teflon PE sample was found to have two filters adhered togeth-
er. DRI extracted these samples separately and reported both
results, which were included in the data analysis. As shown in
Table 2, below, most z-scores were below 2, with only one out-
lier at 2.04.

Table 2. Cations and Anions by IC: Interlaboratory z-score Results

Lab      Event  Na    NH4+  K     Cl-   NO3-  SO42-
DRI      1      0.86  1.45  0.86  0.70  0.25  0.38
DRI      1      1.16  0.13  1.21  0.95  1.76  1.49
ODEQ     1      0.63  1.36  0.30  NA    0.60  0.97
RTI      1      0.17  0.02  0.60  0.48  0.66  0.08
SCAQMD   1      1.22  0.20  1.16  1.17  0.25  0.83
DRI      2      0.25  0.45  0.87  0.37  0.16  0.89
DRI      2      1.63  1.79  1.18  1.49  1.73  1.38
ODEQ     2      0.22  0.39  0.45  NA    0.79  0.23
RTI      2      0.05  0.45  0.39  0.62  0.56  0.34
SCAQMD   2      1.11  0.50  1.22  0.50  0.21  1.06
DRI      3      0.36  0.38  0.72  0.03  0.04  0.01
DRI      3      1.73  1.41  1.29  1.34  1.73  1.41
ODEQ     3      0.04  1.31  0.19  NA    0.71  0.73
RTI      3      0.56  0.11  0.66  1.06  0.63  0.46
SCAQMD   3      0.76  0.38  1.16  0.25  0.35  1.12
DRI      B1     0.80  0.45  0.99  0.71  0.04  1.03
DRI      B1     0.43  0.45  1.20  0.98  0.07  0.73
ODEQ     B1     0.90  0.45  0.73  NA    0.85  0.73
RTI      B1     0.86  0.45  0.73  0.59  0.85  0.73
SCAQMD   B1     1.27  1.79  0.73  1.09  1.60  1.16
DRI      B2     0.05  0.00  0.40  0.28  0.19  0.41
DRI      B2     0.84  0.00  1.50  0.50  0.66  0.41
DRI      B2     1.56  0.00  0.65  0.50  0.56  0.41
ODEQ     B2     0.78  0.00  0.85  NA    0.96  0.41
RTI      B2     0.78  0.00  0.85  0.50  1.68  0.41
SCAQMD   B2     0.78  0.00  0.85  1.78  0.70  2.04

NA: This laboratory does not routinely report Cl-.

Italicized results were reported as ND or <DL and converted
to zero for statistical analysis.

For the carbon analy-
sis, results for organic
carbon (OC), elemental
carbon (EC), and total
carbon (TC) were ana-
lyzed and compared
across laboratories. Note
that one lab was not pro-
vided a filter for Event 1,
so it was not included in
that comparison. As
shown in Table 3, below,
all z-scores were below 2.

These preliminary interla-
boratory comparison
results will be updated
when the remaining labor-
atory submits their analyt-
ical results for this study.
- Jenia McBrian

Table 3. Carbon by TOA: Interlaboratory z-score Results

Lab      Event  OC    EC    TC
DRI      1      0.93  0.81  0.91
SCAQMD   1      1.06  1.12  1.07
UCD      1      0.13  0.31  0.16
DRI      2      1.44  1.25  1.33
ORD      2      0.59  0.23  0.77
SCAQMD   2      0.75  0.37  0.77
UCD      2      0.10  1.11  0.21
DRI      3      1.26  1.49  0.98
ORD      3      1.18  0.37  1.38
SCAQMD   3      0.04  0.65  0.34
UCD      3      0.12  0.46  0.06
DRI      B1     0.81  1.44  0.80
ORD      B1     0.81  0.68  0.81
SCAQMD   B1     0.37  0.68  0.37
UCD      B1     1.25  0.09  1.25
DRI      B2     0.75  0.50  0.75
ORD      B2     0.26  0.50  0.26
SCAQMD   B2     1.47  0.50  1.47
UCD      B2     0.46  1.50  0.46


-------
PAGE 7

Identifying the PM2.5 Primary Monitor (Sampler) for Routine
Monitoring and Collocated For Regulatory QA Purposes

When it comes to QA Collocation requirements there is still a lot of confusion out there. Some of the confusion stems from the
term "collocation" being used in the generic sense; meaning that sites exist where there is a primary PM2.5 monitor and there are
other PM2.5 monitors "collocated" at the site for purposes other than meeting the "QA Collocation" requirements. In this arti-
cle "QA collocation" refers to the NAAQS primary/QA collocated monitors that are paired to meet the 40 CFR Part 58 Appen-
dix A Section 3.2.3 Collocated Quality Control Sampling Procedures for PM2.5 QA requirements.

The collocated monitor must be paired with the NAAQS primary monitor. The AMP 256 reports and the AMP600 report will
not recognize any collocation where the QA collocated monitor is not paired with the NAAQS primary. They will also not report
collocation where the method codes are not appropriately paired as required in the CFR.

Many PM2.5 sites have more than one PM2.5 sampler/monitor at a site. When there is more than one monitor, it may be for two
reasons:

1.	additional monitors to cover additional days, or to have a daily continuous monitor for AQI purposes while also having an
intermittent primary monitor at the site; or

2.	to achieve official regulatory QA collocation as described in 40 CFR part 58 Appendix A Section 3.2.3.

In either of the two scenarios, a primary monitor needs to be designated in AQS. It must also be the monitor that is listed as the
primary monitor in the annual network plan as described below from 40 CFR 50, Appendix N, 1.0(c) (definitions):

Primary monitors- are suitable monitors designated by a state or local agency in their annual network plan (and in
AQS) as the default data source for creating a combined site record for purposes of NAAQS comparisons. If there
is only one suitable monitor at a particular site location, then it is presumed to be a primary monitor.

By default, AQS will designate the first PM2.5 monitor created at a site as the primary monitor for NAAQS comparisons. By
default, any other monitor created in AQS for the site will not be the NAAQS primary monitor.

This can lead to problems if a newer monitor is listed in the annual network plan as the primary, but the monitoring organization
has not identified it as the primary monitor in AQS. The monitoring organization always has the option of setting the primary in
AQS on the Maintain Site Form and EPA suggests that reviewing the primary monitor designation in AQS on the Maintain Site
Form is the best practice to avoid misidentifying the primary monitor.

Checking the Primary Monitor in AQS

The following procedure is a way to check what AQS has currently defined as the primary monitor.

1. At the main menu, select "Maintain" and then select "Site". This will bring you to a blank site form where you can enter the
State, County, and Site ID codes (Fig 1). Hit "Execute the query" for that site. It will provide the information shown in
Figure 2. Continued on Page 8

Figure 1. Partial view of the Maintain Site form



-------
ISSUE 23

PAGE 8

Identifying the PM2.5 Primary Monitor for Routine Monitoring and Collocated for Regulatory QA Purposes
(continued from page 7)

2. Click on "Primary Monitor Period" (see Fig 2)

Click on Primary Monitor Period

^Maintain Site (EPA Headquarters)	^	-!~! xl

Baste Ste Data | Add tonal S«e Data ) Agency Roles | Tangent Roads | Open Paths | Comments | Pnmary Monlor Penods |

Site Identification

State Code |5Jj 	| |North Carohva

County Code [oil 	| |Buncombe

User Coordinates 			

Horizontal Datum jwGSW	Latitude 135 6062

UTM Zone [	UTM Easting

an

Lonsjttude] -82 58"
UTM Northing I

Standard Coordinates: Datum In ADS 2

Horizontal Method 1103		| |GPS

Horizontal Accuracy U n.

(Meters) ,JU*

Latitude |

Source Map Scale (Hon-GPS) 124000

cal Accurac

(Meters)

I66'" 		

Vertical Method foOO		| |UHKKOWN

Street Address '75 BWGHAM ROAO

Land Use Type [COMMERCIAL		| Location Setting |SUBURBAN

C*y Code 102140		1 |Ashev«e

Urtoan Area Code |o480 	| |ASHEVILE, NC

A OCR Code fiTi		| | WESTERN MOUNTAM

Vertical Datum

"_l

Time Zone Name

Owning Agency |o779	|i]jNoffh Carotna yVesterr^cgonaUtir PcBution ControlAgency

X

Parameter Code

POC

POC 1 is 1
Primary Monitor

Begin Date End Date

I&8101

h

-

|l9990101 S

I









I

1

I



I

I

! [37 [oil |0034

1)	Go to the main menu and select "Maintain" and select
"Monitor". The Maintain Monitor Form (Fig. 4) will appear

2)	Enter the state/county/site ID/Parameter Code (see Fig. 4) and
click on the "execute query" icon.

Action Help Session Admin Audit Retrieval Maintain Critical Rev CErtification Batch COrrect Main Menu

Bali a.j ias I « t < ~ i >~ ± x a p % ?

£2 Maintain - Monitors (Epa Headquarters)

Figure 2. Maintain site record

3. For this site (see Fig 3), the primary monitor is the POC I monitor
since there is no end date for this monitor

PA Headquarters)	-iDl *1

Jditional Site Data Agency Roles | Tangent Roads Open Paths Comments Primary Monitor Periods

State	County	Parameter

Code	Code	Site Id Code

|5T~ [oil I0034 -1 J88101| '\

Project Class
Meas Scale
Probe Location
Probe Vert Dist
Samp Res Time
Close Date



Dominant Source	!

Open Path Num	r_L

Probe Height	|

Surrogate hd	| _-J

Last Samp Date	|

Monitoring Agency (Owner)	j

"J

Probe Hor Dist I
Unrest Air Flow |
Last Post Date I

Screening Group i

£

Figure 3 Primary monitor table

NOTE: If you wanted to change the current primary NAAQS monitor
(in this case POC-1) to another monitor you would enter an end date
for the POC-1 monitor and then start a new line with the parameter
code, POC and begin date for the new primary monitor

Steps to Determine or Identify the QA Collocated Monitor
That Will Be Paired with the Primary Monitor

The following steps will ensure that the QA collocated monitor is paired
with the correct primary monitor at the site. In this scenario, the site
has 4 PM2.S monitors (POCS- 1,2 3 and 7) and as discovered above,
the POC-1 is the NAAQS primary monitor. The POC-2 monitor has
been identified as the collocated monitor to achieve the 40 CFR part 58
Appendix A Section 3.2.3 collocation.

Figure4 Maintain monitorform

3) This will retrieve all of the PM2.5 monitors at the site. Use the
scroll icons (see Fig 5) to bring the primary monitor up on the
form (POC-1 as determined in earlier section) . Then click on
the "QA Collocation" button.

Scroll to review

Action Help Session Admin Audit geirieval Maintain Critical Ray. ossification Batch CQrrect yam Menu

H t.	j % a 1 S « » (<	i i r ¦'

i - Monitors (Epa Headquarters)

Project Class

Meas Seal#	'NEIGHBORHOOD

Probe Location	TOP OF BUILDING

Probe Vert Oct :	

Samp Res Time
Close Date

mIidjs}

Monitor Basic

Screening Group 543

Figure 5 Maintain monitorform

4) The Monitor box in the upper right of the Form ( red box in
Fig 6) identifies the monitor that you are currently reviewing
(POC-1 in this case as shown in Fig 5). The monitor ID high-
lighted in blue is the current designated primary monitor. The
Field "Primary Sampler" which in this case is designated as "Y"
indicates that the POC-1 is the primary sampler (monitor).
Continued on page 9

Action Help Session Admin Audit Retrieval Maintain Critical Rev Certification Batch CQrrect Main Menu

Bl B 
-------
ISSUE 23

Identifying the PM2.5 Primary Monitor... (continued from page 8)

PAGE 9

5)	Step 3 is repeated for the next PM2.5 monitor (POC-2) at the site. Figure 7 indicates that the POC-2 monitor is the QA collo-
cated monitor, since it is associated with the primary monitor, and it also indicates that it is not the primary sampler (monitor)
(Primary Sampler = "N"). The QA collocated monitor cannot be the primary, so the "N" is appropriate.

6)	The POC-3 and POC-7 monitors are then selected (see step 3), and because no monitor ID is associated with these monitors
(the rows are blank), they are not the QA collocated monitor.

Figure 7. Maintain Monitor form for the designated QA collocated monitor (the monitor ID shows that POC-2 is the QA
collocated monitor, and Primary Sampler = "N" shows it is not the primary)

Figure 8. Maintain Monitor forms for POC-3 and POC-7

Additional Information About the System

AQS has specific procedures that must be followed to change
primary monitors/samplers or to change the QA collocated moni-
tor/sampler.

To change a primary monitor at a non-QA collocated
site:

1.	Go to the Maintain Site form (Fig 1)

2.	Enter the Site ID and click on "Primary Monitor Peri-
ods" (see Figs 2 and 3)

3.	Enter an end date for the current primary (POC-1)

4.	Enter the new primary monitor POC (e.g., POC-3) and a
begin date

5.	Save the edit

To change a primary or QA collocated monitor at an
official QA collocated site:

If the primary (currently POC-1) changes (e.g., to POC-3) but the
POC-2 QA collocated monitor stays the same, you first need to
discontinue the QA collocated monitor from the current primary:

1.	Go into Maintain Monitor, enter the QA collocated monitor
POC (see Fig 3), and click QA Collocation

2.	At the QA Collocation table (Fig 4), enter an end date for
this collocation POC

3.	Save the edit

Now change the primary:

1.	Go to the Maintain Site form

2.	Enter the Site ID and click on "Primary Monitor Peri-
ods" (see Fig 2)

3.	Enter an end date for the current primary (POC-1)

4.	Save the edit

5.	Enter the new primary monitor POC (e.g., POC-3) and a begin date

6.	Save the edit

Now go back in and identify the collocated monitor
for the primary:

1.	Go into Maintain Monitor (see Fig 3), enter the POC
that you want to identify as the QA collocated monitor
(POC-2), and click QA Collocation

2.	At the QA Collocation table (Fig 4), enter a begin date and
the distance from the primary monitor

3.	Save the edit

-------
ISSUE 23

PAGE 10

Update on Things in the Pipeline

There are a number of projects in
progress that we plan to complete
this year. The following are updates
on their status:

National QAPP Level Activity

A Lean Kaizen event occurred in DC with
a number of the EPA Regional QA Manag-
ers and Headquarters QA Managers. The
E-Enterprise Leadership Council (EELC) elevated
the need to streamline and modernize the
process for reviewing state and tribal
QAPPs as an FY18 priority area. EPA held
listening sessions with tribes and states to
determine areas for improvement, and the
EELC supported a Lean Kaizen event fo-
cused on:

•	ensuring EPA's timely and consistent
review and approval of QAPPs;

•	streamlining QAPP requirements and
simplifying guidance; and

•	increasing the number of QAPPs that
are complete and accurate when
submitted.

The Lean event was held in DC September
11-13, 2018. The event produced an im-
plementation plan, and workgroups were
formed to work on the action items identi-
fied in the plan.

Ozone Transfer Standard Guidance

This was discussed in QA EYE issue 22. An
EPA workgroup, including representatives
from the CA Air Resources Board, is making
progress on this document. It is expected
that an initial draft will be completed by
spring 2019 and will then go out for review
to the monitoring community. Scott Hamil-
ton from EPA Region 5 and Greg Noah from
OAQPS are coordinating this effort.

Flow Transfer Standard Guidance Doc-
ument

Similar to the ozone transfer standard docu-
ment, we are trying to develop guidance for
flow rate transfer standards. This was also
discussed in Issue 22. We hope to have a
draft of this document in mid-2019.

Low Concentration Acceptance Crite-
ria for PM2.5 Precision and Bias

We have been evaluating the PM2.5 collocat-
ed QA data as well as the performance eval-
uation data, and we think we may be able to
develop a technical memo that would allow a
1 µg/m3 difference acceptance criterion to be
used in data verifications. See more detail in
the article on page 12.

R-Shiny QC Tool

Sonoma Technology Incorporated has been
working on an automated report to evaluate
the 1-point QC checks in AQS. We report-
ed on this in QA EYE Issue 22 (page 1).

The report has been delayed due to
revisions associated with moving AQS data
to the Datamart. We hope to have another
version of the report ready for review in
early 2019.

AQS Questions and Answers

These questions relate to the technical
memo posted on AMTIC on 1/30/2018
called "Steps to Qualify or Validate Data after
an Exceedance of Critical Criteria Checks".

Q: We have had a question relating to re-
porting and verifying the "1C" code in AQS
for an invalid QC check. Should the code be
reported in the QC null code field or in
place of the assessment and monitor concen-
tration values?

A: The "1C" code should be reported in the
QC null code field. AQS will nullify any val-
ues provided for monitor and assessment
values, so the user does not need to manual-
ly remove these values.

Q: Will other QC checks (e.g., flow rate veri-
fications) that exceed critical criteria be
expected to follow the 1/30/2018 technical
guidance?

A: Yes. EPA's focus has been on the gaseous
pollutants to address the Inspector General's
findings, but the other QC checks considered
critical criteria will be programmed in
2019.

PEP and NPAP Training Update

On September 18th through 21st, OAQPS
conducted a series of training webinars for
all PEP/NPAP auditors and their EPA region-
al leads. These webinars included refresher
training for auditors, discussions of issues
encountered over the past year, analyses of
audit data, and a summary of expectations
for next year. The final day of the event was
a discussion with the EPA PEP/NPAP nation-
al leads to address concerns from the year
and to propose solutions going forward.
Typically, this training event is an interactive
face-to-face event; however, we opted for a
webinar this year to save resources. In
spring 2019, we will implement new soft-
ware and a new audit process for the PM2.5
and Pb PEP. This project was intended to be
completed this past summer, but several
delays forced the timeline for completion
into 2019. So, we decided to hold webinars
for training this year and focus resources on
training for the new process next year. The
spring 2019 training will involve hands-on
work with the new software and intensive instruc-
tion guiding the auditors through the new
process. The new PEP process involves a
field data collection and transfer application
for the tablet computers that integrates
with AQS. New weighing laboratory soft-
ware will also be installed, which integrates
with AQS. Behind the scenes, AQS will
combine the two data streams and calculate
the final concentration. Finally, the data will
be available for the regional PEP leads to
validate before it moves into AQS. The
hope is that this new process will be much
more efficient and will standardize the PEP.
Much of the new audit process will look
familiar on the surface but will be vastly
different behind the scenes.

If you are a PEP auditor or a regional PEP
coordinator, expect information about
another training in spring of next year. No
location has been set yet, but we will have
details in the next few months. We're
looking forward to seeing everyone soon.
- Greg Noah


-------
PAGE 11

Xi Chen Joins Ambient Air Monitoring Group as NATTS Lead

With the retirement of Dave Shelow in September, the Ambient
Air Monitoring Group has hired Xi Chen, who also goes by
Doris, to lead the National Air Toxics Trends Program.

Doris received a B.E. in Civil and Environmental Engineering
from the Harbin Institute of Technology, an M.S. in Atmospheric
Environmental Science from the Hong Kong University of Science
and Technology, and a Ph.D. in Environmental Science and
Engineering from Clarkson University. She spent 3 years at
Colorado State University working as a postdoc on characterizing
and evaluating atmospheric ammonia/ammonium monitoring
methods and applications.

She moved to RTP, NC, and started working as an Oak Ridge
Institute for Science and Education (ORISE) postdoc in the Office
of Research and Development's National Risk Management
Research Laboratory's Air and Energy Management Division
(AEMD) in March of 2014. Since working with AEMD, she
transitioned to a Federal Postdoc position. While at AEMD she
primarily worked on atmospheric reactive nitrogen chemistry and
deposition, as well as improvement of methods for organic
nitrogen speciation in PM.

With the move to the Ambient Air Monitoring Group, Doris will
be responsible for managing the National Air Toxics Trends
Monitoring Network, continuing with ongoing methods development
work and special field studies, and communications with the state,
local, and tribal monitoring agencies. Welcome aboard, Doris!

Jenia McBrian on Detail as OAQPS QA Manager

In October, Jenia McBrian began a six-month developmental
opportunity with the OAQPS Central Operations and Resources
Office (CORE) as the OAQPS QA Manager. Jenia is super
excited about this opportunity, which arose following the
retirement of Joe Elkins, who had been with OAQPS since 1991.
Jenia intends to maintain the established esteem Joe brought to
the Office as an Agency leader in Quality Assurance.

In addition to learning about the work of CORE in supporting the
mission of OAQPS, as the OAQPS QAM, Jenia will be working
with the cross-office QA Team to:

•	Update the OAQPS Quality Management Plan (QMP);

•	Develop the OAQPS Annual QA Report;

•	Prepare for the FY 2019 OEI OAQPS Quality Systems Audit
(QSA);

•	Provide QA training to staff;

•	Respond to inquiries from the Office of the Inspector
General (OIG); and

•	Work with OAQPS divisions in the development of relevant
QAPPs that support the QMP.

In addition to OAQPS QA activities, Jenia will be involved in
many Agency-wide QA initiatives to ensure OAQPS views are
represented, including:

•	CIO Policy 2105.0 and 2105-P-01-0 revisions

	- The revised policy will feed into revisions of QA/R-2,
QA/R-5, and other QA policy documents

•	Regional QAPP LEAN Workgroup

	- LEAN event focused on the efficient review of QAPPs
across all Regions

	- Brought about by complaints of inconsistent QAPP reviews

	- Goal is to have more commonality in QAPP preparation
and reviews across all Regions

•	Other Agency-wide initiatives as they arise

While Jenia is on the detail, the Mega PE and Gravimetric Round
Robin programs will be overseen by Nealson Watkins. QA
questions regarding the CSN can still be directed to Jenia.



-------
ISSUE 23

PAGE 12

Low Concentration Data Quality Objective for PM2.5

We have been evaluating the PM2.5 performance evaluation
data, and we are confident that we can support the use of
an absolute difference of ±1 µg/m3 as the acceptance cri-
terion to be used in data certifications for sample pairs at low
concentrations. Using this DQO, measured concentrations
down to 2 µg/m3 may be used in data quality assessments.
Because it would also need to be programmed into
AQS, it is likely that the criterion will be proposed in 2019 and
implemented in AQS by 2020.

There are two important facets of this transition. The first
is to establish that we can measure at low concentrations
with acceptable accuracy and precision. To verify the lower
concentration limit at which bias can be reliably measured,
we summarized data for field blanks collected in the PEP
over the past 10 years to characterize the programmatic
detection limit for PM2.5. Figure 1 plots annual averages (±
one standard deviation) of PM2.5 measurements in field
blanks. (These concentrations assume a total volume com-
mensurate with 24-hour sampling.) In August 2011 the PEP
lab discovered that a subtle deterioration in the PEP's cas-
sette cleaning compound had been contaminating our cas-
settes. This caused a gradual increase in the annual average of
measured field and travel blank mass over time. Upon re-
solving this issue, PEP field blank measurements dropped
and have been stable, averaging 0.26 µg/m3. Two approach-
es can be used to characterize the method detection limit
(MDL):

•	Using the convention that an MDL corresponds to the
field blank average plus 3 standard deviations (orange
curve in Figure 1), the PEP's MDL since August 2011 is
0.77 µg/m3.

•	Using the recently promulgated MDL Method Update
Rule (MUR), the PEP's MDL (red curve in Figure 1)
averages 0.83 µg/m3.

An investigation of the national PM2.5 field blanks reveals
that the average and apparent MDL are consistent with the
PEP's, or even a little lower.
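As a minimal sketch (an assumed convention, not EPA's production code) of the "mean plus 3 standard deviations" calculation in the first bullet above, with hypothetical field-blank values:

from statistics import mean, stdev

def field_blank_mdl(blanks):
    """Detection limit as the field-blank mean plus 3 standard deviations."""
    return mean(blanks) + 3 * stdev(blanks)

# Hypothetical air-equivalent field-blank concentrations (ug/m3)
blanks = [0.21, 0.30, 0.18, 0.27, 0.33, 0.26]
print(f"MDL = {field_blank_mdl(blanks):.2f} ug/m3")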

We've seen that when bias is calculated exclusively using
percent difference, average bias across the network be-
comes more negative as concentrations approach 0
µg/m3, and the math forces the percent difference to dis-
proportionately larger values even though the absolute
difference is small. This is graphically represented in Figures
2 and 3, in which sample pairs collected from 2013 through
2017 are placed into one of ten bins by their PEP sample
concentration (1 to <2 µg/m3, 2 to <3 µg/m3, ..., 9 to <10
µg/m3, ≥10 µg/m3), and the distribution of percent differ-
ence values (Figure 2) and absolute difference values (Figure
3) in each bin is represented by a boxplot.

Ends of the boxes correspond to the 25th and 75th percen-
tiles, and the red line within each box equals the medi-
an. The "whiskers" emanating from the boxes extend to
the 10th and 90th percentiles. Blue circles represent aver-
ages (arithmetic means). Data include all PEP samples success-
fully paired with any SLT monitor (i.e., both the primary and any
collocated SLT monitor at a given site are compared against the
PEP) between 2013 and 2017. The grey shaded region illustrates
the bias DQO: ±10% for the percent difference plot and ±1 µg/m3
for the absolute difference plot. In the x-axis labelling, a parenthesis
"(" indicates the value is excluded from the bin, whereas a bracket
"[" indicates the value is included; for example, the label [1-2)
includes a value equal to 1 µg/m3 and excludes a value equal to
2 µg/m3.
cated SLT monitor at a given site are compared against the



Figure 1. Annual mean air-equivalent PM2.5 concentrations for field blanks (± 1
standard deviation), and lower programmatic detection limit calculations
(legend: mean ± SD in field blanks; mean + 3·SD; MDL per the MUR).

Figure 2. Percent difference vs. PEP concentration bin (µg/m3) for the
years 2013-2017.

Figure 3. Absolute difference (SLT - PEP) vs. PEP concentration bin (µg/m3)
for the years 2013-2017.

(Continued on page 13)


-------
ISSUE 23

PAGE 13

Low Concentration Data Quality Objective for PM2.5 (Continued from page 12)

Figure 4. Interquartile ranges (IQRs) of the percent difference and absolute
difference distributions by PEP concentration bin.

A closer look at the interquartile range
(IQR) of the boxplots presented in Fig-
ures 2 and 3 gives us a clue as to a logical
transition point between percent differ-
ence and absolute difference as the
metric for the DQO. For the percent
differences (red curves in Figure 4), we
pay special attention to how the IQRs
compare to 20%, which is the range
associated with the current bias DQO
(±10% of 0%; thus, the difference be-
tween the lower and upper values of
this range is 10% - (-10%) = 20%, and this
value is portrayed by the black refer-
ence line in Figure 4). When the IQR
exceeds 20%, some portion of the box
will always fall outside of the DQO
range, and thus fewer than 50% of
sample pairs will meet the DQO. Figure
4 shows that the size of the percent
difference IQR decreases as the PEP
sample concentration increases.

Similarly, if a component of the revised DQO
for bias states that the absolute difference in
PM2.5 concentration between paired SLT
and PEP samples must fall within ±1 µg/m3,
the size of the range covered by the DQO is
1 - (-1) = 2 µg/m3. Thus, the right vertical
axis in Figure 4 is positioned so that the
horizontal reference line is at 2 µg/
m3. The blue curves in Figure 4
show that the size of the absolute
difference IQR increases with in-
creasing concentration.

For PEP concentrations less than 5
µg/m3, both the mean and median
percent difference IQRs are always
above 20%. Thus, a cutoff concen-
tration threshold of 5 µg/m3 may be
appropriate in a revised bias DQO.
In turn, the revised DQO could state
that a PQAO use the percent differ-
ence calculation (±10%) when the
PEP sampler's concentration is ≥ 5
µg/m3, and an absolute difference
(±1 µg/m3) when the concentration
is < 5 µg/m3.
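To make the proposed split concrete, here is a minimal Python sketch of the hybrid check; it reflects this article's proposal as we read it (the function name is hypothetical), not a promulgated requirement.

def pair_meets_bias_goal(pep, slt):
    """Percent difference (+/-10%) at PEP >= 5 ug/m3;
    absolute difference (+/-1 ug/m3) below 5 ug/m3."""
    if pep >= 5.0:
        return abs(100.0 * (slt - pep) / pep) <= 10.0
    return abs(slt - pep) <= 1.0

# A 0.8 ug/m3 difference at 3 ug/m3 passes the absolute-difference test
# even though it is a 26.7% difference:
print(pair_meets_bias_goal(3.0, 3.8))    # True
print(pair_meets_bias_goal(20.0, 23.0))  # False: a 15% difference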

Graphics and some of the prose for this
article were supplied by Battelle Memorial
Institute under contract to EPA. For further
discussion, send questions and comments to
Dennis Crumpler: crumpler.dennis@epa.gov

QA BYE...A Fond Farewell From the Editor-in-Chief

It is with mixed emotions that I wrap up this issue of the QA
EYE as my last, since I will be retiring from federal service at
the end of December. Our first issue was September 2005, and
it's been quite a journey, not just as editor of the newsletter,
but as a 30-year career in federal service.

I started my career in 1981, right out of the University of
Maine, as a Forest Technician (pictured above) with the US
Forest Service Research Station, working for an "old school"
Ukrainian soil scientist. He always went into the field with a
suit and tie, so you know who did all the work. He was a great
mentor and I loved the work, spending many hours in the field
in upstate Maine. I did all his field work, laboratory analysis,
and data reduction/assessments. I learned a lot about quality
assurance and quality control by doing and by making mistakes.
In about 1985, EPA started up Acid Deposition Research in
New England and the Mid-Atlantic states, and they needed soil
scientists to characterize and sample soils. At the time, I was
working for the Soil Conservation Service (SCS), and EPA and
SCS cooperated on the work. That job eventually got me to
EPA, where I was asked to quality assure the data and
laboratory analysis of the soil samples from the program.
During that time, I also participated in a great program called
the Environmental Monitoring and Assessment Program
(EMAP), which kept my foot in the door with forestry work.

From there I went on to the Great Lakes National Program
Office (GLNPO) as QA Manager, which solidified my career in
QA. Much of my work in GLNPO had to do with water-related
projects.

In 1995, after my parents made the move to retire to the coast
of NC, I decided to look for work in RTP and found an opening
in the Ambient Air Monitoring Program. For better or worse
(you can decide), I took the job and the rest is history.

By the end of my career I had worked and performed QA in
almost every medium: from forests to soils to water and finally
air. At every step along the way I have met so many inspirational
people. I can't begin to thank you all without creating another
QA EYE just for the folks that I've worked with, argued with,
laughed with, golfed with, and hiked with. Ever notice at those
award shows, when someone wins, they always seem to forget
someone they meant to thank? With that, I do not want to
thank anyone but everyone that I've had the honor to meet
and work with over my career.

You have helped guide the Ambient Air Monitoring QA
Program to be one of the strongest programs in the country.
Through your workgroup participation and your questions and
comments on our guidance, you have kept our Air QA Program
up-to-date and maintained our data quality so that we can make
important environmental decisions with confidence.

Thanks again for your professionalism, expertise, and friendship
all these years. I'm back to the forests where I started, which
have always been my love. Happy Trails!

Mike Papp


-------

EPA-OAQPS
C304-02
RTP, NC 27711


E-mail: papp.michael@epa.gov

The Office of Air Quality Planning and Standards is
dedicated to developing a quality system to ensure that
the Nation's ambient air data is of appropriate quality
for informed decision making. We realize that it is only
through the efforts of our EPA partners and the moni-
toring organizations that this data quality goal will be
met. This newsletter is intended to provide up-to-date
communications on changes or improvements to our
quality system. Please pass a copy of this along to your
peers and e-mail us with any issues you'd like discussed.

Mike Papp

Key People and Websites

Since 1998, the OAQPS QA
Team has been working with the
Office of Radiation and Indoor Air
in Las Vegas, and ORD in Re-
search Triangle Park, in order to
accomplish OAQPS's QA mission.
The following personnel are listed
by the major programs they im-
plement. Since all are EPA em-
ployees, their e-mail address is:
lastname.firstname@epa.gov.

The EPA Regions are the prima-
ry contacts for the monitoring
organizations and should always
be informed of QA issues.

Program                                         Person            Affiliation
CSN/IMPROVE Lab PE and PM2.5 Round Robin        Nealson Watkins   OAQPS
Tribal Air Monitoring                           Emilio Braganza   ORIA-LV
CSN/IMPROVE Network QA Lead                     Jenia McBrian     OAQPS
OAQPS QA Manager (On Detail)                    Jenia McBrian     OAQPS
Standard Reference Photometer Lead              Scott Moore       ORD-APPCD
National Air Toxics Trend Sites QA Lead         Greg Noah         OAQPS
Criteria Pollutant QA Lead                      Mike Papp         OAQPS
NPAP Lead                                       Greg Noah         OAQPS
PM2.5 PEP Lead                                  Dennis Crumpler   OAQPS
Pb PEP Lead                                     Greg Noah         OAQPS
Ambient Air Protocol Gas Verification Program   Solomon Ricks     OAQPS

Websites

Website             URL                                          Description
EPA Quality Staff   EPA Quality System                           Overall EPA QA policy and guidance
AMTIC               http://www3.epa.gov/ttn/amtic/               Ambient air monitoring and QA
AMTIC QA Page       http://www3.epa.gov/ttn/amtic/quality.html   Direct access to QA programs

-------