-------
ISSUE 7, PAGE 3
Pb Measurement Quality Objectives
Data quality indicators are quantitative statistics and qualitative descriptors that are used to interpret the degree of acceptability or utility of
data to the user. The principal data quality indicators are precision, bias, completeness, comparability, representativeness, and detectability.
A measurement quality objective is a goal set by EPA guidance that represents a reasonable expectation of what one should be able to
achieve for a specific data quality indicator in order to maintain acceptable levels of uncertainty.
As part of the DQO process described in the page 1 article, EPA reviewed precision data from various sources, including routine Pb data from the SLAMS, National Air Toxics Trends Sites, and Chemical Speciation Network sites; these Pb data were collected by various sampling and analytical methods. Table 1 below provides a comparison of these data. The data represent eight precision assessments, separated by sampling method or analysis method. As with our other particulate-based criteria pollutants, EPA identifies a "cutoff" concentration value, and precision and bias estimates are made only for data with values at or above this cutoff. At low concentrations, agreement between collocated measurements, expressed as relative percent difference, is understandably poor, but at such low concentrations precision is not an important objective for air quality purposes. Prior to the new Pb NAAQS, the collocated precision cutoff value was 0.15 ug/m3. With the lowering of the NAAQS and improvements in sampling and analytical technologies, EPA feels this cutoff value can and should be lowered. The data in Table 1 were reviewed at a number of potential cutoff values, starting at 0.002 ug/m3, which is the proposed method detection limit (MDL) for the XRF-based FRM for Pb-PM10, and going up to 0.02 ug/m3.
Some scenarios in Table 1 do not show the 0.01 or 0.02 ug/m3 results because there were not enough (or no) routine data concentrations in these ranges.

Table 1. Pb Collocated Precision 90% Coefficient of Variation Summary

Data Values           1      2      3      4      5      6      7      8
Pb > 0.002 ug/m3   19.4   13.0   16.9    9.4   36.6   37.0   23.5   15.5
Pb > 0.006 ug/m3   20.7   11.8   16.8    8.8   29.1   36.1   14.9   15.4
Pb > 0.01 ug/m3    11.2   11.7   16.5    8.1   24.1   18.3   15.4
Pb > 0.02 ug/m3    12.0    6.7   15.0    9.0   14.0   16.4

Scenario key (columns 1-8):
1. PM10 NATTS Pb: high-volume sampling (~113 LPM), ICP-MS analysis
2. TSP Pb: high-volume sampling (~113 LPM), ICP-MS analysis
3. TSP Pb: high-volume sampling (~113 LPM), atomic absorption analysis
4. TSP Pb: high-volume sampling (New York data), graphite furnace AA analysis
5. TSP Pb: low-volume sampling, XRF analysis
6. PM2.5 CSN: very-low-volume sampling (~6.7 LPM), XRF analysis
7. PM2.5 CSN (Texas): low-volume sampling (~16.7 LPM), XRF analysis
8. TSP Pb: high-volume sampling (~113 LPM), ICAP analysis

Based on our evaluation, we believe that 0.02 ug/m3 is an appropriate cutoff value for two reasons: 1) there has been an established concept of a "limit of quantitation," which is usually estimated at ten times the MDL, and 2) it is practically one order of magnitude away from the NAAQS and provides an adequate margin of safety for data review. As an alternative, EPA could consider 0.01 ug/m3 as a cutoff, but we do not recommend going below this concentration. Based on this cutoff value and reviewing the historical data in Table 1 at or above the 0.02 ug/m3 cutoff value, EPA proposes a precision measurement quality objective of 20% for a 90% confidence limit coefficient of variation, aggregated over a 3-year period at the primary quality assurance organization level.
This means that the large majority of paired precision data should show a difference below 20%; monitoring organizations that do not
achieve this result would be advised of the problem and encouraged to investigate and resolve the causes of the disagreements.
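To make the proposed statistic concrete, the sketch below shows one way the collocated-precision calculation described above could be carried out: relative percent differences from collocated pairs at or above the cutoff, aggregated into a coefficient of variation with a 90 percent upper confidence bound. It follows our reading of the chi-squared formulation in 40 CFR Part 58 Appendix A and is an illustration only; the paired values are made up.

import math
from scipy import stats

def cv_upper_bound(primary, collocated, cutoff=0.02):
    """90% upper confidence bound on the coefficient of variation (percent)
    for collocated Pb pairs, using only pairs at or above the cutoff (ug/m3)."""
    d = []  # relative percent differences for qualifying pairs
    for x, y in zip(primary, collocated):
        if x >= cutoff and y >= cutoff:
            d.append(100.0 * (x - y) / ((x + y) / 2.0))
    n = len(d)
    if n == 0:
        return None  # nothing at or above the cutoff
    cv = math.sqrt(sum(di ** 2 for di in d) / (2.0 * n))  # point estimate of the CV
    chi2_10 = stats.chi2.ppf(0.10, n)                     # 10th percentile, n degrees of freedom
    return cv * math.sqrt(n / chi2_10)                    # 90% upper confidence bound

# Illustrative pairs (ug/m3); a result below 20% would meet the proposed objective.
print(cv_upper_bound([0.031, 0.045, 0.052], [0.029, 0.049, 0.047]))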
Bias Estimates
Estimates of Pb bias were evaluated by reviewing data collected through the PM2.5 Chemical Speciation Network (CSN) and the National
Air Toxics Trends Stations (NATTS) QA programs. The XRF bias estimates for the PM2.5 CSN were obtained from data provided by the
analysis of Performance Evaluation (PE) samples. CSN PE samples consist of "real-world" particle filters collected over multiple days to ensure that an adequate amount of material is present for analysis. For XRF, 46.2-mm Teflon filters were collected and analyzed by an EPA reference laboratory prior to distribution. The average loading was 0.331 ug/filter, and the equivalent concentration in ug/m3, based on 24 m3 of sampled air (16.7 Lpm sampling), was 0.0138 ug/m3. The overall absolute bias upper bound at the 95th percentile is 23.42%.
Bias estimates for the NATTS were obtained from data provided by the analysis of Performance Evaluation (PE) samples by ICP-MS. Sev-
eral laboratories provide ICP-MS analyses in support of the NATTS. NATTS PE samples consisted mostly of 46.2-mm quartz fiber filters
that are produced by the aerosolization and deposition of a Pb-salt solution onto each filter. The size distribution of the liquid aerosol was not
controlled or characterized. Initially, Teflon filters were used; the program later switched to quartz filters to match the filter material used by the NATTS. The filters were prepared and analyzed by ICP-MS at a reference laboratory prior to distribution. The average loading was 2.965 ug/filter, and the equivalent concentration in ug/m3, based on 24 m3 (16.7 Lpm sampling), was 0.1236 ug/m3. The overall absolute bias upper bound at the 95th percentile is 16.81%. (continued on page 4)
-------
ISSUE 7, PAGE 4
Pb Measurement Quality Objectives (continued from page 3)
It is important to note the differences in the PE samples generated for each program, as these differences have the potential to affect the bias estimates. The XRF bias estimate is based on PM2.5 particles collected in the field and includes any associated particle or sample "matrix" effects. For the NATTS, the ICP-MS PE samples are lab-generated liquid aerosols. In addition, the XRF PE samples were at a concentration level one order of magnitude lower than the ICP-MS PE samples (0.331 versus 2.965 ug/filter) and at an equivalent concentration (0.0138 ug/m3) that is below the proposed cutoff value. Therefore, one might expect the XRF bias results to be comparable to the NATTS bias results if only values above the proposed cutoff are used.
Based on this cutoff value and a review of the CSN and NATTS data, EPA identified an overall absolute bias upper bound goal of 15%. The XRF bias estimate of 23.4% is expected to improve at concentrations ten times higher than those evaluated, and the ICP-MS bias estimate of 16.81% is in line with the proposed goal. This means that the large majority of bias data should show a difference below 15%; monitoring organizations that do not achieve this result would be advised of the problem and encouraged to investigate and resolve the causes of the disagreements.
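As an illustration of the two calculations behind these numbers, the sketch below converts a PE filter loading to an equivalent air concentration using the nominal 24 m3 sample volume, and computes an absolute-bias upper bound as the mean absolute percent difference plus a 95th-percentile t-interval term, patterned on our reading of the Appendix A bias estimator. The measured PE results in the example are invented; only the 0.331 ug/filter loading comes from the article.

import math
from scipy import stats

def equivalent_conc(ug_per_filter, flow_lpm=16.67, hours=24.0):
    """Filter loading (ug) divided by the sampled air volume (m3)."""
    volume_m3 = flow_lpm * 60.0 * hours / 1000.0   # ~24 m3 at 16.67 Lpm over 24 hours
    return ug_per_filter / volume_m3

def abs_bias_upper_bound(measured, reference):
    """Upper bound on absolute bias (percent) from PE pairs (measured vs. reference)."""
    d = [100.0 * (m - r) / r for m, r in zip(measured, reference)]
    abs_d = [abs(di) for di in d]
    n = len(abs_d)
    mean_ad = sum(abs_d) / n
    sd_ad = math.sqrt(sum((a - mean_ad) ** 2 for a in abs_d) / (n - 1))
    t95 = stats.t.ppf(0.95, n - 1)                 # 95th percentile of the t distribution
    return mean_ad + t95 * sd_ad / math.sqrt(n)

print(round(equivalent_conc(0.331), 4))            # ~0.0138 ug/m3, as cited for the CSN PEs
print(abs_bias_upper_bound([0.30, 0.36, 0.33], [0.331, 0.331, 0.331]))  # invented PE results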
For Pb... What's in CFR Appendix A?
Based on the DQO process and the data quality assessments, EPA reviewed the QA requirements in 40 CFR Part 58 Appendix A. The following are the highlights of the changes that occurred in Appendix A:
DQO Goals
As mentioned in earlier articles in this newsletter, the measurement quality objective for precision will be 20% for a 90% confidence limit coefficient of variation, with an overall absolute bias upper bound goal of 15%.
Pb Strip Audits
The requirement for the analysis of 6 Pb audit strips per quarter (3 strips at each of 2 concentration ranges) has not changed. However, the audit concentration ranges have changed: the lower concentration range is 30-100% of the NAAQS and the higher concentration range is 200-300% of the NAAQS. EPA is contemplating the possibility of developing audit strips for monitoring organization laboratories, based on interest.
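For illustration, assuming the revised 2008 Pb NAAQS level of 0.15 ug/m3, the lower audit range of 30-100% of the NAAQS works out to 0.045-0.15 ug/m3 and the higher range of 200-300% works out to 0.30-0.45 ug/m3, expressed as the equivalent ambient concentrations the audit strips are meant to represent.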
PEP-Like Audits
The implementation of a PEP-like audit is a new requirement. It provides some assessment of overall bias and will be a mix of one or two PEP-like audits with additional collocated sampling. The program will require the same number of audit samples as required for PM2.5, meaning:
PQAOs with < 5 sites require 5 audits (1 PEP, 4 collocated)
PQAOs with > 5 sites require 8 audits (2 PEP, 6 collocated)
Similar to the PEP, monitoring organizations are responsible for these audits but must meet adequacy and independence requirements. EPA anticipates using the current PEP auditors to provide federal implementation of the program if monitoring organizations would like to have the program implemented through that mechanism.
Flow Rates
No changes occurred to the flow rate requirements. Flow rate verifications will be implemented quarterly and flow rate performance evaluations will be implemented every six months.
Collocated Monitoring
No changes occurred to the collocation requirements. Collocation will continue to be required at 15% of each method designation within a primary quality assurance organization at a 1-in-12 day sampling frequency. EPA added language encouraging monitoring organizations to site the first collocated sampler in each network at the highest concentration site. This will allow the site to operate over the longest time period, and since it may be the site that affects the NAAQS and it is allowable to substitute collocated data for missing data from the primary monitor, this siting would be advantageous for improving data completeness at a very important site.
Additional QA guidance detailing the QA requirements will be developed in January 2009.
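As a rough sketch of the audit counts above (our interpretation, not regulatory text), the collocation requirement and the PEP-like audit tiers can be expressed as follows. The rounding-up and minimum-of-one assumptions for the 15% rule are ours, as is the handling of a PQAO with exactly 5 sites, and the method-designation names in the example are hypothetical.

import math

def required_collocated(monitors_per_method):
    """Collocated samplers: 15% of the monitors for each Pb method designation
    within the PQAO (rounded up, minimum of one, by our assumption)."""
    return sum(max(1, math.ceil(0.15 * n)) for n in monitors_per_method.values() if n > 0)

def pep_like_audits(n_sites):
    """(PEP-like audits, additional collocated audit samples) per the tiers above.
    The article lists '< 5' and '> 5' sites; exactly 5 sites is grouped with the
    larger tier here as an assumption."""
    return (1, 4) if n_sites < 5 else (2, 6)

print(required_collocated({"Hi-Vol TSP Pb": 6, "Lo-Vol Pb-PM10": 3}))  # hypothetical PQAO
print(pep_like_audits(8))  # (2, 6)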
-------
ISSUE 7, PAGE 5
QA Handbook Volume II Complete
The QA Handbook Vol II was completed in December 2008 and is available on AMTIC at http://www.epa.gov/ttn/amtic/qabook.html. A few items in the new version include:
Heavy use of web links and footnotes in order to point the reader to sources with more detailed information.
Removed high-volume PVC laminar inlets. We have made the Handbook consistent with CFR on the use of Teflon and borosilicate glass only for all inlets and the sampling train, and we are discouraging the use of high-flow inlets, which are difficult to audit.
Removed zero/span calibrations 1 and 2 from Section 12 and included the discussion of zero, span and precision checks in the QC section.
New attachments: Monitoring Program Fact Sheets, a QA Info attachment, and color validation templates.
Revisions to this document started in earnest in 2004, with Anna Kelley in the lead during her one-year IPA with EPA from the Hamilton County Department of Environmental Services. The QA Strategy Workgroup is also commended for their dedication to the endeavor, as they met with EPA every few months to review and revise each section. A separate workgroup, led by Gordon Jones from Region 5, met to revise the technical systems audit (TSA) form, which is now included as Appendix H. EPA appreciates the assistance of all EPA Regions and monitoring organizations who helped in the completion of this document.

(Pictured: Quality Assurance Handbook for Air Pollution Measurement Systems)

Since the revision of this document took longer than expected, EPA hopes that the new version of this document will be posted on AMTIC in such a manner that sections can be continuously revised without having to revise the whole document. Therefore, if a rule is changed that affects one or two sections of the Handbook, those sections will be revised and a quality bulletin explaining the change, and what sections are affected by the change, can be posted on AMTIC. Monitoring organizations can ensure their Handbook is current by reviewing the quality bulletin postings and downloading the appropriate sections. For additional information on the Handbook, contact Mike Papp at: papp.michael@epa.gov
Guidance for Entering TEOM Flow Rate Data into AQS
Over the years, EPA has received numerous questions about the submission of monthly flow rate verification data and semi-annual flow rate performance evaluation data for TEOMs to AQS. The questions include: what flow rate to report (main flow or total flow), and where to submit this data.

What Flow Rate to Report
There are two flow rates for the continuous PM2.5 TEOM: the main flow rate, which is typically set to 3 liters/minute, and the total flow rate, which is set to 16.67 liters/minute. Both flow rates are important and both should be checked during the verifications and performance evaluations. Both flow rates can be reported to AQS, and this is encouraged; however, the priority flow rate that must be reported to AQS is the main flow rate, as this flow rate directly factors into the calculation of the reported concentration.

Where to Report the Flow Rate Data
The monthly flow rate verifications are reported as precision transactions. In order to report both the main flow rate and the total flow rate, two separate precision transactions must be supplied. These precision audits will be differentiated by the use of the "Precision ID" (a number between 1 and 99) field on the RP transaction. There is no significance as to which number is used for the total flow versus the main flow; it will be the responsibility of the data analyst to distinguish between audits using total flow and main flow values.

The semi-annual flow rate performance evaluations are reported as accuracy transactions. EPA suggests reporting the main flow rate data in the Level 1 accuracy fields and the total flow rate data in the Level 2 accuracy fields. EPA is not suggesting any resubmission of audit data if the guidance above has not been followed, but recommends this entry scheme in future submittals.

As a reminder, when reporting the flow rate values, the "Actual" field is for the result from the auditing device's flow rate and the "Indicated" field is for the result as reported by the monitoring instrument being tested.
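As a simple illustration of the check itself (not of the AQS transactions), the percent difference for each flow can be computed as below; "actual" is the audit device value and "indicated" is the monitor value, as defined above, and the numbers are hypothetical.

def flow_percent_difference(indicated_lpm, actual_lpm):
    """100 * (indicated - actual) / actual, with 'actual' from the auditing device."""
    return 100.0 * (indicated_lpm - actual_lpm) / actual_lpm

# Hypothetical TEOM check: main flow (~3 Lpm) and total flow (~16.67 Lpm)
main_pd = flow_percent_difference(indicated_lpm=3.02, actual_lpm=3.00)
total_pd = flow_percent_difference(indicated_lpm=16.55, actual_lpm=16.67)
print(f"main flow: {main_pd:+.1f}%   total flow: {total_pd:+.1f}%")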
-------
ISSUE 7, PAGE 6
PEP Program Testing Audit Samplers for Very Sharp Cut Cyclone (VSCC) Transition
Picture courtesy of Greg Noah (Region 4 EPA)
The PM2.5 Performance Evaluation
Program (PEP) will deploy very sharp
cut cyclones (VSCCs) in the fleet of
BGI PQ200A PEP Federal Reference
Method audit samplers in 2009. The
Region 4 Athens Laboratory set up 20
samplers (picture above) during the
week of December 8 for what is called
a "Parking Lot" collocation study con-
figuration to compare the performance
of the WINS and the VSCC separators.
Samplers ran for three days with
WINS impactors in all samplers
to establish the precision of the
parking lot fleet and identify any
"rogues." Six to nine more sam-
pling days are planned in early
January to randomly rotate the
WINS and VSCCs among the
satisfactorily performing sam-
plers on the lot. We will have all
of the PEP field operators run 2
BGI PQ200A samplers, one with
a WINS and the other with a
VSCC, in their first four audit
events in 2009. We are taking
these extra steps to ensure no audit events are
lost due to unfamiliarity with operating with the
VSCCs and to ensure data comparability be-
tween the BGI PQ200As and the R & P 2025
sequential samplers in the national network
over the next year or two. In the 2005-2007 bias data (see Figure 1), EPA noticed a positive bias with the R & P 2025 samplers that utilize a VSCC. The spread between the biases for the R & P 2025 with WINS and the R & P 2025 with VSCC was greater than the spread for other makes of samplers. EPA will be looking into this issue in 2009 and is testing the BGIs to ensure a similar bias does not show up in the PEP data.

Figure 1. Comparison of 95% confidence intervals for 1999 through 2007 national bias estimates by FRM/FEM sampler (averages for 1999-2004 and 2004-2007; all concentrations above 3 ug/m3; target DQO ±10%).
2005-2007 Three Year PM2.5 QA Report Out for Review
Every three years, OAQPS documents the quality assurance activi-
ties that were undertaken for the SLAMS PM2.5 environmental
data operations. The QA Report evaluates the adherence to the
quality assurance requirements described in 40 CFR Part 58 Appendix A and evaluates the data quality indicators of precision, accuracy, bias, and completeness. Tables 1, 2, and 3 provide some general information covered in the report. The report assesses the QA information mainly at the level of a primary quality assurance organization but also looks at method designations and at individual sites where appropriate. In general, the majority of the data are meeting the data quality objective goals, but an increasing percentage of primary quality assurance organizations are not meeting the precision and bias goals. The report is posted on AMTIC for review at http://www.epa.gov/ttn/amtic/anlqa.html.
The report is available for review until January 30, 2009. Please
send any comments or corrections you might have to Mike Papp
at: papp.michael@epa.gov.
Table 1. Data Completeness, 9-Year Summary (completeness, 3-year averages)

Data Type                  1999-2001        2002-2004   2005-2007
Routine Data               28%              64%         68%
Average Capture Rate       Not calculated   92%         92%
Collocation Precision      67%              90%         91%
Flow Rate Accuracy         76%              83%         85%
Performance Evaluations    85%              83%         81%

Table 2. Precision, Accuracy and Bias Estimates, 9-Year Summary (3-year estimates)

Data Type                  Acceptance Criteria   1999-2001   2002-2004   2005-2007
Collocation Precision      10%                   7.2%        6.9%        7.35%
Flow Rate Accuracy         4%/5%                 0.2%        0.15%       0.007%
Performance Evaluations    ±10%                  -2%         -2.1%       -2.97%

Table 3. Percentage of PQAOs Meeting Acceptance Criteria, 9-Year Summary

Data Type                  Acceptance Criteria   1999-2001   2002-2004   2005-2007
Collocation Precision      10%                   8d%         89%         81%
Flow Rate Accuracy         4%/5%                 99%         100%        100%
Performance Evaluations    ±10%                  91%         84%         76%
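For readers who want to apply the acceptance criteria in Tables 2 and 3 to their own PQAO statistics, the check reduces to a few comparisons. The sketch below is ours, not code from the report; it treats the flow rate limit as ±5%, one reading of the "4%/5%" criterion, and uses the 2005-2007 national estimates from Table 2 as example inputs.

def meets_criteria(cv_percent, flow_bias_percent, pe_bias_percent,
                   cv_limit=10.0, flow_limit=5.0, pe_limit=10.0):
    """Check a PQAO's 3-year statistics against the acceptance criteria above.
    flow_limit=5.0 is one reading of the '4%/5%' criterion (an assumption)."""
    return {
        "collocation_precision": cv_percent <= cv_limit,
        "flow_rate_accuracy": abs(flow_bias_percent) <= flow_limit,
        "performance_evaluation": abs(pe_bias_percent) <= pe_limit,
    }

# Example inputs: the 2005-2007 national estimates from Table 2
print(meets_criteria(cv_percent=7.35, flow_bias_percent=0.007, pe_bias_percent=-2.97))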
-------
PAGE 7
Shippable Gas Dilution Systems Available for Back-of-the-Analyzer
or Single Line Auditing of Routine or Precursor Gas Sites
Prior to EPA performing the NPAP audits as through-the-probe (TTP) audits, the mailed NPAP program included small, shippable cases containing zero air and gas dilution systems with small cylinders of blended CO, SO2 and NO that were used to generate concentrations at the low, medium, and high on-scale levels for 0-50 ppm CO analyzers and 0-0.50 ppm SO2 and NO/NO2 analyzers. Three toggle switches turned the flow path on or off, allowing the gas from the blended gas cylinder to flow through one of three critical orifices that reliably controlled the pollutant gas flow, which was then mixed with the zero air input into the gas dilution system.

These systems generate about 4-6 Lpm, which is enough to audit an analyzer on a single-line sampling inlet or through the back of the analyzer (BOA). They cannot be used through a sampling manifold with a diameter larger than 1/4", and especially not with a fan or pump causing a high sampling flow rate.

Since EPA has converted the NPAP almost entirely to through-the-probe (TTP) auditing, we have an inventory of gas dilution systems (and other BOA NPAP audit devices) available for use. Some of these systems have been placed in a number of the Regions for storage and potentially to supplement TTP mobile lab audits. The rest are in RTP and are available to the Regions that may have a use for these devices.
Precursor Gas (NCore) Testing with BOA System
Last year an OAQPS contractor assembled and tested an EPA trace-level TTP system and at the same time used the trace-level blended gas cylinder with the BOA gas dilution/zero air kit. The system generated high, medium and low concentrations in the required audit level ranges for trace-level analyzers.

This means that the Regions could make the equipment available to monitoring organizations to perform QA or QC performance evaluations of CO, SO2, and NO analyzers by a BOA or single-line sampling procedure. The only items needed to complete the systems are small blended-gas or single-gas cylinders (as needed) with concentrations of CO, SO2, and NO suitable for dilution down to the trace-level medium and low full-scale concentrations. Including shipping, these might cost as little as $200-300 each. For more information on this, contact Mark Shanis at: shanis.mark@epa.gov.
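The dilution arithmetic behind these kits can be sketched as follows; the cylinder concentration and flow values are illustrative, not the kits' actual settings.

def diluted_concentration(cyl_ppm, gas_flow_lpm, zero_air_flow_lpm):
    """Delivered concentration (ppm) after mixing cylinder gas with zero air."""
    return cyl_ppm * gas_flow_lpm / (gas_flow_lpm + zero_air_flow_lpm)

# Example: a 10 ppm SO2 blend through a 0.05 Lpm critical orifice into ~5 Lpm of zero air
print(round(diluted_concentration(10.0, 0.05, 5.0), 4))  # about 0.099 ppm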
-------
PAGE 8
Re-design of the AMP255
EPA looking for Reviewers
Several enhancements are currently under development for the Precision and Bias Quality Indicators Report (AMP255) in AQS. In the past, questions have been raised about the report's ability to accurately display the information required by 40 CFR Part 58 Appendix A.

As a part of this effort, several changes to the layout of the report are currently being considered to make the information more usable to the end users. EPA plans on having a new look for the AMP255 by the end of January 2009.

If you would like to join a workgroup that will contribute ideas for the new layout of the report, please contact Jonathan Miller of the National Air Data Group at miller.jonathan@epa.gov.
National Ambient Air QA Meeting, May 12-13, San Antonio
For the last seven years the
OAQPS Ambient Air Moni-
toring Group has facilitated
sessions devoted to ambient
air monitoring QA at the
Quality Management Con-
ference sponsored by the
Quality Staff at the EPA
Office of Environmental
Information. This year is
no different. The meeting
will be held May 12-14,
2009 in San Antonio,
Texas. Two days of ambient
air sessions are planned for
May 12 and 13. Registra-
tion information can be
found at http://www.epa.gov/quality/2009.htm. An
agenda specifically for the
ambient air session will be
completed by mid-
February. For those inter-
ested in providing a presen-
tation, abstracts are due
January 30 to Mike Papp at:
papp.michael@epa.gov.
Hope to see you at the
meeting!
National Ambient Air Conference Being Planned for November, 2009
The November 2006 National Air Monitoring Conference (see QA EYE issue #5) held in Las Vegas was considered a success, and it was EPA's intention to try to schedule a conference of this magnitude every three years. Two years have
passed and EPA is starting
the planning for another
National Conference sched-
uled for the November,
2009 timeframe. Kevin
Cavender, OAQPS, and
Anna Kelly, Hamilton
County Department of En-
vironmental Services have
been identified as co-leads
for planning. EPA is look-
ing for volunteers to help as
session chairs and modera-
tors and to participate in
workgroups to develop
session goals and topics. If
you are interested in assist-
ing in any planning activi-
ties, contact Kevin Caven-
der at:
cavender.kevin@epa.gov.
-------
ISSUE 7, PAGE 9
NPAP Data Review and Entry Becoming More Efficient
Currently, NPAP through the probe audit data is collected during the audit process, reviewed by the auditor and site operator,
additionally reviewed by the EPA Regions and the monitoring organization, sent to EPA-OAQPS for collection and then fi-
nally submitted in batches to AQS for upload. This process can take four to six months before audit reports get into AQS.
Since it is very rare that data from audits change, and the information posted to AQS is virtually a subset of the data collected
during the audit, it was felt that a system could be put in place that would make the final reporting to AQS much easier and
with less chance of data loss or entry error. The process described in the flow chart below has been discussed among the EPA
Regions and some monitoring organizations, and it appears to be worthy of implementation in 2009 for the federally imple-
mented NPAP TTP program.
The Process
As illustrated in the flowchart below, the NPAP Auditor would conduct the audit. As is currently implemented, the data is col-
lected in the NPAP Database Excel spreadsheet. Upon completion of the audit and preliminary acceptance by the monitoring
organization's site operator, the auditor would upload the audit data within 2 days of the audit to AQSQA. AQSQA is a testing
area in AQS that is a mirror image of AQS Production but is a holding area that is not officially the AQS data base. Therefore,
no entity could retrieve data from AQSQA. Entry into AQSQA allows the NPAP auditor the ability to check for entry errors or
other types of errors that would hinder the submission of data to the AQS Production database. Upon successful entry to
AQSQA, the NPAP auditor would email the NPAP Excel workbook with the audit results to the EPA Regions/Headquarters
and the monitoring organization point of contact. These entities would have five working days to accept the results as reported
or address any discrepancies. In most cases EPA expects that results would be accepted and the entities would reply to the
email affirming their acceptance. In the rare cases of discrepancies, edits would be sent to the EPA Data Administrator who
would make any changes required. After the five-day review period, data would be uploaded by the EPA Data Administrator.
It is expected that EPA will test the implementation of this procedure in 2009 and is looking forward to feedback from the
monitoring organizations. Details of the procedure will be forthcoming in the form of NPAP standard operating procedures.
(Flowchart: Entering NPAP Data into AQS. Notes and definitions on the chart describe the auditor's ability to enter data for any monitor without compromising the security rules of the AQS production environment, and distinguish the AQSQA test database, used to ensure nothing "breaks" when new software is introduced, from the "live" AQS production database, where data entry is restricted to designated named users.)
-------
EPA Office of Air Quality
Planning and Standards
EPA-OAQPS
C304-02
RTP, NC 27711
E-mail: papp.michael@epa.gov
The Office of Air Quality Planning and Standards is dedi-
cated to developing a quality system to ensure that the Na-
tion's ambient air data is of appropriate quality for in-
formed decision making. We realize that it is only through
the efforts of our EPA partners and the monitoring organiza-
tions that this data quality goal will be met. This newsletter
is intended to provide up-to-date communications on
changes or improvements to our quality system. Please pass
a copy of this along to your peers. And please e-mail us with
any issues you'd like discussed.
Mike Papp
Important People and Websites
Since 1998, the OAQPS QA Team has been working with the Office of Radiation and Indoor Air in Montgomery and Las Vegas to accomplish its QA mission. The following personnel are listed by the major programs they implement. Since all are EPA employees, their e-mail addresses follow the pattern lastname.firstname@epa.gov.

The EPA Regions are the primary contacts for the monitoring organizations and should always be informed of QA issues.
Program                                          Person             Affiliation
STN/IMPROVE Lab Performance Evaluations          Eric Bozwell       ORIA-Montgomery
Tribal Air Monitoring                            Emilio Braganza    ORIA-LV
Statistics, DQOs, DQA, precision and bias        Louise Camalier    OAQPS
Statistics, DQOs, DQA, precision and bias        Rhonda Thompson    OAQPS
Speciation Trends Network QA Lead                Dennis Crumpler    OAQPS
OAQPS QA Manager                                 Joe Elkins         OAQPS
PAMS & NATTS Cylinder Recertifications           Rich Flotard       ORIA-LV
Standard Reference Photometer Lead               Scott Moore        ORD-NERL
Speciation Trends Network/IMPROVE Field Audits   Jeff Lantz         ORIA-LV
National Air Toxics Trend Sites QA Lead          Dennis Mikel       OAQPS
PAMS & NATTS Cylinder Recertifications           David Musick       ORIA-LV
Criteria Pollutant QA Lead                       Mike Papp          OAQPS
NPAP Lead                                        Mark Shanis        OAQPS
STN/IMPROVE Lab PE/TSA/Special Studies           Jewell Smiley      ORIA-Montgomery
NATTS PT Studies and Technical Systems Audits    Candace Sorrell    OAQPS
STN/IMPROVE Lab PE/TSA/Special Studies           Steve Taylor       ORIA-Montgomery
Websites
The following websites will get you to the important QA information.

Website                 URL                                           Description
EPA Quality Staff       http://www.epa.gov/quality/                   Overall EPA QA policy and guidance
AMTIC                   http://www.epa.gov/ttn/amtic/                 Ambient air monitoring and QA
AMTIC QA Page           http://www.epa.gov/ttn/amtic/quality.html     Direct access to QA programs
Ambient Air QA Team     http://www.epa.gov/airprogm/oar/oaqps/qa/     Information on the Ambient Air QA Team
Contacts                http://www.epa.gov/ttn/amtic/contacts.html    Headquarters and Regional contacts
-------