OFFICE OF AIR QUALITY PLANNING AND STANDARDS

THE QA EYE

ISSUE 22, JANUARY 2018

SPECIAL POINTS OF INTEREST:
OIG Report nearing completion (page 1)
New quarterly reports being tested (page 1)
TSA guidance completed (page 4)

INSIDE THIS ISSUE:
OIG Report (page 1)
Quarterly QC Reports (page 1)
NO2 Gas Standards Warning (page 3)
TSA Guidance Complete (page 4)
New Flow Rate Guidance and Ozone Guidance Revision (page 5)
APTI 470 Revision Progress (page 5)
New SRP in RTP (page 6)
Null Codes Available in AQS QA Transactions (page 7)
PEP/NPAP Self-Implementation Guidance (page 8)
National Gravimetric Lab Round Robin Results (page 9)

Office of Inspector General Report Nearing Completion

On January 8, 2016, the EPA Office of Inspector General (OIG) initiated a project to evaluate data from the ambient air program. The stated purpose was to pursue the following questions: Do data revisions comply with EPA criteria? Do data exclusions or gaps comply with EPA criteria? QA EYE Issue 21 provided details of the information provided to EPA at that time. During 2017, OIG continued to gather information and interview some monitoring organizations. Although OIG has not issued a final report, on November 16, 2017, OIG provided a draft of its final report, entitled Differences in Air Monitoring Agencies' Data Processing Practices Could Decrease the Reliability of Ozone Data Used to Assess Air Quality.

The draft reached conclusions similar to those in OIG's earlier Management Alert, issued on February 6, 2017, but provides five recommendations. The following lists each recommendation and the corrective action that has been or will be implemented in the next year.

Recommendation 1: Assess the risk of any data adjustments impacting the ozone data used in the EPA's National Ambient Air Quality Standards designation determinations.

In response to this recommendation, OAR is reviewing 2014-2016 data, comparing design values generated from AIRNow data against design values reported by the valid data in AQS to see what differences might be revealed.

Recommendation 2: Issue guidance clarifying the shelter temperature criteria that should be used during data validation.

When the process of approving monitors as federal reference or equivalent methods was started, one of the testing criteria was for monitors to operate properly at 20-30°C. Therefore, the majority (if not all) of monitors were initially designated with that range, and the QA Handbook guidance originally listed that requirement. (Continued on page 2.)

New Quarterly QC Reports Being Tested

One of the corrective actions taken during the 1/8/2016 OIG audit was to review 2014-2016 data to determine whether ozone data represented by an exceedance of the 1-point quality control acceptance criteria in the QA Handbook validation template were invalidated, and if not, whether there was compelling evidence to justify considering the routine concentration data valid. A technical memorandum entitled Steps to Qualify or Validate Data after an Exceedance of Critical Criteria Checks was posted on AMTIC on 8/30/2017 explaining the procedure. In that memorandum, we also provided an example of a quarterly report that would be developed to help monitoring organizations and EPA review the routine and quality assurance data to determine whether the procedures described in the memo were being implemented. Sonoma Technology Incorporated (STI) and EPA, including EPA Region 1 and CASTNET, have been working on this program since September and have the front page selection criteria completed. We are presently evaluating the look and feel of the page, as well as the accuracy of the content. We hope to have the ozone quarterly data assessment completed by the end of January for review by all EPA Regions and monitoring organizations.
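For readers who want to prototype a similar screen on their own data, here is a minimal sketch of the kind of check the quarterly report automates: flagging 1-point QC checks whose percent difference exceeds the validation-template limit. The record layout and the 7% limit are illustrative assumptions (confirm the exact limit in the current QA Handbook validation template); this is not the report's actual implementation.

    # Minimal sketch: screen 1-point QC checks for exceedances of a
    # validation-template limit. The record layout and the O3_LIMIT_PCT
    # value are illustrative assumptions, not the AQS format.

    O3_LIMIT_PCT = 7.0  # assumed ozone percent-difference limit; confirm
                        # against the current QA Handbook validation template

    def percent_difference(measured_ppb: float, audit_ppb: float) -> float:
        """Percent difference of a QC check relative to the audit (true) level."""
        return (measured_ppb - audit_ppb) / audit_ppb * 100.0

    def flag_exceedances(checks: list[dict]) -> list[dict]:
        """Return the QC checks whose percent difference exceeds the limit."""
        flagged = []
        for chk in checks:
            pd = percent_difference(chk["measured_ppb"], chk["audit_ppb"])
            if abs(pd) > O3_LIMIT_PCT:
                flagged.append({**chk, "pct_diff": round(pd, 2)})
        return flagged

    # Hypothetical QC checks for one monitor-quarter
    checks = [
        {"date": "2017-07-03", "audit_ppb": 70.0, "measured_ppb": 72.1},
        {"date": "2017-07-17", "audit_ppb": 70.0, "measured_ppb": 76.3},  # > 7%
    ]
    for f in flag_exceedances(checks):
        print(f"{f['date']}: {f['pct_diff']}% exceeds the {O3_LIMIT_PCT}% limit")

Each flagged check would then be reviewed against the memo's procedure to decide whether the associated routine data should be invalidated or qualified.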
OIG Update (Continued from page 1)

However, newer monitors have been designated at wider temperature ranges. The 2017 QA Handbook was revised to accommodate these monitors while also indicating that the shelter must be maintained to accommodate the monitor with the most sensitive temperature requirement. OAQPS believes that if a shelter does go beyond the 20-30°C range, data should not be invalidated from all monitors, but only from those not designated to operate at the temperature excursion. EPA expected that the guidance in the QA Handbook would be used in this manner, but OIG did not observe this in all cases. Since the QA Handbook is updated every five years and was last updated in 2017, in response to this recommendation OAR will develop and post a table of changes on AMTIC that will apply to monitoring guidance until the next full QA Handbook revision.

Recommendation 3: Complete the quality assurance project plan review-and-approval process to verify that air monitoring agencies' quality assurance project plans incorporate the EPA regulations and guidance for conducting data validations and adjustments.

In response to this recommendation, the Office of Air and Radiation (OAR) issued a memo on July 11, 2017 (posted on AMTIC), alerting the monitoring agencies to the importance of having quality assurance project plans (QAPPs) submitted and approved that conform to regulation and critical criteria. The OAR expects this review process to be completed by the end of CY18. Additionally, OAR plans to revise the Data Certification and Concurrence Report (AMP600) to flag non-concurrence for any QAPP approval dates over five years old (a sketch of this test follows Recommendation 4 below). The OAR has already revised AQS to provide better information on the QAPP data reported to AQS. OAR is revising the Air Pollution Training Institute (APTI) course Quality Assurance for Air Pollution Measurement Systems (see article on page 5), which will address the issue of QAPP development and approval. Finally, the technical systems audits that are conducted by the EPA Regions on monitoring agencies every three years will be used to identify QAPPs requiring revision.

Recommendation 4: Periodically verify that air monitoring agencies are implementing the EPA's recommended criteria for data validation and adjustments through technical systems audits or other oversight mechanisms.

In response to this recommendation, OAR has developed and issued a technical systems audit guidance document (see article on page 4) with consensus from the EPA Regions to implement. This document specifies that auditors review validation criteria and the "process for documenting any adjustments made to raw data before submittal to AQS." The EPA Regions will use this guidance during the technical systems audits conducted on the monitoring agencies every three years.
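To illustrate the five-year QAPP test that the revised AMP600 report and the data certification software (see "A Few Things in the Works for 2018" later in this issue) will apply, here is a minimal sketch; the record fields and the output are hypothetical illustrations, not the AMP600 format.

    # Minimal sketch of the five-year QAPP approval-date test described
    # above. Field names and output are illustrative, not the AMP600 format.
    from datetime import date

    QAPP_MAX_AGE_YEARS = 5

    def qapp_is_stale(approval: date, as_of: date) -> bool:
        """True if the QAPP approval date is more than five years old."""
        try:
            cutoff = approval.replace(year=approval.year + QAPP_MAX_AGE_YEARS)
        except ValueError:  # approval dated February 29
            cutoff = approval.replace(year=approval.year + QAPP_MAX_AGE_YEARS, day=28)
        return as_of > cutoff

    # Hypothetical monitoring-organization records
    qapps = [
        {"org": "Agency A", "approved": date(2011, 3, 15)},
        {"org": "Agency B", "approved": date(2016, 8, 1)},
    ]
    today = date(2018, 1, 15)
    for q in qapps:
        if qapp_is_stale(q["approved"], today):
            print(f"{q['org']}: QAPP approval over {QAPP_MAX_AGE_YEARS} years old"
                  " - flag for non-concurrence review")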
Recommendation 5: Develop a process to provide assurances that data reported to the Air Quality System database have met the approved zero- and span-check validation criteria prior to regional review and approval of the air monitoring agencies' annual data certification packages.

In response to this recommendation, OAR believes that the most important of the three critical criteria quality control checks (zero, span, 1-point QC) is the 1-point QC (reported to AQS), since it involves the use of both the zero air source (used for the zero check) for ozone standard dilution and the ozone standard that is used to generate and measure the span. The 1-point QC check concentration approximates the ambient air concentrations reported by the monitoring organization and best represents the precision and bias around the concentrations reported by the monitoring agency. OAR believes that it is sufficient for monitoring agencies to complete zero and span checks in accordance with their approved QAPPs that utilize the EPA validation template critical criteria, and to make these data available for review during the EPA technical systems audits. Although the AQS reporting of zero and span checks is not a regulatory requirement, some monitoring organizations and EPA Regions have requested that zero and span transactions be developed in order to voluntarily submit these data to AQS. OAR has requested that zero and span QA transactions be added to AQS, and we will provide technical guidance suggesting that monitoring agencies submit these data to AQS.
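For staff newer to these statistics, the precision and bias referred to above are estimated from the percent differences of the 1-point QC checks. Below is a minimal sketch of the core calculations as summarized here, using hypothetical data; verify the estimators against 40 CFR Part 58, Appendix A before relying on them for reporting.

    # Sketch of precision and bias statistics computed from 1-point QC
    # percent differences, following the 40 CFR Part 58 Appendix A
    # estimators as summarized here; verify against the regulation.
    import math
    from scipy import stats

    def percent_diffs(pairs):
        """d_i = (measured - audit) / audit * 100 for each QC check."""
        return [(m - a) / a * 100.0 for m, a in pairs]

    def cv_upper_bound(d):
        """90% upper confidence bound on the coefficient of variation."""
        n = len(d)
        s2 = (n * sum(x * x for x in d) - sum(d) ** 2) / (n * (n - 1))
        chi2_low = stats.chi2.ppf(0.10, n - 1)
        return math.sqrt(s2) * math.sqrt((n - 1) / chi2_low)

    def bias_upper_bound(d):
        """95% upper confidence bound on the mean absolute percent difference."""
        n = len(d)
        ad = [abs(x) for x in d]
        mean_ad = sum(ad) / n
        s = math.sqrt(sum((x - mean_ad) ** 2 for x in ad) / (n - 1))
        return mean_ad + stats.t.ppf(0.95, n - 1) * s / math.sqrt(n)

    # Hypothetical (measured, audit) concentration pairs in ppb
    pairs = [(71.2, 70.0), (68.9, 70.0), (72.5, 70.0), (69.4, 70.0)]
    d = percent_diffs(pairs)
    print(f"CV upper bound: {cv_upper_bound(d):.2f}%   "
          f"bias upper bound: {bias_upper_bound(d):.2f}%")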
Reminder on Data Certifications after Data Have Been Modified

In 2017, a number of monitoring organizations were asked to review their ozone data and invalidate any data that did not meet the critical criteria in the ozone validation template. Many monitoring organizations performed these reviews and invalidated some data. We appreciate the efforts made by those monitoring organizations. As a reminder, because the data for those years have been modified where data were invalidated, there is no longer a "Y" certification and concurrence flag on those data. During the 2017 certification period (May 2018), monitoring organizations should consider recertifying the earlier years' data.

NO2 Gas Standard Warnings to Monitoring Organizations and Vendors

EPA recently made available two technical memos addressing concerns with the use of NO2 gas standards to calibrate, as well as conduct quality control (QC) checks on, true NO2 analyzers. One memo was prepared specifically for monitoring organizations, while the other was prepared for gas producers. These memos can be found on AMTIC.

The advent and promotion of true NO2 analyzers to Federal Equivalent Method (FEM) status has led to increased use of these analyzers in ambient monitoring networks. EPA is aware of the air quality community's interest in, and adoption of, practices that use compressed NO2 gas standards for QC checks. The EPA has not issued comprehensive guidance on the use of NO2 gas standards in quality assurance exercises. However, in the October 2016 edition of the EPA's QA EYE (Issue 20, page 4), it was noted that recent experience at EPA's Office of Research and Development (ORD) indicated that NO2 gas standards could be used to replace iso-propyl nitrate (IPN) and n-propyl nitrate (NPN) for NOy 1-point QC checks. At that time, the Agency expected that commercially available NO2 standards would follow the EPA Protocol Gas requirements specified in 40 CFR Part 58, Appendix A, Section 2.6.1. Unfortunately, it appears that in some instances this may not be the case.

The Protocol Gas Requirements

In order to produce an EPA Protocol Gas, a NIST-certified Standard Reference Material (SRM), a NIST-Traceable Reference Material (NTRM), or a Gas Manufacturer's Intermediate Standard (GMIS) is required as the analytical reference standard. At the moment, NIST does not provide an NO2 SRM. NIST has indicated they are in the process of developing an NO2 SRM, but there is no timeline in place for when this development will be completed. However, the Netherlands' Van Swinden Laboratory (VSL) has a Declaration of Equivalence with NIST, and they presently produce an NO2 Primary Reference Material (PRM) that is equivalent to an SRM. This PRM can be used to produce a GMIS to, in turn, produce the NO2 EPA Protocol Gas.

It was pointed out in the memo to the gas producers participating in the Ambient Air Protocol Gas Verification Program (AA-PGVP) that if they use VSL's PRMs as the NO2 analytical reference standard and follow the traceability protocol, such NO2 gas mixtures can be certified as EPA Protocol Gases. The EPA will seek to determine which gas producers are using the VSL PRMs and make this information available to the air quality monitoring community. Monitoring organizations were instructed to check certificates of analysis to ensure that their NO2 gas producers use the VSL PRMs in the production of their NO2 EPA Protocol Gases.

Stability of NO2 Gas Mixtures

It has also come to the EPA's attention that there are concerns about the cylinders used to store the NO2 gas mixtures. In standard passivated aluminum cylinders, the NO2 gas concentration is unstable and degrades over a relatively short period of time. In light of this knowledge, VSL uses SGS™ (superior gas stability) aluminum cylinders from Luxfer for PRM concentrations less than 250 ppm to maintain gas concentration stability. These cylinders have a proprietary interior surface that helps prevent reactions and concentration degradation. The use of these cylinders has shown a stability of approximately 12-18 months. In discussions between the Agency and one specialty gas producer, the producer indicated that certain concentration ranges may be more stable than others, even when using the SGS™ cylinders. The EPA plans to involve appropriate staff from both OAR and ORD to engage the specialty gas producers selling NO2 EPA Protocol Gases to determine appropriate certification periods for the various concentration ranges of the NO2 EPA Protocol Gases they may choose to produce.

Summary

In order to ensure true NO2 analyzers are properly calibrated and are providing accurate results, we suggest the following:

1. All agencies using true NO2 analyzers should calibrate their instruments via gas phase titration (GPT) using NO EPA Protocol Gases.

2. After the initial GPT calibration, if the agency has an NO2 EPA Protocol Gas, that cylinder may be used as a QC check on the instrument if it meets the EPA Protocol Gas requirements described above. Agencies should check these QC concentrations frequently and control chart these data, since the cylinder may degrade while the analyzer maintains its calibration (see the sketch after this list).

3. Should an agency notice what might be degradation in the concentration of the gas mixture in the cylinder, the agency is advised to cease using the cylinder as a QC check and to inform the EPA Regional Office; EPA would like to track these results. QC checks following the removal of the unstable NO2 standard should be completed using GPT or an alternative NO2 standard.
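Item 2 suggests control charting the QC check responses so that slow cylinder degradation stands out against a stable analyzer calibration. The sketch below uses hypothetical data and the common mean ± 3-sigma charting convention; that convention is our assumption here, not a prescribed EPA limit.

    # Minimal control-chart sketch for tracking an NO2 cylinder QC check
    # over time. Data are hypothetical; limits use the common mean +/-
    # 3-sigma convention. A steady downward drift while the analyzer's
    # calibration holds suggests cylinder degradation, not instrument drift.
    import statistics

    def control_limits(baseline_ppb):
        """Center line and 3-sigma limits from an initial baseline period."""
        mean = statistics.mean(baseline_ppb)
        sd = statistics.stdev(baseline_ppb)
        return mean, mean - 3 * sd, mean + 3 * sd

    baseline = [80.2, 79.8, 80.5, 79.9, 80.1]   # early QC responses (ppb)
    later = [79.6, 78.8, 77.9, 76.5]            # later responses, drifting down

    center, lcl, ucl = control_limits(baseline)
    for i, x in enumerate(later, 1):
        status = "OK" if lcl <= x <= ucl else "OUT OF CONTROL - investigate cylinder"
        print(f"check {i}: {x:.1f} ppb ({status})")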
Solomon Ricks

The TSA Guidance Document Is Complete

On December 7th, the new Technical Systems Audit Quality Assurance Guidance Document (TSA QAGD) was emailed to the Air Division Directors, Air Program Managers, and the TSA Workgroup members. A holiday gift for the TSA auditor who has everything!

A couple of years ago at the National Ambient Air Monitoring Conference, EPA began a discussion about making technical systems audits more consistent across the Regions and raised a desire to share best practices nationally to improve our own audit programs. Out of these discussions, the TSA Workgroup was created and began work on a document to promote consistency in audits, compile best audit practices, revise old TSA tools, and create some new tools as well. The TSA Workgroup also wanted our audit processes to be transparent regarding what a monitoring organization could expect during a TSA, so that they could prepare adequately for the audit. The culmination of these discussions and effort is the TSA QAGD.

Another reason for crafting this document was to provide it to the state, local, and tribal monitoring organizations to help them strengthen their own quality systems. By giving the quality assurance community the same guidance we use to conduct TSAs, those groups can go through the same process when conducting internal audits. By following our EPA "TSA Playbook" on an annual basis, monitoring agencies can identify and correct issues routinely rather than waiting every three years for the EPA TSA. Everyone wins when there is a strong audit program within a quality system: small issues are discovered and addressed before they become large ones, and improvements can be identified and implemented, resulting in a stronger organization.

So what's in the document, and how can it help you? To set the stage, the first three sections of the TSA QAGD lay out the basics of what a TSA is, why we do them, and what is required. But the heart of the TSA QAGD lies in the following sections, which describe the TSA process as a whole. Sections 4 through 8 guide the reader through the audit process, beginning with the pre-audit preparations at the regional office, into the on-site audit, writing the TSA audit report, and through the corrective action process. These sections contain the audit steps, timelines, and best practices that were compiled through the collaborative discussions of the Workgroup members.

The TSA QAGD also includes some revised and new audit tools as appendices, including:

Two versions of the revised TSA Questionnaire (fillable Word and Excel)
Field Audit Logbook
Low Volume Weighing Laboratory Checklist
Lead (Pb) Laboratory Checklist
Audit Report Template
Audit Close-Out Letter

Along with describing the audit activities and process in detail, the Workgroup gave much consideration and attention to the TSA corrective action process. TSAs are not effective if weaknesses in a program are identified but not addressed. The TSA QAGD presents a process for the auditor to use to facilitate corrective action in the monitoring organizations. The goal of the TSA is not simply to see how many "dings" can be uncovered, but to identify where the issues or weaknesses are within a monitoring organization and resolve the problems.
Last, but definitely not least, this document is the result of two years of hard work from the TSA Workgroup, and I'd like to thank each and every member who participated in the document's development. No collaborative effort is ever easy, but this group set the bar very high on exchanging opinions and experiences, sharing resources, and working together to complete this work. Job well done, folks; you have my thanks and my respect. - Greg Noah

So, how can you learn more about this new guidance? Right now, we are in the initial planning phase to make some training opportunities available. For the EPA TSA auditors, we are considering a workshop in RTP to train on the document itself and other technical needs. For monitoring organization auditors, we are considering a TSA training session at the National Ambient Air Monitoring Conference to introduce the TSA QAGD and explore the TSA process. These are a couple of the opportunities being considered, and others could develop over time.

Currently, there is TSA guidance in Section 15.3 of the QA Handbook. The new TSA QAGD supersedes this guidance, and all TSA auditors should follow the new guidance. We are currently working on a new TSA webpage that will fall under the Quality Assurance section; we will post the document and the appendices on that page upon its completion.

One New Guidance Document in Development, One Old One Being Revised

OAQPS is currently working on two guidance documents: one is a brand-new Flow Transfer Standard (FTS) Certification Quality Assurance Guidance Document, and the other is a revision to Transfer Standards for Calibration of Air Monitoring Analyzers for Ozone.

Flow Transfer Standard (FTS) Certification Quality Assurance Guidance Document

Over the years, there has been considerable interest in having an FTS certification guidance document that standards labs or calibration shops could follow to conduct their own FTS certifications. The FTS certification guidance will provide a methodology for state, local, and tribal standards labs to use to recertify their fleet of flow standards instead of returning them to the vendors. The FTS certification guidance will take a page from the ozone transfer standard guidance by implementing a tiered system with increasing levels of traceability that will drive how certifications are done and what will be acceptable at each level. The document will also have a procedure that laboratories can follow to recertify flow transfer standards. This document is in the initial stages of development, and Jenia McBrian is coordinating the effort at OAQPS.

Ozone Transfer Standard Document

Transfer Standards for Calibration of Air Monitoring Analyzers for Ozone, or the ozone transfer guidance, is currently under revision. Ozone Standard Reference Photometer (SRP) operators have recently voiced a desire to rework the document and to research current practices to determine if improvements can be made. Scott Hamilton, Region 5, is leading a workgroup to help reorganize the document and to solicit and address concerns from the workgroup. The workgroup will review the traceability language, the types of ozone transfer standards, the application of ozone transfer standards, and ozone transfer standard specifications. The document will also contain a best-practice standard operating procedure for conducting verification and re-verification of transfer standards (a sketch of the core comparison appears below).
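As a flavor of what such a verification procedure involves, the sketch below regresses a candidate transfer standard against a reference photometer and screens the slope and intercept. The acceptance limits shown are illustrative placeholders, not the values in the guidance; consult the current ozone transfer standard document for the actual criteria.

    # Sketch of a transfer-standard verification: regress candidate readings
    # against the reference (e.g., an SRP) and screen slope and intercept.
    # Acceptance limits below are illustrative placeholders only.
    import statistics  # statistics.linear_regression requires Python 3.10+

    def fit_line(ref, cand):
        """Least-squares slope and intercept of candidate vs. reference."""
        slope, intercept = statistics.linear_regression(ref, cand)
        return slope, intercept

    ref = [0.0, 50.0, 100.0, 150.0, 200.0, 250.0]    # reference ozone, ppb
    cand = [0.4, 50.6, 100.9, 151.8, 202.1, 252.9]   # transfer standard, ppb

    slope, intercept = fit_line(ref, cand)
    slope_ok = abs(slope - 1.0) <= 0.03       # assumed limit
    intercept_ok = abs(intercept) <= 3.0      # assumed limit, ppb
    print(f"slope={slope:.4f} intercept={intercept:.2f} ppb -> "
          f"{'PASS' if slope_ok and intercept_ok else 'FAIL'}")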
Greg Noah

The QA Force Awakens: Reviving the APTI 470 Course

The role of Quality Assurance Manager (QAM) is a challenging one. The QAM ensures the monitoring project and its participants adhere to the QA/QC policies established within the organization's QAPP. For an ambient air monitoring program designed to support the NAAQS, the QAM must be fluent in the suite of ambient air monitoring regulations, particularly those found in 40 CFR Part 58, Appendix A. In addition to knowing these requirements, the responsibilities of the QAM may include activities such as writing and updating QAPPs/SOPs, providing QA training to staff, oversight of audit programs and corrective actions, data validation, and completion of annual data certification activities, among others. So for those who are new QA Managers, new to QA in general, or new to air monitoring, these responsibilities may seem a little overwhelming at first. But help is on the way!

The Air Pollution Training Institute Quality Assurance for Air Pollution Measurement Systems (APTI 470) course is a multi-day training geared towards providing QA Managers and QA staff the tools they need to support and enhance their quality systems. The course ties together the fundamentals of QA with the vast regulatory requirements of ambient air monitoring, so that participants can walk away with a better understanding of how these concepts interconnect - and how these concepts apply directly to them.

APTI 470 has actually been around for a long time. Offered historically as a week-long classroom course, it provided participants the basics of QA, with a strong emphasis on statistics. In fact, during each day of the course, participants were tasked with applying statistical concepts to various data sets; challenges included performing regression analyses of calibration data, conducting outlier tests, calculating precision and bias of QA/QC data sets, and building control charts (including calculating the warning and control limits) - all by hand. During the past decade, however, the APTI 470 course has not received the updates needed to keep it current with revised monitoring regulations and QA requirements. And, although the need to understand the statistics is still fundamental to any QA program, various tools and AQS reports are now available to help monitoring organizations quickly and effortlessly perform many of the statistical computations required in the monitoring program.

So, we are excited to announce that efforts are currently underway to revitalize this important QA course! In addition to bringing course content up to date with current regulations, EPA guidance, and monitoring technology, the course is also being reconstructed to offer content in both a traditional classroom setting and online. (Continued on page 7.)

New Ozone Standard Reference Photometer (SRP) Arrives in RTP

Over the years, we have occasionally run into slowdowns when transferring the "travelling SRP" (SRP #7) from Region to Region in order to verify the regional Level 1 SRPs. A few years back, the verifications were delayed due to technical issues and transit damage associated with SRP #7, to the extent that we sent out the RTP bench SRP (SRP #1), which, to our dismay, also got damaged in transit. Although we have made improvements to our shipping containers to reduce transit damage, it was felt that it was time to have one additional SRP available in order to maintain our verification schedules. In FY 2016 and 2017, we secured additional STAG funds to purchase a new SRP (#62), which arrived in RTP on December 11th. During the previous week, Scott Moore, ORD's SRP lead, was in Region 2 working with the EPA Region 1 and Region 2 SRP leads on their equipment, and stopped in at NIST on the way back to North Carolina to pick up the new SRP.

The new system sports a stealth black color and incorporates a number of improvements, including a manifold with a fixed glass insert for more consistent set-up and operation. The SRP pneumatic system also includes a number of improvements: a second mass flow controller, which replaced the use of capillary tubes; reorientation of the heat block, which facilitates access to the ozone generator lamp; the addition of a solid Teflon block for two solenoids, which replaced a number of Swagelok fittings where there was potential for leaks; and the addition of a fan to reduce heat generated by the solenoids. The system also includes a new data software package and a USB-driven data acquisition card.

Over the next few years, Scott hopes to be able to upgrade the current SRPs with many of the improvements found in the new SRP. Number 62 will now become the "bench standard" in RTP. If issues arise with travelling standard #7, we will be able to send out SRP #1 without any interruption of verification service. The 2017-2018 SRP Level 1 verification schedule is posted on AMTIC.

[Photos: the new SRP, its data acquisition card, and the pneumatics box.]

APTI Training Course (continued from page 5)

The Mid-Atlantic Regional Air Management Association (MARAMA) and OAQPS have teamed up for this project, with EPA staff developing the course material and a MARAMA Technical Review Committee (TRC) reviewing the content and providing feedback as each lesson is developed. The first phase of this project is to develop new material for the traditional classroom course, which will be taught in person by an instructor. Once completed, the classroom material will be modified and possibly expanded to create an online course, which will include a series of quizzes and tests, followed by a certificate issued at course completion.

The new APTI 470 classroom course will be comprised of 12 primary lessons over 3 days - ranging from broad discussions of EPA's QA policy and monitoring requirements to in-the-weeds talks about the technical QA/QC for criteria pollutant monitors. The course will dive into the specifics of how to verify and validate data, conduct data quality assessments, and perform internal systems audits. The course will also offer best-practice suggestions along the way to help participants brainstorm ways they can augment their QA programs. The course is also designed to help other monitoring staff - such as site operators - better understand their vital role in the quality system, and how their routine QC activities can have a profound impact on the quality of the organization's monitoring data. A separate module on statistics, offered as a refresher on the various statistical analyses applicable to the monitoring program, will follow the completion of the 3-day QA overview.
Many monitoring organizations have welcomed new QA managers and staff to their programs in recent years, so we are hopeful this new course will provide those staff with the knowledge, tools, and confidence they need to tackle their new roles. And although it is geared towards staff in monitoring organizations, this course should be beneficial to anyone working in QA or ambient air monitoring, including EPA. All classroom content should be drafted by the end of January 2018 and finalized in early spring, after all comments from the MARAMA TRC have been addressed. The online course material is targeted for completion in December 2018. Stay tuned.

Stephanie McCarthy

Null Codes Now Available for AQS QA Transactions

As described in the OIG article on page 1 of this issue, on 8/30/2017 EPA posted a technical memorandum on AMTIC describing the steps to validate data after an exceedance of a critical criteria check. In order for this process to be implemented in AQS, AQS needed to be modified to include:

1. An "EC" null code for use when the concentration data are invalidated due to a valid exceedance of the critical criteria;

2. An "IV" qualifier for use when compelling evidence exists to validate the concentration data in the case where there is a valid exceedance of the critical criteria;

3. A "1C" (invalid QC check) null code on the QA transaction for use when the QC check, after further evaluation, was found to be invalid. In this case, the concentration of the QC check (assessment concentration) would not be reported, and the null code would be reported as the descriptor; and,

4. A comment field in the QA transaction.

All four processes have been coded in AQS, and AQS is ready to accept information based on the process described in the technical memo. A null code can be provided in the QA transaction without a comment (although it is strongly suggested to include one), and the comment field can be used without a null code reported. The AQS team will be forthcoming with more information on this addition.

The Quarterly Report described in the memo and on page 1 of this issue will use the null codes and qualifiers to perform the quarterly assessments. Monitoring organizations can now implement this procedure for the four gaseous pollutants. We will work with monitoring organizations and the EPA Regions to determine instances where null codes or other qualifiers may be applicable to other QA transactions.
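Paraphrasing the decision logic above as a sketch (the inputs and output strings are simplified illustrations, not actual AQS transaction fields):

    # Paraphrase of the decision logic described above. Inputs and outputs
    # are simplified illustrations, not actual AQS transaction fields.

    def disposition(qc_exceedance_valid: bool, compelling_evidence: bool) -> str:
        """Map the review of a critical-criteria QC exceedance to AQS codes."""
        if not qc_exceedance_valid:
            # The QC check itself was judged invalid: report the "1C" null
            # code on the QA transaction instead of an assessment concentration.
            return "QA transaction: null code 1C (invalid QC check) + comment"
        if compelling_evidence:
            # Routine data kept valid despite the exceedance.
            return "Routine data: IV qualifier + comment"
        # Routine data invalidated because of the exceedance.
        return "Routine data: EC null code"

    for valid, evidence in [(False, False), (True, True), (True, False)]:
        print(disposition(valid, evidence))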
Revised Guidance for Self-Implementation of the PEP and NPAP

All primary quality assurance organizations (PQAOs) have the primary responsibility of implementing their own PM2.5 Performance Evaluation Program (PEP) for network bias and National Performance Audit Program (NPAP) for the NAAQS gases; however, the regulations allow PQAOs to defer implementation of either program to EPA using STAG grant funds. Most opt for EPA to handle it. If a PQAO is considering self-implementing the PEP or NPAP, 40 CFR Part 58, Appendix A, Section 2.4 makes clear that the state, local, or tribal program must be "independent and adequate." Independence is defined succinctly in the regulation, but generally means the PEP and NPAP measurements are not performed by the PQAO's monitoring program. Adequacy is more esoteric. The overarching concern is that all data be "comparable," whether generated by the EPA-implemented program or the PQAO's program. To achieve this objective, all implementers should ideally use the same or equally performing equipment, use the same technical procedures, and avoid any additional bias introduced by the analytical methods. Self-implementation requires that PQAOs dedicate resources to the program: independent sampling equipment and flow standards (calibrators), tools and consumable items, NIST-traceable certifications of samplers, and a working laptop computer. Independent lab services require the PQAOs either to construct a separate lab or to subscribe to external sources.

The EPA has previously posted documents on AMTIC that discussed the equipment, procedures, and quality assurance measures that PQAOs should include in self-implemented programs. The NPAP document is dated July 23, 2008, and is available for review on AMTIC. The last edition of the PEP document was 2009, but it has been inadvertently removed from AMTIC; it is available from the Regional PEP leads and the national PEP lead at OAQPS. The 2009 PEP document did not specifically cover the Pb-PEP.

Several factors necessitate a revision of the independence and adequacy guidance. In addition to the disappearance of the 2009 PEP guidance from AMTIC and the absence of specific guidance on the Pb-PEP, two subsequent rulemakings have altered a few requirements in the associated monitoring Federal Reference Methods in 40 CFR Part 50 and the quality assurance requirements in 40 CFR Part 58, Appendix A. In 2017, EPA launched a new set of procedures for managing the logistics of the NPAP annual performance audits and the associated data flow, from the start of the process all the way to the final posting of the audit results in AQS. A similar process will launch for the PM2.5- and Pb-PEPs in January 2018.

We anticipate publication and posting of the revised independence and adequacy guidance for NPAP and PEP by the end of February 2018. For more information, contact crumpler.dennis@epa.gov for PEP and noah.greg@epa.gov for NPAP.

A Few Things in the Works for 2018

As mentioned in a number of articles in this issue, we have some plans for new guidance documents. The following are a few more items we'll be working on in 2018.

QA Handbook

Although the QA Handbook came out in 2017, we will be asking the QA Handbook Workgroup to provide a review and comments on any guidance that needs more detail or anything that we might need to add. We have found that minor edits are needed, and we will also be working on areas that need more clarification in response to the OIG audit. We will be developing a table on AMTIC of any changes that we think need to be made, so monitoring organizations will be able to follow it and update their QA documents as needed.

Continuous PM10 Validation Template

Based on our research, the PM10 continuous monitors are all low-volume instruments, and QC checks like the flow rates are more conducive to the PM2.5 acceptance criteria. We will be revising that template, as well as adding some additional QC criteria to the PM2.5 continuous validation templates based on the introduction of new monitoring technologies. We will let the EPA Regions know when these changes have been made.

Minor Methods Clarification in 40 CFR Part 50

Based on some comments from monitoring organizations and the EPA Regions, we have been working on a technical memorandum with ORD to provide some methods clarifications.
These include:

Allowing lower-concentration standards to be used for calibration of NO2 and SO2 instruments;
Clarification of the calibration acceptance criteria language in the gaseous criteria methods; and
Clarification on the use of the dynamic parameter specifications in the NO2 method.

Data Certification Software

We will be revising the data certification software to identify monitoring organizations with QAPPs that have not been approved within a five-year period. This is a stricter requirement than we currently have in the program and is based on the technical memo dated 7/11/2017 on AMTIC.

National Gravimetric Lab Round Robin Preliminary Results

In the last issue of the QA Eye, we reported that the Mega PE and National Gravimetric Lab Round Robin programs were close to resuming after a two-year hiatus during which OAQPS worked hard to transition the programs. We are happy to report that both programs are now up and running! After successful testing of the PE sampling array (briefly described in Issue 21), the first National Gravimetric Lab Round Robin event began in late September. Laboratories were selected based upon feedback from each Region, which resulted in 22 laboratories participating in the first round. This is a dramatic increase in the number of laboratories that were able to participate previously, and it will enable us to evaluate all of the national gravimetric laboratories on a biennial basis.

For the Gravimetric Round Robin, OAQPS and the 22 participating labs obtained tare weights of five 47 mm Teflon filters. Three of the filters would be sampled, and the remaining two would remain blanks. After all the filters and blanks were returned to OAQPS and reweighed, three sampling events were run to obtain samples with three different loadings, which were targeted based upon network averages. Following sample collection, which ranged from several days to over a week, the samples were weighed by OAQPS and then returned to each participating laboratory for their final weights. Throughout the entire process, all samples were kept at <4°C except during equilibration and weighing.

All the results were in from the participating labs by mid-November, and the samples were reweighed by OAQPS. The resulting average loading was 188 µg (6% RSD) for Event 1, 242 µg (3% RSD) for Event 2, and 488 µg (2% RSD) for Event 3. As expected, the %RSD increased at the lower loading.

Preliminary results have been compiled and are summarized in the figures below. This initial data analysis compares the differences between the OAQPS and participating laboratory results, with the upper and lower bounds set at the 95% confidence interval (CI) from the mean difference. The results will undergo further analysis using the Youden Index (discussed in Issue 16, page 5) to compare laboratories. A final report on this Gravimetric Round Robin is expected to be issued by late January 2018. Sampling for the Mega PE (XRF, OCEC, and IC analyses) is expected to begin by early February. Laboratories included in that study will be limited to those currently used by the CSN and IMPROVE Networks, and labs will only receive PE samples for those analyses they currently provide to the Network.

-Jenia McBrian
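For those interested in reproducing the comparison shown in the figures that follow, here is a minimal sketch of the per-lab difference calculation, with bounds set at the 95% confidence interval of the mean difference as described above; the loadings are hypothetical.

    # Sketch of the round-robin comparison: per-lab difference from the
    # OAQPS weighing, with bounds at the 95% confidence interval of the
    # mean difference. Loadings below are hypothetical, in micrograms.
    import math
    from statistics import mean, stdev
    from scipy import stats

    lab_ug = [187.1, 189.4, 186.0, 190.2, 188.8]     # participating-lab results
    oaqps_ug = [188.0, 188.3, 187.1, 188.9, 188.2]   # OAQPS reweigh results

    deltas = [l - o for l, o in zip(lab_ug, oaqps_ug)]
    n = len(deltas)
    m = mean(deltas)
    half_width = stats.t.ppf(0.975, n - 1) * stdev(deltas) / math.sqrt(n)
    print(f"mean delta = {m:+.2f} ug, "
          f"95% CI = ({m - half_width:+.2f}, {m + half_width:+.2f})")
    for i, d in enumerate(deltas, 1):
        flag = "within" if (m - half_width) <= d <= (m + half_width) else "outside"
        print(f"lab {i}: delta {d:+.2f} ug ({flag} the 95% CI bounds)")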
[Figures: 2017 National Gravimetric Lab Round Robin preliminary results - EPA vs. lab differences with 95% confidence intervals, by laboratory, for Event 1 (average loading 188 µg), Event 2 (average loading 242 µg), and Event 3 (average loading 488 µg).]

EPA-OAQPS, C304-02, RTP, NC 27711
E-mail: papp.michael@epa.gov

The Office of Air Quality Planning and Standards is dedicated to developing a quality system to ensure that the Nation's ambient air data are of appropriate quality for informed decision making. We realize that it is only through the efforts of our EPA partners and the monitoring organizations that this data quality goal will be met. This newsletter is intended to provide up-to-date communications on changes or improvements to our quality system. Please pass a copy of this along to your peers, and e-mail us with any issues you'd like discussed. Mike Papp

Key People and Websites

Since 1998, the OAQPS QA Team has been working with the Office of Radiation and Indoor Air in Montgomery and Las Vegas, and with ORD, in order to accomplish its QA mission. The following personnel are listed by the major programs they implement. Since all are EPA employees, their e-mail address is lastname.firstname@epa.gov. The EPA Regions are the primary contacts for the monitoring organizations and should always be informed of QA issues.

Program | Person | Affiliation
STN/IMPROVE Lab Performance Evaluations | Jenia McBrian | OAQPS
Tribal Air Monitoring | Emilio Braganza | ORIA-LV
Speciation Trends Network QA Lead | Jenia McBrian | OAQPS
OAQPS QA Manager | Joe Elkins | OAQPS
Standard Reference Photometer Lead | Scott Moore | ORD-APPCD
National Air Toxics Trend Sites QA Lead | Greg Noah | OAQPS
Criteria Pollutant QA Lead | Mike Papp | OAQPS
NPAP Lead | Greg Noah | OAQPS
PM2.5 PEP Lead | Dennis Crumpler | OAQPS
Pb PEP Lead | Greg Noah | OAQPS
Ambient Air Protocol Gas Verification Program | Solomon Ricks | OAQPS

Website | URL | Description
EPA Quality Staff | EPA Quality System | Overall EPA QA policy and guidance
AMTIC | http://www3.epa.gov/ttn/amtic/ | Ambient air monitoring and QA
AMTIC QA Page | http://www3.epa.gov/ttn/amtic/quality.html | Direct access to QA programs