U.S. ENVIRONMENTAL PROTECTION AGENCY
OFFICE OF INSPECTOR GENERAL
Improving air quality
Differences in Processing
Practices Could Decrease the
Reliability of Ozone Data Used
for Assessing Air Quality to
Protect Public Health
Report No. 18-P-0105
February 28, 2018

Report Contributors:
Acknowledgements:
Andrew Lavenburg
Wendy Wierzbicki
Geoffrey Pierce
Renee McGhee-Lenart
James Hatfield
Anita Mooney
Jasprit Matta
Abbreviations

AQS		Air Quality System
CFR		Code of Federal Regulations
DEQ		Department of Environmental Quality
DHEC		Department of Health and Environmental Control
DNR		Department of Natural Resources
EPA		U.S. Environmental Protection Agency
NAAQS		National Ambient Air Quality Standards
OAQPS		Office of Air Quality Planning and Standards
OIG		Office of Inspector General
ppb		parts per billion
QA		Quality Assurance
QA Handbook	The EPA's Quality Assurance Handbook for Air Pollution Measurement Systems
QAPP		Quality Assurance Project Plan
QC		Quality Control
TSA		Technical Systems Audit
Cover photos: Air monitor audit being conducted under the National Performance Audit
Program. (EPA, Quality Assurance Handbook for Air Pollution Measurement
Systems. Volume II: Ambient Air Quality Monitoring Program,
EPA-454/B-17-001, January 2017)
Are you aware of fraud, waste or abuse in an
EPA program?
EPA Inspector General Hotline
1200 Pennsylvania Avenue, NW (2431T)
Washington, DC 20460
(888) 546-8740
(202) 566-2599 (fax)
OIG_Hotline@epa.gov
Learn more about our OIG Hotline.
EPA Office of Inspector General
1200 Pennsylvania Avenue, NW (2410T)
Washington, DC 20460
(202) 566-2391
www.epa.gov/oig
Subscribe to our Email Updates
Follow us on Twitter @EPAoig
Send us your Project Suggestions

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
THE INSPECTOR GENERAL
February 28, 2018
MEMORANDUM
SUBJECT:	Differences in Processing Practices Could Decrease the Reliability of Ozone Data Used
		for Assessing Air Quality to Protect Public Health
		Report No. 18-P-0105

FROM:	Arthur A. Elkins Jr.

TO:	William Wehrum, Assistant Administrator
	Office of Air and Radiation

This is a final report on the subject review conducted by the Office of Inspector General (OIG) of the
U.S. Environmental Protection Agency (EPA). The project number for this review was
OPE-FY16-0009. This report contains findings that describe the problems the OIG has identified and
corrective actions the OIG recommends.

The agency agreed with all recommendations and provided planned corrective actions and completion
dates that meet the intent of these recommendations. Therefore, the agency is not required to provide a
written response to this final report. Please update the EPA's Management Audit Tracking System as
you complete the planned corrective actions for the recommendations. Please notify my staff if there is a
significant change in the agreed-to corrective actions. Should you choose to provide a response to this
final report, we will post your response on the OIG's public website, along with our memorandum
commenting on your response. You should provide your response as an Adobe PDF file that complies
with the accessibility requirements of Section 508 of the Rehabilitation Act of 1973, as amended.

We will post this report to our website at www.epa.gov/oig.

Differences in Processing Practices Could Decrease
the Reliability of Ozone Data Used for Assessing
Air Quality to Protect Public Health
18-P-0105
Table of Contents

Chapters

1	Introduction
	Purpose
	Background
	Responsible Office
	Scope and Methodology
	Prior Coverage

2	Varying Ozone Data Processing Practices Pose Risk to Data Reliability
	EPA's Data Validation Requirements and Guidance
	EPA's Recommended Critical Validation Criteria Not Used Consistently
	Ozone Data Adjusted in Manner Inconsistent With EPA's QA Handbook
	Varying Shelter Temperature Criteria Used to Validate Ozone Data
	Different Data Processing Practices Could Decrease Data Reliability
	EPA Has Taken Actions to Assess AQS Data Quality
	Conclusions
	Recommendations
	Agency Comments and OIG Evaluation

3	EPA Oversight Should Be Strengthened to Improve Ozone Data Quality
	EPA Oversight of Ambient Air Monitoring Programs
	EPA Did Not Verify That QAPPs Were Revised in Timely Manner
	Data Certification Reviews Could Be Used to Better Identify Inconsistent Data Processing Practices
	TSAs Did Not Identify Data Processing Practices That Were Inconsistent With EPA Guidance
	EPA Has Improved Oversight, but Further Steps Are Needed
	Conclusions
	Recommendations
	Agency Comments and OIG Evaluation

Status of Recommendations and Potential Monetary Benefits

Appendices

A	Agency Comments on Draft Report and OIG Evaluation
B	Distribution

Chapter 1
Introduction
Purpose
The Office of Inspector General (OIG) for the U.S. Environmental Protection Agency
(EPA) conducted this review to determine whether selected air monitoring data in the
EPA's Air Quality System (AQS) meet criteria established by the EPA. Specifically,
we asked the following questions:
•	Do data revisions comply with EPA criteria?
•	Do data exclusions or gaps comply with EPA criteria?
Background
Ambient air refers to the surrounding outdoor air. Charged with
protecting human health and the environment, the EPA establishes standards that
limit pollution, such as ozone, in the ambient air. To determine compliance with
these standards, air monitoring agencies (i.e., state, tribal and local governments
that operate ambient air monitoring networks) collect data regarding air quality
using air monitoring networks. These networks are composed of individual
monitors and monitoring stations housed in shelters that have been installed at
various sites throughout an area. The EPA uses the data from air monitoring
networks to inform its regulatory decisions about ambient air quality standards.
National Ambient Air Quality Standards
The EPA uses data from state, local and tribal air monitoring networks to
determine whether an area's air quality meets the National Ambient Air Quality
Standards (NAAQS). The EPA sets these air quality standards at a level to protect
public health, including sensitive populations such as the elderly, children and
asthmatics, from the effects of air pollution. Table 1 identifies health effects
associated with ground-level ozone, a major pollutant in ambient air.
Table 1: Health effects of ozone

Short-term health effects:
•	Shortness of breath and pain when taking a deep breath.
•	Coughing and sore or scratchy throat.
•	Inflamed and damaged airways.
•	Increased frequency of asthma attacks.
•	Increased susceptibility to lung infection.

Long-term health effects:
•	Aggravation of asthma; likely to be one of many causes of asthma development.
•	May be linked to permanent lung damage, such as abnormal lung development in children.
•	May increase the risk of death from respiratory causes.

Source: OIG analysis of EPA websites describing the health effects of ozone.
In October 2015, the EPA set the ozone ambient air quality standard at 70 parts
per billion (ppb). To meet Clean Air Act requirements, the EPA was required to
make its initial designation determinations as to whether areas in the United States
meet the 2015 ozone NAAQS by October 1, 2017. The EPA started this
determination process in 2016. This designation process had not been completed
as of February 26, 2018.
An EPA determination that an area's air quality does not meet NAAQS (which is
called a "nonattainment designation") can have significant consequences for that
area and state. If a responsible state or local agency is found to be in
nonattainment, it must develop an implementation plan that identifies enforceable
measures for reducing emissions of the specific criteria pollutant that is in
nonattainment to improve air quality in that area. These measures can include
more stringent permits and emission controls for industry and other sources
within the nonattainment area.
Air Monitoring Databases
The EPA maintains ambient air monitoring data in two databases: AirNow and
AQS. Air monitoring agencies report raw or real-time data to AirNow every hour.
These data are used to report an area's air quality index, which informs the public
of current air quality conditions. Then, monitoring agencies generally have
3 months to review and validate the monitoring data collected before submitting
the data to the AQS. In addition, monitoring agencies must certify once every
year that the ambient air monitoring data are accurate and entered into the
AQS, as required by 40 CFR § 58.15.
The EPA uses the air monitoring data from the AQS to compute yearly design
values for each monitor.1 The EPA uses these design values to make its
designation determinations and to classify nonattainment areas based on the
monitor with the highest design value in an area.
AirNow
>	Collects hourly, real-time and forecasted air quality information to inform the public.
>	Communicates air quality to the public via the air quality index.
>	Includes data that are considered preliminary and that are not used for regulatory decisions.
For more information, visit About AirNow.

AQS
AQS data are used to perform the following tasks:
>	Assess air quality.
>	Assist in attainment and nonattainment designations.
>	Evaluate state implementation plans for nonattainment areas.
>	Perform modeling for permit review analysis and other air quality management functions.
1 The ozone design value is the annual fourth-highest daily maximum 8-hour average concentration for a monitor,
averaged over 3 years.
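Footnote 1's definition lends itself to a short worked computation. The sketch below is a hypothetical illustration, not EPA code: the yearly lists of daily maximum 8-hour averages are invented, and the truncation to whole ppb simplifies the rounding conventions actually specified in 40 CFR Part 50.

    # Hypothetical illustration of the ozone design value in footnote 1:
    # the annual fourth-highest daily maximum 8-hour average, averaged over 3 years.

    def annual_fourth_highest(daily_max_8hr_averages):
        """Return the fourth-highest daily maximum 8-hour average (ppb) for one year."""
        return sorted(daily_max_8hr_averages, reverse=True)[3]

    # Invented daily maximum 8-hour averages (ppb) for one monitor, by year.
    daily_maxes_by_year = {
        2012: [71, 68, 74, 70, 66, 73, 69],
        2013: [69, 72, 67, 75, 70, 68, 71],
        2014: [73, 66, 70, 72, 69, 74, 68],
    }

    fourth_highs = [annual_fourth_highest(v) for v in daily_maxes_by_year.values()]
    design_value = int(sum(fourth_highs) / len(fourth_highs))  # truncate to whole ppb

    print(fourth_highs)   # [70, 70, 70]
    print(design_value)   # 70 ppb for this invented monitor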
EPA Data Processing Requirements and Guidance
Appendix A of 40 CFR Part 58 requires that each air monitoring agency establish
a quality system that provides sufficient information to assess the quality of the
monitoring data. This quality system must include performance requirements for
data precision, bias and completeness. To help the monitoring agencies meet these
requirements, the EPA has established quality assurance (QA) criteria through
both regulation and guidance. These regulations and guidance outline how to
produce comparable data within an acceptable level of data quality for the EPA to
use in making regulatory decisions about air quality.
Validation Criteria
The EPA's 2013 Quality Assurance Handbook for Air Pollution Measurement
Systems (QA Handbook)2 is specifically referenced in appendices to 40 CFR
Part 58 as guidance for air monitoring agencies to use when developing a quality
system for an ambient air monitoring program. This guidance has been updated a
number of times since its issuance, including in 2008, 2013 and 2017;
however, the 2013 edition was the applicable guidance for our review since we
reviewed data from 2012 to 2014 when that version was effective. The QA
Handbook provides guidance for performing quality checks of air monitors and
establishing measurement objectives to validate the data collected by air monitors.
These objectives should be based upon requirements in the Code of Federal
Regulations (CFR), the monitoring agency's QA project plan (QAPP) and
standard operating procedures, and field and laboratory technical expertise.

2 EPA, Quality Assurance Handbook for Air Pollution Measurement Systems. Volume II: Ambient Air Quality Monitoring Program, EPA-454/B-13-003, May 2013.
The EPA outlines criteria that monitoring agencies should use to validate ozone
monitoring data in Appendix D of its QA Handbook. These criteria are referred to
as the "validation criteria." Some validation criteria outlined in the QA Handbook
are required by the CFR, while others are recommended as best practices. The
EPA organizes these validation criteria into three levels based on how significant
the criteria are to overall data quality:
• Critical Criteria: Critical for maintaining integrity of the data.
Observations (i.e., the data collected by a monitor) that do not meet each
and every critical criterion should be invalidated, unless there are
compelling reasons or justifications for not doing so. The QA Handbook
establishes three "critical" quality control (QC) checks that maintain the
integrity of the data collected by an ozone monitor: the zero check, the
one-point QC check and the span check. Although the EPA considers all
three of these criteria to be critical, only the one-point QC check is
required by the CFR.
•	Operational Criteria: Important for maintaining and evaluating the
quality of the data. Violation of an operational criterion may be cause to
invalidate the data, but further investigation is warranted. The QA
Handbook states that the validation decision should consider other QC
information that may or may not indicate that the data are acceptable. An
example of an operational criterion is the temperature of the shelter that
houses the monitor. Monitors are approved for use within certain
temperature ranges, and monitoring agencies review shelter temperature as
part of the data validation process.
•	Systematic Criteria: Important for the correct interpretation of the data,
but do not usually impact the validity of the data. For example, annual
precision, bias and data completion criteria are considered systematic
criteria. If these criteria are not met, the observations are not invalidated,
but the error rate associated with the attainment/nonattainment decision
may be impacted.
To conduct the three critical QC checks, air monitoring agencies regularly test
each air monitor with known, certified concentrations of ozone. The concentration
of ozone used for each test depends on
the critical QC check being performed
(see green box). The air monitoring
agency then compares the monitor's
response (i.e., the ozone concentration
detected and recorded by the monitor) to
the certified test concentration for each
test performed.

EPA's critical QC checks
>	The zero check measures the analyzer's response to zero ozone (0 ppb ozone).
>	The one-point check measures the analyzer's response to the typical ozone concentration at the site (5-80 ppb).
>	The span check measures the analyzer's response to a concentration at the upper range of the analyzer's measurement capability, traditionally at 80-90 percent of operating range, which can be 500 ppb or more.
For each QC check, the EPA allows a
certain degree of difference between the
monitor's response and the certified
concentration. This acceptable difference
is referred to as the "acceptance criteria." The EPA provides recommended
acceptance criteria for each QC check in its QA Handbook. According to the QA
Handbook, if acceptance criteria are exceeded, the data collected by that monitor
from the time of the last acceptable check to the failed check should be
invalidated unless there are compelling reasons and justifications for not doing so.
When data are invalidated, they are not reported to the AQS by the monitoring
agency. Instead, null codes that explain why the data are missing are to be
reported to the AQS. Data that are invalidated are not used to calculate ambient
air averages or design values.
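To make these mechanics concrete, the sketch below shows one hedged reading of a failed QC check and the invalidation it triggers. The 7 percent acceptance criterion and the "NULL" string are illustrative placeholders chosen for the example; actual acceptance criteria and AQS null codes come from the QA Handbook and the agency's approved QAPP.

    # Hedged sketch of a QC check against an acceptance criterion. If the check
    # fails, hours back to the last acceptable check are replaced with a null code.
    # The threshold and null-code string are placeholders, not EPA values.

    ACCEPTANCE_PCT = 7.0   # placeholder acceptance criterion (percent difference)
    NULL_CODE = "NULL"     # placeholder for an AQS null (invalidation) code

    def percent_difference(monitor_response_ppb, certified_ppb):
        return 100.0 * (monitor_response_ppb - certified_ppb) / certified_ppb

    def apply_qc_check(hourly_data, last_ok_index, check_index, response, certified):
        """Null out hours since the last acceptable check if this check fails."""
        if abs(percent_difference(response, certified)) <= ACCEPTANCE_PCT:
            return hourly_data, check_index        # check passed; window advances
        invalidated = list(hourly_data)
        for i in range(last_ok_index, check_index + 1):
            invalidated[i] = NULL_CODE             # report a null code, not a value
        return invalidated, check_index

    hours = [41, 44, 48, 52, 55, 57]               # invented hourly ozone values (ppb)
    hours, _ = apply_qc_check(hours, last_ok_index=2, check_index=5,
                              response=74.0, certified=65.0)  # about 14% high: fails
    print(hours)   # [41, 44, 'NULL', 'NULL', 'NULL', 'NULL']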
Data Adjustments

The QA Handbook states that "based upon validation criteria, the data is either
reported as initially measured or invalidated." The handbook allows daily
adjustments to monitors based on automated zero checks3 but only under certain
circumstances. Adjustments based on automated zero checks are not intended to
correct data previously collected at the monitor, which would be considered post-
processing of the data and is not allowed.

3 According to the QA Handbook, some air monitoring analyzers are capable of periodically conducting regularly scheduled zero- and span-check calibrations and can automatically adjust the monitor readings based on the results of those calibrations.
EPA Oversight
The EPA's Office of Air Quality Planning and Standards (OAQPS) and the EPA
regions provide oversight of the ambient air monitoring systems:
•	OAQPS provides oversight of the national ambient air quality monitoring
network, including (1) managing the AQS database to verify that
monitoring agencies are properly reporting monitoring data; (2) using data
from the AQS to determine whether an area's air quality meets the
NAAQS; (3) issuing and revising guidance documents regarding quality
systems, including the QA Handbook, as needed; and (4) providing
technical assistance to the EPA regional offices and the air pollution
monitoring community.
•	EPA regional offices directly oversee the implementation of the air
monitoring networks located in their regions. This oversight includes
reviewing and approving monitoring agencies' QA and QC procedures,
the ambient air monitoring data, and the QA data that each monitoring
agency submits annually to the EPA as part of its annual data certification
package. Per 40 CFR § 58.15, monitoring agencies are required to submit
their data certification packages by May 1 of each year. In addition, the
regulations require the regions to conduct technical systems audits (TSAs)
of state, local and tribal monitoring agencies at least once every 3 years to
assess their compliance with regulations governing the collection,
analysis, validation and reporting of ambient air quality data.
Responsible Office
The EPA office responsible for implementing the recommendations included in
this report is OAQPS, within the Office of Air and Radiation.
Scope and Methodology
We performed our review from January 2016 through October 2017. We
conducted this performance audit in accordance with generally accepted
government auditing standards. Those standards require that we plan and perform
the audit to obtain sufficient, appropriate evidence to provide a reasonable basis
for our findings and conclusions based on our audit objectives. We believe that
the evidence obtained provides a reasonable basis for our findings and
conclusions based on our audit objectives.
We selected two states in Region 4 (Georgia and South Carolina), one state in
Region 5 (Michigan) and one state in Region 9 (Arizona) for review. We initially
selected Georgia and South Carolina for review due to the volume of differences
we observed in hourly ozone values in AirNow and AQS.4 We then expanded our
review to include Michigan and Arizona because we observed hourly ozone
values from both of these states that were different in AirNow and AQS. In
addition, these states were also included because they were located in different
EPA regions and because some monitoring sites in these states measured ambient
air ozone conditions that were close to the EPA's ambient air ozone standard.
To address our objective regarding data revisions, we reviewed a sample of
hourly ozone data for each of the four states we selected to determine whether
data were adjusted by the monitoring agency prior to reporting the data to AQS.
To address our objective regarding data gaps or exclusions, we reviewed
1,326 instances in three states (Georgia, Michigan and South Carolina) where
hourly averages were not reported to the AQS and were replaced with invalidation
codes (null codes). A relatively small number of data gaps were sampled in
Georgia and South Carolina because we limited our review to data gaps that
resulted in different 8-hour averages among the highest 8-hour average daily
maximums at each site. However, we expanded our review of data gaps in
Michigan to include any day where the AQS did not have an 8-hour average daily
maximum. We focused the data gap sample in Michigan on data reported in 2014
because this was the only year of our data review that would potentially impact
data the EPA will use to make the 2015 ozone NAAQS attainment designations.
We selected six monitoring agencies in our sample of four states for further
review. We conducted site visits at three of these agencies: the Georgia
Department of Natural Resources (DNR), the Michigan Department of
Environmental Quality (DEQ) and the South Carolina Department of Health and
Environmental Control (DHEC). During these site visits, we obtained raw
monitoring data, QA and QC data, and supporting documentation to explain data
differences and gaps. We did not conduct site visits for the three monitoring
agencies we reviewed in Arizona: Arizona DEQ, Maricopa County and Pima
County. However, for all six monitoring agencies, we interviewed staff regarding
data processing policies and procedures, including any data adjustment practices
and data validation criteria. We also reviewed each monitoring agency's QAPP,
standard operating procedures and TSAs.
4 For more details about our AirNow and AQS analyses, see the "Scope and Methodology" section in EPA OIG
Report No. 17-P-0106, Management Alert: Certain State, Local and Tribal Data Processing Practices Could Impact
Suitability of Data for 8-Hour Ozone Air Quality Determinations, issued February 6, 2017.
We also interviewed staff from the EPA's OAQPS regarding air monitoring
regulations, EPA guidance on data processing, and how data in the AQS are used
for attainment designation decisions. We interviewed staff in EPA Regions 3, 4
and 5 to discuss their oversight of air monitoring agencies. In interviews with
Regions 4 and 5, specifically, we discussed the use of AQS data, review and
approval of QAPPs, TSA findings, and review of annual data certifications.
Prior Coverage
During preliminary research for this evaluation, we identified concerns with the
data processing practices of two monitoring agencies, and we issued a
management alert report to notify the EPA of a potential risk in using this ozone
data to make its designation determinations regarding compliance with the
2015 NAAQS. Our management alert report, EPA OIG Report No. 17-P-0106,
Management Alert: Certain State, Local and Tribal Data Processing Practices
Could Impact Suitability of Data for 8-Hour Ozone Air Quality Determinations,
was issued on February 6, 2017. The report did not have any recommendations
for the agency; however, the EPA issued a response to the management alert
report and proposed several corrective actions.
Chapter 2
Varying Ozone Data Processing Practices
Pose Risk to Data Reliability
Certain air monitoring agencies employed ozone data processing practices that
were not consistent with EPA-recommended practices or with each other:
•	Three of the six monitoring agencies we reviewed did not consistently
use the EPA's recommended critical validation criteria before reporting
ozone data to the AQS. Managers at two of these agencies told us that the
criteria were not required or not needed to meet regulatory requirements.
•	Three of the six monitoring agencies we reviewed revised or adjusted
ozone data reported to the AQS from 2012 to 2014 using processes that
were not consistent with the EPA's QA Handbook. Managers at these
agencies told us that they thought their practices improved data accuracy.
•	Data gaps or invalidated data in the sample we reviewed were generally
supported and in accordance with EPA guidance. However, monitoring
agencies applied varying shelter temperature range criteria for data
validation.
The EPA's oversight controls did not always identify when validation and
adjustment practices were inconsistent with the EPA's QA Handbook (see
Chapter 3 for more details). Variation in monitoring agencies' data processing
practices could result in data quality uncertainty and could decrease the reliability
of the data used to make decisions regarding NAAQS compliance.
EPA's Data Validation Requirements and Guidance
Pursuant to Appendix A of 40 CFR Part 58, the failure to conduct or pass a
required QC check does not by itself invalidate data for regulatory decision-
making. Instead, the EPA's ambient air monitoring regulations require that the
EPA and monitoring agencies use a "weight of evidence" approach to
determine the data's suitability for regulatory decision-making, such as
determining compliance with the NAAQS. The regulation states that using the
data validation criteria approved in an air monitoring agency's QAPP is the basis
for this approach.
Appendix D of the EPA's QA Handbook contains data validation templates for all
criteria pollutants. These templates were initially developed in 1998 by a
workgroup composed of personnel from air monitoring agencies, EPA regional
offices and OAQPS. The EPA recommends invalidating observations that do not
meet the critical criteria provided in this appendix, stating that such observations
are invalid unless proven otherwise.
Air monitoring agencies are required to follow EPA regulations. Air monitoring
agencies are encouraged—but not required—to follow the guidance provided in
the EPA's QA Handbook. To distinguish regulatory requirements from guidance
in the QA Handbook, the EPA defines and uses the following terms:
•	Shall and must when the element is required by statute and regulation.
•	Should when the element is recommended to help establish or improve the
quality of data or a procedure. If this element is not followed, an alternate
procedure that meets the intent of the guidance should be developed.5
•	May when the element is optional.
According to the QA Handbook, observations that do not meet each and every
critical criterion should be invalidated unless there are compelling reasons and
justifications for not doing so.
EPA's Recommended Critical Validation Criteria Not Used Consistently
Three of the six monitoring agencies we reviewed used less stringent criteria for
validating ozone data than the criteria recommended by the EPA. Further, none of
these three agencies had incorporated all of the EPA's critical validation criteria
(zero, span and one-point QC checks) into their EPA-approved QAPPs.
Of the three critical QC checks listed in the QA Handbook, only the one-point QC
check is required by regulation (40 CFR Part 58, Appendix A) to assess data
quality. Per Appendix A of 40 CFR Part 58, the one-point QC checks are used by
the EPA to annually assess the data quality from each monitoring site and agency
to determine whether the data meet the designated regulatory goals.6 According to
an EPA QA staff person, the CFR does not mandate that air monitoring agencies
use the one-point QC check to validate or invalidate hourly data points; however,
the QA Handbook recommends using the one-point QC for this purpose.
According to staff and managers at some air monitoring agencies we spoke to, the
air monitoring agencies are not compelled to implement this approach because it
is included in guidance and not required by regulation.
5 The EPA's direction to develop an alternative procedure when a recommended element is not followed was added as part of the EPA's January 2017 revision to its QA Handbook.

6 The regulatory goals for ozone data can be found in 40 CFR Part 58, Appendix A, § 2.3.1.2.

Table 2 provides the results of our review of the six monitoring agencies'
implementation of the EPA's critical data validation criteria.
Table 2: Application of critical criteria for ozone data at six monitoring agencies

Monitoring agency     | Zero check | One-point check | Span check | QAPP consistent with guidance?
Arizona DEQ           | Yes        | Yes             | Yes        | Yes
Georgia DNR           | No         | No              | No         | No
Maricopa County       | Yes        | Yes             | Yes        | No a
Michigan DEQ          | No         | Yes             | No         | No
Pima County           | Yes        | Yes             | Yes        | No a
South Carolina DHEC   | No         | No              | No         | No

The zero-check, one-point-check and span-check columns show whether each critical criterion was implemented in accordance with the applicable EPA QA Handbook guidance; the final column shows whether the agency's approved QAPP established critical criteria consistent with that guidance.
Source: OIG analyses of air monitoring agencies' implementation practices, QAPPs and the EPA's 2013 QA Handbook.
a Although Maricopa and Pima Counties implemented the EPA's recommended critical criteria, their QAPPs did not contain zero-check criteria that were consistent with the validation criteria in the EPA's 2013 QA Handbook.
State monitoring staff and one state manager provided us with several reasons
why they did not use the critical criteria checks recommended by the EPA's
guidance. For example, a South Carolina DHEC manager said that DHEC's
monitoring network could meet the EPA's regulatory QA requirements without
adopting the EPA's recommended criteria. Georgia's DNR staff stated that they
interpreted the following statement in the 2008 and 2013 versions of the EPA's
QA Handbook as allowing them to use zero- and span-check acceptance criteria
that were less stringent than those recommended by the EPA:
Cumulative drifts of up to 15 percent of full scale from the original
or nominal zero and span values may not be unreasonable, subject
to [certain limitations].
Georgia DNR staff also noted that some recommended critical criteria are not
found in regulation.
In the 2017 update of the QA Handbook, the EPA removed the statement
referenced by Georgia. In addition, subsequent to our review of South Carolina
DHEC, the state's DHEC management told us that the monitoring agency started
using the EPA's recommended one-point QC check acceptance criterion in
January 2017 and had recertified ozone data from 2012 through 2016 using this
acceptance criterion.
Ozone Data Adjusted in Manner Inconsistent With EPA's QA Handbook
In our February 2017 management alert report, we informed the EPA that
monitoring agencies in Georgia and South Carolina had adjusted ozone data
reported to the AQS. These adjustments were based on the results of zero checks
and were conducted in a manner that was inconsistent with the EPA's QA
Handbook. While completing this current report, we also found that Michigan
DEQ adjusted ozone data based on the results of zero checks in a manner that was
inconsistent with the EPA's QA Handbook. Table 3 summarizes the results of our
review and the impact of the zero-check data adjustments.
Table 3: Monitoring agency ozone data adjustment practices

Monitoring agency     | Zero-check adjustments consistent with QA Handbook? | Examples of the extent of adjustments to data reviewed by OIG
Arizona DEQ           | Not applicable; the agency does not adjust data.    | Not applicable
Georgia DNR a         | No                                                  | Hourly ozone values adjusted by as much as 5 ppb from raw values
Maricopa County       | Not applicable; the agency does not adjust data.    | Not applicable
Michigan DEQ          | No                                                  | Hourly ozone values adjusted by as much as 8 ppb from raw values
Pima County           | Not applicable; the agency does not adjust data.    | Not applicable
South Carolina DHEC   | No                                                  | Hourly ozone values adjusted by as much as 5 ppb from raw values

Source: OIG analysis.
a Georgia DNR applied a zero-check adjustment process to data we reviewed that were reported to the AQS from 2012 through 2014. Georgia DNR stopped its zero-check adjustment practice in June 2015 and no longer adjusts data reported to the AQS.
Monitoring agency staff in Georgia, Michigan and South Carolina told us that
they thought the zero-adjustment practice improved the accuracy of the data
reported to the AQS. In addition, Michigan DEQ staff told us that they had been
performing zero-adjustments for a long time and that Region 5 had never
identified it as a problem.
Varying Shelter Temperature Criteria Used to Validate Ozone Data
Monitoring agencies applied varying shelter temperature criteria to validate ozone
data. For example, one agency applied the same shelter temperature criteria to all
monitors, while other agencies used shelter temperature criteria that were specific
to the monitors used at each monitoring station.
The QA Handbook states that it is important to maintain each shelter at
temperatures that accommodate the most temperature-sensitive instrument in the
shelter. The EPA's QA Handbook recommends the acceptance criterion for the
hourly average shelter temperature range to be 20 to 30 degrees Celsius "or per
the monitor manufacturer's specifications 'if designated to a wider temperature
range'" (emphasis added).7 The general acceptance range of 20 to 30 degrees
Celsius specified in the QA Handbook is based on the temperature range that the
EPA uses to test monitors.8 However, once the EPA approves the use of a
monitor, the EPA publishes a Notice of Designation in the Federal Register,
which sets forth the requirements for how a monitor is operated. This designation
may allow the monitor to operate within a wider temperature range than 20 to
30 degrees Celsius.

7 The QA Handbook references the EPA's list of approved monitoring methods with instrument-specific temperature ranges, which can be found in the List of Designated Reference and Equivalent Methods document on the EPA's "Air Monitoring Methods-Criteria Pollutants" webpage.

8 Per 40 CFR Part 58, a criteria pollutant monitoring method used for making NAAQS decisions must be approved by the EPA as a "federal reference method" or "federal equivalent method." The QA Handbook states that federal reference method and federal equivalent method testing is required to be conducted in the 20 to 30 degrees Celsius range.
The monitoring agencies we reviewed used different criteria to assess the validity
of the data collected based on the shelter temperatures. For example, the Michigan
DEQ's 2012 and 2014 QAPPs specified a shelter temperature acceptance range of
18 to 32 degrees Celsius. When shelter temperatures were outside this range,
Michigan DEQ invalidated all data collected during those periods. However,
Michigan DEQ used monitors that are allowed, per the published Notice of
Designation, to operate at a wider temperature range: 5 to 40 degrees Celsius.
Based on our review of 69 hours of ozone data that were invalidated by Michigan
DEQ, at no time were the hourly shelter temperatures outside the wider operating
range approved by the EPA.
Conversely, the three monitoring agencies we reviewed in Arizona established a
shelter temperature range of 20 to 30 degrees Celsius in their respective QAPPs,
with two of the agencies' QAPPs stating that manufacturer specifications could be
used if designated to a wider range. According to the personnel at these
monitoring agencies, they validate ozone data using the manufacturer's
designated operating range as an acceptance criterion instead of applying the 20 to
30 degrees Celsius range to all monitors.
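To illustrate how the criterion chosen decides validity, the sketch below checks the same invented hours against both the general 20 to 30 degrees Celsius range and a monitor-specific designated range of 5 to 40 degrees Celsius (the range cited above for Michigan DEQ's monitors). The data and function names are ours.

    # Illustration of shelter temperature validation under two criteria:
    # the general 20-30 C testing range versus a monitor's designated 5-40 C range.

    GENERAL_RANGE = (20.0, 30.0)
    DESIGNATED_RANGE = (5.0, 40.0)

    def valid_for_temperature(shelter_temp_c, temp_range):
        low, high = temp_range
        return low <= shelter_temp_c <= high

    hours = [(66, 24.5), (71, 31.2), (76, 33.0)]   # invented (ozone ppb, shelter C)

    for ozone, temp in hours:
        print(f"{ozone} ppb at {temp} C -> general: "
              f"{valid_for_temperature(temp, GENERAL_RANGE)}, designated: "
              f"{valid_for_temperature(temp, DESIGNATED_RANGE)}")
    # The 31.2 C and 33.0 C hours fail the fixed 20-30 C test but pass the
    # monitor's designated 5-40 C range, so the criterion chosen decides validity.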
In general, data invalidation due to shelter temperatures should be infrequent if
monitoring shelters have adequate cooling and heating systems and are properly
maintained. It is important for monitoring agencies to keep shelter temperatures
within an acceptable operating range, especially on hot summer days, which are
generally associated with high ozone values and can cause excessively high
shelter temperatures. Invalidating data on these days increases the risk that
potentially high and unsafe ozone levels are not recorded and used to assess air
quality safety. Further, invalidating data collected when shelter temperatures are
within the monitor's designated operating range adds to that risk. The EPA should
clarify its guidance concerning how shelter temperature should be considered
within the data validation process so that agencies consistently use the appropriate
shelter temperature criteria for each specific monitor.
Different Data Processing Practices Could Decrease Data Reliability
The use of differing data processing practices increases the risk that data reported
to the AQS are not comparable. According to the EPA's QA Handbook,
comparability is a measure of the confidence with which one data set or method
can be compared to another. The EPA states in its QA Handbook that the
comparability of data sets is critical to evaluating their uncertainty and usefulness.
When the comparability of data is affected by different data processing practices,
it could decrease the reliability of the data for certain uses.
To illustrate the impact that different data processing practices could have on the
data reported to the AQS, we created a set of hourly ozone data and then
processed the data using the different practices that we identified during our
review. Specifically, we applied the following three zero-check adjustment
practices to 24 hours of hourly ozone data:
•	Not adjusting for zero-check results within the EPA's recommended
acceptance criteria. This is the practice recommended by the EPA's QA
Handbook.
•	Adjusting each hourly value by the same amount, based upon the zero-
check result at the start of the day.
•	Adjusting each hourly value by an incremental amount, based upon the
difference between the zero check at the start of the day and the zero
check at the end of the day. Each hour is adjusted incrementally based on
the difference between two zero checks divided by the number of hours
between the zero checks.
The data in Table 4 demonstrate how these different daily zero-adjustment
practices could cause the monitoring agencies to report different values to the
AQS, even though the raw monitoring and QC data were the same. Our example
illustrates zero-check results where adjustment procedures caused the adjusted
data to be lower than the raw values recorded by the monitor. However, under
different circumstances, the inverse is also possible, and adjusted values could be
higher than raw values.
Table 4: How different zero-check adjustment processes could affect reported ozone data

Time            | Recorded by monitor (ppb) | No adjustment | Same-amount adjustment | Incremental adjustment
11 a.m.         | 66                        | 66            | 65                     | 64
12 p.m. (noon)  | 71                        | 71            | 70                     | 68
1 p.m.          | 76                        | 76            | 75                     | 73
2 p.m.          | 77                        | 77            | 76                     | 74
3 p.m.          | 80                        | 80            | 79                     | 77
4 p.m.          | 81                        | 81            | 80                     | 78
5 p.m.          | 79                        | 79            | 78                     | 76
6 p.m.          | 79                        | 79            | 78                     | 76
8-hour average  | 76                        | 76            | 75                     | 73

The last three columns show the hourly values (ppb) reported in the AQS if: not adjusting for zero-check results; adjusting each value by the same amount based on the previous zero-check result; or adjusting each value by an incremental amount based on the difference between two zero-check results.
Source: OIG analysis.
As illustrated in Table 4, different data adjustment practices can decrease the
reliability of the data reported to the AQS. For example, using a common set of
raw data, these three practices produced three different 8-hour averages to be
reported to the AQS: 76 ppb, 75 ppb and 73 ppb. In this example, the unadjusted
data resulted in an 8-hour average of 76 ppb, which would have exceeded the
EPA's 2008 ozone NAAQS of 75 ppb. However, the 8-hour averages for the two
adjusted sets of data in the example did not exceed this standard. The data quality
adjustment process used may therefore directly affect the EPA's determination
regarding whether the air is healthy or unhealthy.
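The arithmetic behind Table 4 can be reproduced in a few lines. In the sketch below, the zero-check readings (+1 ppb at the start of the day and +4 ppb at the end, 24 hours apart) are assumptions we chose so that the script reproduces the table's columns; rounding adjusted values to whole ppb is likewise an assumption about reporting conventions.

    # Sketch reproducing Table 4's three zero-check adjustment practices.
    # Zero-check readings and rounding behavior are assumptions, not agency data.

    raw = {11: 66, 12: 71, 13: 76, 14: 77, 15: 80, 16: 81, 17: 79, 18: 79}
    zero_start, zero_end = 1.0, 4.0    # assumed zero-check readings (ppb)
    hours_between_checks = 24          # assumed: checks at midnight and midnight

    no_adjust = dict(raw)                                              # practice 1
    same_amount = {h: round(v - zero_start) for h, v in raw.items()}   # practice 2
    drift_per_hour = (zero_end - zero_start) / hours_between_checks
    incremental = {h: round(v - (zero_start + drift_per_hour * h))     # practice 3
                   for h, v in raw.items()}

    for label, series in [("none", no_adjust), ("same", same_amount),
                          ("incremental", incremental)]:
        avg = round(sum(series.values()) / len(series))
        print(label, list(series.values()), "8-hour average:", avg)
    # none        [66, 71, 76, 77, 80, 81, 79, 79] 8-hour average: 76
    # same        [65, 70, 75, 76, 79, 80, 78, 78] 8-hour average: 75
    # incremental [64, 68, 73, 74, 77, 78, 76, 76] 8-hour average: 73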
The application of different validation practices can also produce different 8-hour
averages using the same set of raw monitoring data and QC check results. Based
on the validation practices used, some agencies would accept the data, while others
would reject and invalidate the data. Since the ozone standard is based on an
8-hour average, differing data processing practices could impact the EPA's design
value calculations, which are used to determine compliance with the ozone
NAAQS.
Risk That Other Air Monitoring Agencies Apply Different Data
Processing Practices and Have Outdated QAPPs
While our review focused on the data processing practices of six monitoring
agencies, we found data indicating a risk that other monitoring agencies are not
implementing the EPA-recommended data processing practices. Based on our
analysis, about 26 percent of the AQS hourly ozone data differed from the
corresponding real-time data reported in AirNow. There are a number of reasons
for such differences. For example, monitoring agencies could find certain data
reported in real time to AirNow to be invalid and, therefore, would not report the
data to the AQS. Further, monitoring agencies could apply different conventions
for rounding or truncating raw data before reporting to either database. However,
because we confirmed that at least some of these differences were due to data
adjustment practices, there is a risk that other monitoring agencies could have
made adjustments to the raw monitoring data before they were reported to the
AQS. These adjustments can impact the EPA's ability to assess data quality and
to determine whether the data are reliable for making designation decisions.
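The comparison underlying the 26 percent figure can be sketched as follows. This is an illustrative reconstruction, not the OIG's actual analysis code: it pairs hourly values by monitor and hour, using invented data, and counts the share that differ between the two databases.

    # Illustrative AirNow-vs-AQS comparison on invented data. Keys are
    # (monitor_id, hour) pairs; values are hourly ozone in ppb (None = missing).

    airnow = {("M1", 11): 66, ("M1", 12): 71, ("M2", 11): 54, ("M2", 12): None}
    aqs    = {("M1", 11): 65, ("M1", 12): 71, ("M2", 11): 54, ("M2", 12): 50}

    shared = set(airnow) & set(aqs)
    differing = [key for key in shared if airnow[key] != aqs[key]]

    pct = 100.0 * len(differing) / len(shared)
    print(f"{len(differing)} of {len(shared)} shared hours differ ({pct:.0f}%)")
    # A difference can reflect legitimate invalidation or rounding conventions,
    # so a high rate is a flag for review, not proof of improper adjustment.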
Air monitoring agencies develop QAPPs that should identify their QA and QC
procedures and data validation criteria. The EPA's critical criteria for zero checks
changed significantly in 2013 and then again in 2014. However, our analysis of
data from the EPA's AQS website shows that 58 percent of monitoring agencies
have ozone monitoring QAPPs that were approved before 2014. Thus, there is a
risk that those QAPPs do not include the EPA's revised critical criteria.
EPA Has Taken Actions to Assess AQS Data Quality
During our review, OAQPS initiated several corrective actions to address the QA
concerns outlined in our February 2017 management alert report and in this
report:
•	Revising the QA Handbook. OAQPS revised its QA Handbook in
January 2017 to clarify its guidance on the zero-adjustment (Section 10.4)
and data-validation processes (Section 17 and Appendix D). In addition, in
May 2017, OAQPS posted a technical note to the EPA website alerting air
monitoring agencies to the appropriate practice for conducting zero-check
adjustments.
•	Reviewing data that failed one-point QC checks. OAQPS sent a
memorandum in April 2017 to EPA regions, directing them to begin
reviewing and invalidating monitoring data when acceptance criteria for
one-point QC checks were exceeded and when the air monitoring agency
did not have compelling evidence to support the data's validity. OAQPS
stated that it is providing monitoring agencies with the flexibility to
determine data validity in cases where their QAPPs provided less stringent
acceptance criteria than the EPA's recommended criteria.
•	Reviewing the impact of data adjustments on attainment decisions. In
November 2016, OAQPS began reviewing hourly ozone data from 2012
through 2015 in AirNow and the AQS to determine the risk of data
adjustments impacting the data that could be used in the EPA's
designation determinations for the 2015 ozone NAAQS. In July 2017,
OAQPS provided an updated analysis, which included ozone monitoring
data through 2016. However, the analysis did not address how zero-check
adjustment practices may impact design values or designation
determinations for specific locations. An OAQPS manager told us that if
OAQPS had concerns about the quality of data from a particular
monitoring site, such an analysis would be considered on a case-by-case
basis.
Conclusions
Variation in the data adjustment practices and data validation criteria used by
monitoring agencies can lead to data quality uncertainty and decrease the
reliability of data used to make decisions regarding NAAQS compliance. In
addition, the comparability of monitoring data can be impacted if monitoring
agencies implement quality systems with different data validation and processing
practices. Thus, improved EPA oversight of monitoring agencies is needed to
reduce the risk that monitoring agencies inconsistently apply data processing
practices and report unreliable data to the AQS.
Consistent implementation of data processing and validation practices results in
comparable data and provides better assurances that the data used to determine
whether air quality meets the EPA's health-based standards are reliable and of
sufficient quality. The EPA has initiated actions to correct the inconsistencies in
how monitoring agencies process ozone data.
Recommendations
We recommend that the Assistant Administrator for Air and Radiation:
1.	Assess the risk of any data adjustments impacting the ozone data used in
the EPA's National Ambient Air Quality Standards designation
determinations.
2.	Issue guidance clarifying the shelter temperature criteria that should be
used during data validation.
Agency Comments and OIG Evaluation
The agency concurred with the recommendations and provided acceptable planned
corrective actions and completion dates. Recommendations 1 and 2 are resolved. In
addition to a response to our recommendations, the agency provided technical
comments on the draft report. Based on the agency response and technical
comments received, we made revisions to the report where appropriate.
Appendix A contains the agency's response to the draft report, the OIG's
evaluation of the agency's response, the agency's technical comments, and the
OIG's response to each technical comment.
Chapter 3
EPA Oversight Should Be Strengthened to Improve
Ozone Data Quality
The EPA's oversight of the air monitoring agencies' quality systems should be
strengthened to improve the agencies' data processing practices. Specifically, we
found the following issues:
•	EPA regions did not verify that the monitoring agencies' QAPPs were up
to date and included the EPA's recommended data processing practices.
•	The EPA's annual data certifications did not incorporate data for two of
the three critical validation criteria recommended by the EPA.
•	The EPA's TSAs did not always identify or resolve the use of data
processing practices that did not follow the EPA's recommended
practices.
The quality and reliability of monitoring data could be impacted if monitoring
agencies implement quality systems with different data processing practices. The
EPA's oversight of monitoring agencies' quality systems and practices is an
important function to facilitate consistent application of data processing practices
and to reduce the risk that monitoring data submitted to the EPA is unsuitable for
use in attainment decisions.
EPA Oversight of Ambient Air Monitoring Programs
The EPA's OAQPS and regional offices oversee ambient air monitoring programs.
The responsibilities of OAQPS and EPA regional offices are listed in Table 5.
Table 5: Summary of OAQPS and regional oversight responsibilities

OAQPS responsibilities:
•	Develop a satisfactory quality system for the ambient air quality monitoring network.
•	Ensure that the methods and procedures used in making air pollution measurements are adequate to meet program objectives and that the resulting data are of appropriate quality.
•	Perform data quality assessments of monitoring agencies making air pollution measurements.
•	Ensure that guidance pertaining to the QA aspects of the ambient air monitoring program is written and revised as necessary.
•	Render technical assistance to the EPA regional offices and the air pollution monitoring community.

Regional office responsibilities:
•	Distribute and explain technical and QA information to monitoring agencies.
•	Alert EPA headquarters to QA needs of monitoring agencies that are "national" in scope.
•	Confirm that monitoring agencies have approved QAPPs prior to routine monitoring.
•	Provide monitoring agency personnel with knowledge of QA regulations and with adequate technical expertise to address air monitoring and QA issues.
•	Evaluate the capabilities of monitoring agencies to measure air pollutants by implementing network reviews and TSAs.
•	Assess the quality of data submitted by monitoring agencies.

Source: OIG analysis of the EPA's QA Handbook.
Annual QAPP Reviews
Appendix A of 40 CFR Part 58 requires that monitoring agencies implement a
quality system that provides sufficient information to assess the quality of the
monitoring data. Each monitoring agency must describe its quality system in a
quality management plan and a QAPP. A QAPP outlines the required procedures
for providing monitoring data that are of adequate quality, meet statutory
requirements and comply with applicable standard specifications.
According to EPA guidance, the EPA Project Manager (or authorized
representative) should review QAPPs at least annually.9 When revisions are
necessary, the QAPP must be revised and submitted to the EPA for review and
approval. Revisions to a QAPP are necessary when a reviewing official
determines that a substantive change (i.e., a change impacting the technical and
quality objectives of the project) is needed. In addition, EPA staff told us that a
change to national policy or guidance, such as a revision to the QA Handbook,
may warrant revisions to QAPPs. In addition, the 2017 QA Handbook
recommends that QAPPs be updated and resubmitted to the EPA at least once
every 5 years.

9 EPA, EPA Requirements for Quality Assurance Project Plans (QA/R-5), EPA/240B-01/003, March 2001 (reissued May 2006).
In certain instances, EPA regions can grant monitoring agencies the authority to
self-approve QAPPs. In these cases, EPA staff may not be involved in the review
and approval of the QAPPs, which means they would not see any revised QAPPs
until the next TSA. However, an OAQPS staff person expressed concern about
the EPA's ability to properly oversee monitoring agencies without reviewing the
QAPPs to verify that the agencies meet the regulatory requirements. The EPA
revised 40 CFR Part 58, Appendix A, in March 2016 to require that self-
approving monitoring agencies submit a copy of their QAPPs to the EPA to
identify and correct any inaccuracies.
Annual Data Certification Reviews
Per 40 CFR Part 58, each monitoring agency is required to submit all ambient air
quality data and associated QA data to the AQS quarterly in accordance with the
AQS Data Coding Manual and the monitoring agency's QAPP. Additionally, the
regulations require each monitoring agency to annually certify the following
statements:
•	Data collected at its monitoring sites meet the criteria in 40 CFR Part 58,
Appendix A.
•	Concentration data are accurate to the best of its knowledge.
•	Concentration data and QA data are completely submitted to the AQS.
To meet these regulatory requirements, monitoring agencies must submit an
annual data certification letter, an annual summary report of the air quality data
collected for the year, and a summary of precision and accuracy data by May 1 of
each year. EPA regional offices review and evaluate each monitoring agency's
annual data certification package to confirm that the EPA has no reservations
about data quality. The regions review the certification letters, the summary
reports, the completeness of QA data submitted to the AQS, the resulting quality
statistics, and the days with the highest reported concentrations.
In addition, the EPA has developed a Data Evaluation and Concurrence Report
(AMP 600) that pulls data quality information already reported by the monitoring
agencies to the AQS. This report summarizes various QC data in the AQS and
flags whether the EPA has concurred that the data are of suitable quality.
TSAs
Appendix A of 40 CFR Part 58 requires the EPA regional offices to conduct a
TSA at each monitoring agency at least once every 3 years and to report the TSA
results to the AQS. A TSA is an on-site review and inspection of a monitoring
organization's ambient air monitoring program to assess its compliance with
established regulations governing the collection, analysis, validation and reporting
of ambient air quality data. The QA Handbook states that TSAs are designed to
address and report on an organization's field operations, laboratory operations,
QA and QC processes, data management, and reporting.
As part of the quality portion of the audit, EPA regional staff are expected to
review the monitoring agency's most recent QAPP to determine when it was
approved; review data handling procedures, including verifying that QC checks
are conducted properly and documented; and select a portion of the data for a data
quality audit. The results of each TSA are reported to the monitoring agency, and
agencies are required to take any necessary corrective action.
EPA Did Not Verify That QAPPs Were Revised in Timely Manner
Five of the six monitoring agencies we reviewed had QAPPs that had not been
fully approved since 2014. For example, the EPA last approved ozone monitoring
QAPPs for Georgia and South Carolina in 2007. The Michigan DEQ's QAPP was
approved in 2014, but it did not contain validation criteria that were consistent
with the criteria recommended in the EPA's QA Handbook.
OAQPS informed us that Region 4 had conditionally approved the Georgia
DNR's QAPP in December 2016. According to EPA staff, the Georgia DNR
received a conditional approval because the monitoring agency was not willing to
incorporate measurement quality objectives into its QAPP that were consistent
with the EPA's QA Handbook.
The EPA can identify QAPPs that are outdated by reviewing data in the AQS
database and by running an AQS management report (the AMP 600 report) that
identifies the date of the last EPA-approved QAPP. However, the report does not
generate a nonconcurrence flag unless the QAPP is more than 10 years old or has
never been approved. The EPA can also use TSAs to identify when
QAPPs are outdated and request that monitoring agencies take corrective action.
Based on the QAPPs we reviewed, we also found that the EPA is not verifying
whether QAPPs are updated with current QA requirements and guidance. This
lapse could directly impact the data validation criteria used by monitoring
agencies. If monitoring agencies are conducting CFR- and QA Handbook-
compliant QA activities, as stated in their QAPPs, then the legal defensibility of
the data is enhanced. As noted in Appendix A of 40 CFR Part 58, data validation
criteria should be established in a monitoring agency's approved QAPP.
Therefore, it is important that the EPA effectively use its available oversight
mechanisms to verify that QAPPs reflect current regulatory requirements and
EPA recommendations for QA and data validation.
Data Certification Reviews Could Be Used to Better Identify
Inconsistent Data Processing Practices
The EPA requires monitoring agencies to report one-point QC check results in the
AQS, but the EPA does not require monitoring agencies to report data for the
other two critical QC checks: zero checks and span checks. The EPA uses the
one-point QC check data in the AQS to conduct its annual certification reviews of
the monitoring agencies. These reviews could be enhanced if all critical QC data
were reported to the AQS.
A key tool that the EPA uses to review the annual certification packages is the
AMP 600 report housed in the AQS. The AMP 600 report calculates data quality
statistics for each monitor using the results from the one-point QC checks. This
report generates either a green (acceptable), yellow (warning) or red
(nonconcurrence) color code for each monitor based on the EPA identified ranges.
A red flag for any monitor will elicit an AQS recommendation of
nonconcurrence, indicating that issues regarding the quality of the data cannot be
resolved. Three yellow warning flags for any one monitor will also result in a
nonconcurrence flag. However, without data in the AQS for two critical criteria,
the EPA can only evaluate data quality based upon the results of the one-point QC
check. The benefits of the AMP 600 report could be improved if it included the
additional QC data from the zero and span checks.
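The concurrence logic described above reduces to a simple rule. The sketch below paraphrases only the decision rule stated in this report (one red flag, or three yellow flags on a monitor, yields a nonconcurrence recommendation); how the AMP 600 report derives each flag from one-point QC statistics is not modeled here.

    # Paraphrase of the AMP 600 concurrence rule as described in this report.

    def amp600_recommendation(flags):
        """flags: list of 'green', 'yellow' or 'red' codes for one monitor."""
        if "red" in flags:
            return "nonconcurrence"
        if flags.count("yellow") >= 3:
            return "nonconcurrence"
        return "concurrence"

    print(amp600_recommendation(["green", "green", "yellow"]))    # concurrence
    print(amp600_recommendation(["yellow", "yellow", "yellow"]))  # nonconcurrence
    print(amp600_recommendation(["green", "red"]))                # nonconcurrence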
The monitoring agencies we visited maintained the zero- and span-check results.
In addition, as noted in Chapter 1, the EPA's QA Handbook clearly states that
zero, span and one-point QC checks for ozone are all critical to maintaining the
integrity of the ambient air data. However, monitoring agencies may not be
willing to submit data associated with zero and span checks or to have the quality
of their monitoring data assessed against these criteria because they are not
required by regulation.
TSAs Did Not Identify Data Processing Practices That Were
Inconsistent With EPA Guidance
The TSAs of the air monitoring agencies we reviewed did not always identify or
resolve the use of critical criteria and data adjustment practices that did not follow
the EPA's recommended practices. The EPA should use TSAs to identify and
oversee these types of practices. During a TSA, EPA regions conduct an on-site
review of quality systems in place at monitoring agencies. The EPA's TSA
checklist, which is sent to agencies to complete prior to the TSA, includes
questions that could help identify zero-adjustment practices and whether data
validation criteria vary from the EPA's recommended practices. EPA regional
staff must follow up on each of these questions during a TSA to fully understand
how data are processed.
Specifically, we found the following issues:
•	EPA Region 4 identified zero-adjustment practices at the Georgia DNR in
both the 2011 and 2014 TSAs, but the region stated in both TSAs that the
practices were allowed. However, the 2014 TSA should have noted that
the practice did not follow the recommended practices outlined in the
2013 version of the EPA's QA Handbook.
•	EPA Region 4 TSAs did not identify the South Carolina DHEC's zero-
adjustment practices.
•	EPA Region 5 did not identify the Michigan DEQ's zero-adjustment
practices in the 2014 TSA. A Michigan DEQ manager told us that staff
had been performing zero adjustments for a long time and that Region 5
had never identified it as a problem.
•	EPA Region 4 noted in its 2015 TSA of the South Carolina DHEC that the
state's ozone validation criteria did not conform to the QA Handbook.
However, the issue was not resolved until January 2017. South Carolina
DHEC management told us that it adopted the EPA's recommended
validation criteria for one-point QC checks after Region 4 issued a
November 2016 memorandum proposing that all monitoring agencies in
Region 4 include the EPA's recommended one-point QC check
acceptance criteria in their QAPPs.
•	Region 5's 2014 TSA of the Michigan DEQ did not identify that Michigan
had invalidated ozone data for shelter temperature exceedances when the
temperature was still within the monitoring method's approved operating
range.
EPA Has Improved Oversight, but Further Steps Are Needed
In response to the OIG's February 2017 management alert report, the EPA
initiated actions to improve oversight to help confirm that monitoring agencies'
QAPPs are developed and revised in a timely manner and reflect the EPA's
critical criteria contained in the most recent QA Handbook. Many of these actions
were detailed in an OAQPS memorandum issued on July 11, 2017, to EPA
Program Managers and staff. The EPA initiated the following action plans:
•	OAQPS developed a list of all QAPPs reported to AQS and requested that
EPA regions that have not approved QAPPs within a 5-year period work
with the monitoring agencies to provide a schedule of when each QAPP
will be revised and submitted to the EPA.
•	OAQPS will revise the AQS AMP 600 report by 2019 to flag QAPPs that
are over 5 years old. The current report flags only QAPPs with approvals
more than 10 years old or that have never been approved. (An illustrative
sketch of this age check follows this list.)
•	OAQPS and the EPA regions agreed to develop a QAPP review
"checksheet" to provide a more consistent review of QAPPs.
In addition, OAQPS asked EPA regional air monitoring staff to review QAPPs to
determine whether they contain the EPA's critical criteria. The EPA stated that
a QAPP will receive only conditional approval if the monitoring agency does
not revise it to include the critical criteria. However, until the QAPP
update-and-review process is completed, there is a risk that some QAPPs may not
reflect the EPA's critical validation criteria.
As discussed in Chapter 2, the EPA is reviewing whether monitoring agencies
adhered to the acceptance criteria for one-point QC checks for data already
submitted to the AQS. In addition, OAQPS issued guidance on flagging data
values in the AQS that exceed critical criteria. This practice will provide
additional information and consistency regarding the assessment and validity of
data collected in cases of failed one-point QC checks. However, the EPA is only
able to identify exceedances of one-point QC criteria because the EPA does not
require agencies to submit zero- and span-check data to the AQS.
We believe the EPA's TSA process should include steps to determine whether air
monitoring agency QAPPs are kept up-to-date and contain the following
elements:
•	Critical criteria that meet EPA recommendations for data validation.
•	A process for documenting any adjustments made to raw data before
submittal to the AQS.
In response to our evaluation, the EPA developed and issued a national TSA
guidance document in December 2017 to achieve more consistent implementation
of the TSA process across all EPA regions. The guidance included steps to
address the issues we identified above.
Conclusions
The EPA should more effectively use its available oversight tools to help verify
that air monitoring agencies are implementing the EPA's recommended QA
practices. The QAPPs are the primary way that monitoring agencies establish
validation criteria and QA practices, and it is important for EPA regions to
conduct careful reviews of these documents to determine whether they include the
EPA's recommended practices or, if not, a valid explanation of why the QAPP
practice differs. Other EPA oversight functions, such as TSAs and annual data
certification reviews, should validate that the quality systems identified in the
QAPPs reflect the most recent regulatory requirements and guidance and are
being implemented appropriately. The EPA also should collect additional QC data
in the AQS, such as zero- and span-check data, and develop more robust AQS
reports to oversee monitoring agencies.
Recommendations
We recommend that the Assistant Administrator for Air and Radiation:
3.	Complete the quality assurance project plan review-and-approval process
to verify that air monitoring agencies' quality assurance project plans
incorporate EPA regulations and guidance for conducting data validations
and adjustments.
4.	Periodically verify that air monitoring agencies are implementing the
EPA's recommended criteria for data validation and adjustments through
technical systems audits or other oversight mechanisms.
5.	Develop a process to provide assurances that data reported to the Air
Quality System database have met the approved zero- and span-check
validation criteria.
Agency Comments and OIG Evaluation
The agency concurred with Recommendation 3 and provided an acceptable planned
corrective action. Recommendation 3 is resolved. The agency agreed with
Recommendation 4 and completed the corrective action on December 7, 2017.
The agency provided two corrective actions for Recommendation 5. However,
these corrective actions, as described in the agency's written response to our draft
report, did not fully meet the intent of our recommendation. As a result, we met
with the agency to obtain clarification. At this meeting, agency staff explained that
they were also taking an additional action to address this recommendation. Thus,
our final report continues to recommend that the EPA develop a process to assure
that the reported data for the zero- and span-check meet validation criteria, but we
revised the recommendation to no longer specify the timing of this process. When
implemented, we believe that the agency's proposed actions, as described in the
written response and in our meeting, comprise a process to provide reasonable
assurance that zero- and span-checks are properly used to validate data.
Recommendation 5 is resolved.
In addition to a response to our recommendations, the agency provided technical
comments on the draft report. Based on the agency response and technical
comments received, we made revisions to the report where appropriate.
Appendix A contains the agency's response to the draft report, the OIG's
evaluation of the agency's response, the agency's technical comments, and the
OIG's response to each technical comment.
Status of Recommendations and
Potential Monetary Benefits
RECOMMENDATIONS

Rec. No. 1 (page 16): Assess the risk of any data adjustments impacting the ozone
data used in the EPA's National Ambient Air Quality Standards designation
determinations.
    Status: R    Action Official: Assistant Administrator for Air and Radiation
    Planned Completion Date: 3/31/18

Rec. No. 2 (page 16): Issue guidance clarifying the shelter temperature criteria
that should be used during data validation.
    Status: R    Action Official: Assistant Administrator for Air and Radiation
    Planned Completion Date: 3/31/18

Rec. No. 3 (page 23): Complete the quality assurance project plan review-and-approval
process to verify that air monitoring agencies' quality assurance project plans
incorporate EPA regulations and guidance for conducting data validations and
adjustments.
    Status: R    Action Official: Assistant Administrator for Air and Radiation
    Planned Completion Date: 12/31/18

Rec. No. 4 (page 23): Periodically verify that air monitoring agencies are
implementing the EPA's recommended criteria for data validation and adjustments
through technical systems audits or other oversight mechanisms.
    Status: C    Action Official: Assistant Administrator for Air and Radiation
    Completion Date: 12/7/17

Rec. No. 5 (page 23): Develop a process to provide assurances that data reported to
the Air Quality System database have met the approved zero- and span-check
validation criteria.
    Status: R    Action Official: Assistant Administrator for Air and Radiation
    Planned Completion Date: 9/30/18

Potential monetary benefits (in $000s): none listed for any recommendation.

Status codes: C = Corrective action completed.
              R = Recommendation resolved with corrective action pending.
              U = Recommendation unresolved with resolution efforts in progress.
Appendix A
Agency Comments on Draft Report
and OIG Evaluation

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
DEC 14 2017
OFFICE OF
AIR AND RADIATION
MEMORANDUM
SUBJECT: Response to Office of Inspector General Management Alert OPE-FY16-0009,
"Differences in Air Monitoring Agencies' Data Processing Practices Could
Decrease the Reliability of Ozone Data Used to Assess Air Quality"

FROM: William L. Wehrum
Assistant Administrator

TO: Kevin Christensen, Assistant Inspector General for Audit and Evaluation
Office of Inspector General
Thank you for the opportunity to review and comment on the Office of Inspector General's
(OIG's) Management Alert titled "Differences in Air Monitoring Agencies' Data Processing
Practices Could Decrease the Reliability of Ozone Data Used to Assess Air Quality." We
appreciate the efforts and investigations the OIG has made to alert the Office of Air and
Radiation (OAR) to potential data reporting issues with ozone data. We generally agree with the
findings and recommendations identified in the report. However, to place these findings in
context, we note that our analysis shows that the majority of the ozone data are not impacted by
these issues and less than two percent of the data show differences which may represent a
legitimate concern in terms of quality assurance (QA) practices.10 Furthermore, our analysis
shows that these differences have a minimal impact on the 2014-2016 design values (DVs)
which the EPA expects to use in designations for the 2015 ozone national ambient air quality
standards (NAAQS), and thus will have little, if any, impact on initial area designations for that
standard.
10 https://www.epa.gov/sites/production/files/2017-07/documents/_epaoig_17-p-0106_agency_response_technical_addendum_july2017.pdf.
The EPA's previous response to the OIG Management Alert on this project (dated February 10,
2017) provided background on the OIG process, the methodology used by OIG in their fact
finding, our involvement in the findings, and our response to those findings.11 OAR believes that
the OIG's findings in the earlier management alert and this draft report are substantially the same
and, therefore, our response is similar. While we generally agree with these findings, there are a
few places where information in the draft report is slightly unclear and deviates from our
understanding of specific facts. Please refer to the attached list and suggested revisions intended
to clarify and improve the draft report's accuracy.
Below are OAR's responses to the OIG's specific recommendations (recommendations 1, 2, 3, 4,
and 5).
Recommendation 1: Assess the risk of any data adjustments impacting the ozone data used
in the EPA's National Ambient Air Quality Standards designation determinations.
Response 1: The Office of Air and Radiation agrees with this recommendation. The Office of
Air and Radiation conducted a review of the 2014-2016 data, which we intend to use for initial
area designations for the 2015 ozone NAAQS. Specifically, we calculated 2014-2016 DVs based
on the data from AirNow and compared those values to the 2014-2016 DVs in the Air Quality
System (AQS). We found 12 monitors where the DV computed using AirNow exceeded the
standard (> 70 parts per billion, ppb) and the AQS DV attained the standard (<= 70 ppb). Table 1
below provides a listing of these monitors. Eight of the 12 monitors differed by 1 ppb (i.e.,
AirNow DV = 71 ppb; AQS DV = 70 ppb). Since the data in AirNow are preliminary values and
the validated data in AQS are truncated as specified by regulation, we conclude that the
differences at those eight monitors are explainable based purely on data reporting conventions
versus monitoring agency data adjustments and will not impact designations. Three of the
remaining four monitors were located in counties with other violating monitors, and thus will
have no impact upon designations. The final site was located in Shasta County, California, which
does not contain any other violating monitors, nor is it in an existing nonattainment area. The
Office of Air and Radiation is working with EPA Region 9 to investigate why the differences in
the AQS and AirNow data occurred at this site.
Table 1

AQS Site ID   State          County       CBSA Name                                      AirNow DV (ppb)   AQS DV (ppb)   Potential Impact on O3 Designations
04-013-3002   Arizona        Maricopa     Phoenix-Mesa-Scottsdale, AZ                    71                70             None - Rounding/Truncation
04-013-4003   Arizona        Maricopa     Phoenix-Mesa-Scottsdale, AZ                    71                70             None - Rounding/Truncation
06-089-0009   California     Shasta       Redding, CA                                    71                68             EPA Investigating
21-111-0027   Kentucky       Jefferson    Louisville/Jefferson County, KY                72                69             None - Other Violating Monitors
32-003-1019   Nevada         Clark        Las Vegas-Henderson-Paradise, NV               73                70             None - Other Violating Monitors
34-007-1001   New Jersey     Camden       Philadelphia-Camden-Wilmington, PA-NJ-DE-MD    74                69             None - Other Violating Monitors
34-025-0005   New Jersey     Monmouth     New York-Jersey City, NY-NJ-PA                 71                70             None - Rounding/Truncation
42-005-0001   Pennsylvania   Armstrong    Pittsburgh, PA                                 71                70             None - Rounding/Truncation
44-003-0002   Rhode Island   Kent         Providence-Warwick, RI-MA                      71                70             None - Rounding/Truncation
44-009-0007   Rhode Island   Washington   Providence-Warwick, RI-MA                      71                70             None - Rounding/Truncation
51-059-0030   Virginia       Fairfax      Washington-Arlington-Alexandria, DC-VA-MD-WV   71                70             None - Rounding/Truncation
55-127-0005   Wisconsin      Walworth     Whitewater-Elkhorn, WI                         71                70             None - Rounding/Truncation

11 https://www.epa.gov/sites/production/files/2017-02/documents/_epaoig_17-p-0106_agency_response.pdf.
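As a worked illustration of the reporting-convention difference described
above: the 70.6 ppb underlying value is hypothetical, and the assumption that
the preliminary value is rounded while the regulatory value is truncated is
ours, not a statement of either system's actual processing.

    import math

    # Hypothetical design value illustrating how truncation versus rounding
    # of the same underlying number can produce the 71 vs. 70 ppb pattern
    # seen at eight of the monitors in Table 1.
    underlying_dv = 70.6                      # ppb, hypothetical preliminary value

    rounded_dv = round(underlying_dv)         # 71 ppb: appears to exceed the standard
    truncated_dv = math.trunc(underlying_dv)  # 70 ppb: attains the standard

    print(rounded_dv, truncated_dv)           # 71 70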
Planned Completion Date: FY18, Q2 for the one ozone monitoring site indicated above.
Otherwise, OAR considers the assessment to be completed.
OIG Response #1: The agency concurred with the recommendation and provided acceptable
planned corrective actions and completion dates. Recommendation 1 is resolved.
Recommendation 2: Issue guidance clarifying the shelter temperature criteria that should
be used during data validation.
Response 2: The Office of Air and Radiation agrees with this recommendation. The Office of
Air and Radiation will issue a technical memo that will be shared with the monitoring agencies
and posted to the Ambient Monitoring Technology Information Center (AMTIC). The Office of
Air and Radiation will subsequently revise the Quality Assurance Handbook to clarify the
18-P-0105
28

-------
current language on this topic. Since the Quality Assurance Handbook gets updated every five
years and was last updated in 2017, OAR will develop and post a table of changes that will apply
to monitoring guidance until the next full Quality Assurance Handbook revision.
Planned Completion Date: FY18, Q2 for Technical memo and Quality Assurance Handbook
change table posted on AMTIC.
OIG Response #2: The agency concurred with the recommendation and provided acceptable
planned corrective actions and completion dates. Recommendation 2 is resolved.
Recommendation 3: Complete the quality assurance project plan review-and-approval
process to verify that air monitoring agencies' quality assurance project plans incorporate
the EPA regulations and guidance for conducting data validations and adjustments.
Response 3: The Office of Air and Radiation agrees with this recommendation. The Office of
Air and Radiation issued a memo on July 11, 2017, alerting the monitoring agencies of the
importance of having Quality Assurance Project Plans (QAPPs) submitted and approved that
conform to regulation and critical criteria. The Office of Air and Radiation expects this review
process to be completed by the end of CY18. Additionally, OAR plans to revise the Data
Certification and Concurrence Report (AMP600) to flag non-concurrence for any QAPP
approval dates over 5 years. The Office of Air and Radiation has already revised AQS to provide
better information on the QAPP data reported to AQS. The Office of Air and Radiation is
committed to revising the Air Pollution Training Institute (APTI) course Quality Assurance for
Air Pollution Measurement Systems that will address the issue of QAPP development and
approval. Finally, the technical system audits that are conducted on monitoring agencies every 3
years will be used to identify QAPPs requiring revision. The EPA notes that the 2016 revision to
40 Code of Federal Regulations (CFR) part 58, Appendix A requires submission of QAPPs to the
EPA for agencies that have been delegated self-approval of QAPPs in order to ensure
conformance with the EPA regulation and important guidance such as the validation templates.
Planned Completion Dates: FY18, Q4 Revision of Data Certification and Concurrence Report;
FY19, Q1 completion of APTI 470 Course; FY19, Q1 Completion of approval process for
QAPPs to ensure meeting every five-year timeline.
OIG Response #3: The agency concurred with the recommendation and provided acceptable
planned corrective actions and completion dates. Recommendation 3 is resolved.
Recommendation 4: Periodically verify that air monitoring agencies are implementing the
EPA's recommended criteria for data validation and adjustments through technical system
audits or other oversight mechanisms.
Response 4: The Office of Air and Radiation agrees with this recommendation. The Office of
Air and Radiation has developed and anticipates issuing a technical systems audit guidance
document with consensus from the EPA Regions to implement. This document will specify that
auditors review validation criteria and the "process for documenting any adjustments made to
18-P-0105
29

-------
raw data before submittal to AQS." The EPA Regions will use this guidance during technical
systems audits that are conducted on the monitoring agencies every 3 years.
Planned Completion Date: FY18, Q2 for completion of Technical Systems Audit Guidance
Document.
OIG Response #4: The agency concurred with the recommendation and provided acceptable
planned corrective actions and completion dates. The agency released the revised TSA
guidance in December 2017, which will be used by EPA regions to conduct TSAs. The
corrective action for Recommendation 4 has been completed.
Recommendation 5: Develop a process to provide assurances that data reported to the Air
Quality System database have met the approved zero- and span-check validation criteria
prior to regional review and approval of the air monitoring agencies' annual data
certification packages.
Response 5: The Office of Air and Radiation believes that the most important of the three
critical criteria quality control checks (zero, span, 1-point QC) is the 1-point QC check
(reported to AQS), since it involves both the zero air source (used for the zero check)
for diluting the ozone standard and the ozone standard that is used to generate and
measure the span.
The 1-point QC check concentration approximates the ambient air concentrations reported by the
monitoring organization and best represents the precision and bias around the concentrations
reported by the monitoring agency. The Office of Air and Radiation believes that it is sufficient
for monitoring agencies to complete zero and span checks in accordance with their approved
QAPPs that utilize the EPA validation template critical criteria, and make these data available for
review during the EPA technical systems audits. Although the AQS reporting of zero and span
checks is not a regulatory requirement, some monitoring organizations and the EPA Regions
have requested zero and span transactions be developed in order to voluntarily submit these data
to AQS. The Office of Air and Radiation has requested that zero and span QA transactions be
added to AQS and we will provide technical guidance suggesting that monitoring agencies
submit these data to AQS.
Planned Completion Date: FY18, Q4 for completion and deployment of zero span QA
transaction in AQS for use by monitoring organizations consistent with technical guidance
posted to AMTIC.
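A brief sketch of the kind of precision and bias statistic derived from
one-point QC checks, patterned on the percent-difference calculation in
40 CFR part 58, appendix A; the concentrations shown are hypothetical.

    # Percent difference between the analyzer reading and the audit (QC)
    # concentration: the building block for the precision and bias statistics
    # computed from one-point QC checks. All values are hypothetical.

    def percent_difference(measured_ppb, audit_ppb):
        return (measured_ppb - audit_ppb) / audit_ppb * 100.0

    checks = [(61.2, 60.0), (58.9, 60.0), (60.6, 60.0)]  # (measured, audit) pairs
    print([round(percent_difference(m, a), 1) for m, a in checks])  # [2.0, -1.8, 1.0]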
OIG Response #5: The agency provided two corrective actions for Recommendation 5:
expanding AQS' capabilities to include zero- and span-check data and issuing a technical
guidance document suggesting that monitoring agencies submit their zero- and span-check
data. We met with the agency to obtain clarification on the corrective actions. Agency staff
stated that TSA guidance directs regions to review QC check data during TSAs. Further, the
EPA plans to provide TSA training by June 2018, which will emphasize to EPA staff that
zero- and span-check results should be reviewed during TSAs. If fully implemented, we
believe the process described in the agency's response and during our meeting is sufficient to
provide reasonable assurance that zero- and span-checks are properly used to validate data.
Thus, our final report continues to recommend that the EPA develop a process to assure that
the reported data for the zero- and span-check meet validation criteria, but we no longer
specify the timing of this process. We accept the EPA's corrective actions as meeting the
intent of our recommendation. Recommendation 5 is resolved.
If you have any questions regarding this response, please contact Mike Jones, Office of Air
Quality Planning and Standards, Office of Air and Radiation (OAQPS/OAR) Audit Liaison, at
(919)541-0528.
Attachment
TECHNICAL COMMENTS ATTACHMENT
OAR and Region 9 offered these comments for OIG review:
Page 4: "EPA's Critical QC Checks" (text box)
In 2016, the ozone concentration for the 1-point QC check changed from 10 - 100 ppb to 5 - 80
ppb. It is recommended to include a footnote about this change in the 2016 regulation.
" The span check measures the analyzer's response to a concentration at the upper range of the
analyzer's measurement capability (e.g., 500+ppb) "
Recommend revising statement to: "The span check measures the analyzer's response to a
concentration at the upper range of the analyzer's measurement capability (e.g., 70 to 90
percent of full scale) "
OIG Technical Comment #1: The final report was revised to show the change in the one-point
QC check range from 10-100 ppb to 5-80 ppb. We also revised the span-check language to
clarify that it is 80-90 percent of full scale, which can be 500 ppb or more.
Page 4: Incomplete Statement
"... if the acceptance criteria are exceeded, the data collected by that monitor from the time of
the last acceptable check to the failed check should be invalidated. " Should be corrected to
include QA Handbook language ".. .invalidated unless there are compelling reason and
justification for not doing so"
OIG Technical Comment #2: The suggested statement was added to the final report.
Page 10: Suggested Table Corrections (provided by Region 9)
Table 2: Application of critical criteria for ozone data at six monitoring agencies

                       Were critical criteria implemented in        Were the critical criteria
                       accordance with EPA's QA Handbook?           adopted in the approved QAPP?
Monitoring agency      Zero check   One-point check   Span check
Arizona DEQ            Yes          Yes               Yes           Yes
Georgia DNR            No           No                No            No
Maricopa County        Yes          Yes               Yes           Yes
Michigan DEQ           No           Yes               No            No
Pima County            Yes          Yes               Yes           Yes
South Carolina DHEC    No           No                No            No

Source: OIG analyses of air monitoring agencies' QAPPs and the EPA's QA Handbook.
Since OAQPS finalized the Validation Templates, it has been the practice of Region 9 during our
QAPP review process to require that air monitoring agencies either adopt these templates, or
require our agencies to provide justification. All three Arizona agencies reviewed agreed to adopt
the data validation templates. Region 9 expects agencies to incorporate updates to regulation and
guidance into their procedures and provide updated QAPPs during regular review cycles. This is
reinforced through notifications and TSAs.
Region 9 does not agree that Maricopa County and Pima County did not adopt the QA Handbook
Appendix D Critical Criteria. The Region also believes that it was clear to each agency that their
QAPP commitment was to use the most current QA Handbook criteria. Maricopa County did
adopt ozone critical criteria in their QAPP approved in July 2011. Element 9 of this QAPP
indicates that the validation SOP used by Maricopa (updated in 2014) must use the QA
Handbook validation criteria. Pima County adopted the validation template in their QAPP
approved in October of 2013 and their 2014 validation SOP refers the validator specifically to
Appendix D for QA/QC criteria for validation.
Region 9 requests that critical criteria adoption be changed to "Yes" for both Maricopa
and Pima County in Table 2. Alternatively, the column could be relabeled to state "Were
the May 2013 critical criteria present in the approved QAPP." All three Arizona agencies
would be "Yes" in response to this question.
OIG Technical Comment #3: The OIG agrees that Maricopa County and Pima County were
implementing critical criteria that were consistent with applicable guidance in the EPA's QA
Handbook. This is reflected in Table 2 of our report. However, we do not agree that the far
right column of Table 2 should indicate "Yes" for either Maricopa County or Pima County.
Maricopa County's 2011 QAPP and Pima County's 2013 QAPP (the most recent EPA-
approved QAPPs for these agencies) both contain acceptance criteria for zero checks that
were not consistent with the guidance in the EPA's 2013 QA Handbook. In response to our
discussion document for this report, Pima County stated, "As new revisions to validation
templates become available, PDEQ adjusts operational procedures to adhere to the changes,
but does not make correctional changes to the QAPP until the review and re-submission
period, which is every 5 years." We did, however, revise the headings in Table 2 to better
distinguish which agencies were implementing the EPA's critical criteria and which had
updated critical criteria in their QAPPs.
Page 12: Suggested rephrasing
"However, once the EPA approves the use of a monitor, the EPA publishes regulation that may
allow it to be operated at a wider temperature range than 20 to 30 degrees Celsius."
Recommend revising statement to: "However, once the EPA approves the use of a monitor, the
EPA publishes a method approval notice in the Federal Register that may allow it to be operated
at a wider temperature range than 20 to 30 degrees Celsius."
18-P-0105
33

-------
OIG Technical Comment #4: We revised the language as follows: However, once the EPA
approves the use of a monitor, the EPA publishes a Notice of Designation in the Federal
Register, which sets forth the requirements for how a monitor is operated. This designation may
allow the monitor to operate within a wider temperature range than 20-30 degrees Celsius.
"Based on our review of 69 hours of ozone data that were invalidated by Michigan DEO, at no
time were the hourly shelter temperatures outside the operating range designated by EPA
regulation."
Recommend revising statement to: "Based on our review of 69 hours of ozone data that were
invalidated by Michigan DEO, at no time were the hourly shelter temperatures outside the
operating range designated by in the EPA method approval notice. "
OIG Technical Comment #5: We revised the language as follows: Based on our review of
69 hours of ozone data that were invalidated by Michigan DEQ, at no time were the hourly
shelter temperatures outside the operating range approved by the EPA.
Page 12: Suggested edits (Region 9)
"/// our view, the EPA should clarify its guidance so that agencies consistently use the
appropriate shelter temperature criteria for each specific monitor. "
Recommend revising statement to: "In our view, the EPA should clarify its guidance concerning
how shelter temperature should be considered within the data validation process that agencies
consistently use the appropriate shelter temperature criteria for each specific monitor. In some
cases, EPA may approve OAPPs that use criteria that are more stringent than EPA criteria if
these criteria are applied consistently, do not intentionally or unintentionally bias the data set,
and do not compromise completeness. If a sampling shelter climate control system is
malfunctioning, data may not be representative and this could be an additional rationale for
invalidation even if the instrument is within its operating temperature range. EPA's guidance
should clarify that a weight of evidence approach should be used when determining if
invalidation is appropriate, and that these validation decisions should be supported by multiple
lines of evidence, one of which could be instrument temperature requirements
OIG Technical Comment #6: We included the first sentence in the final report. We did not
include the other language in the final report because OAQPS and the regions need to discuss
and agree on how to clarify the guidance.
Page 13-14 Table 4: One-sided example
OIG used a zero check example that showed the ozone monitor was reading 1 ppb high (4th
column in Table 4) and therefore adjusted the monitor down by 1 ppb. OAR does not dispute
what is presented in the Table but suggests that the information be clear that the zero check can
also demonstrate that the monitor is reading 1 ppb low and that the monitor, in this case, would
be adjusted up by 1 ppb. Therefore, in a case where an 8-hour average was 75 ppb, the monitor
would be adjusted to read 76 ppb. OAR finds data from precision as well as zero data are
normally distributed around the acceptance limits so there is equal potential for positive and
negative drift in ozone monitors.
If a monitoring organization has a reliable QC system and knows that its zero system is
functioning properly, providing a zero adjustment that is well within the zero acceptance criteria
does not necessarily decrease reliability of data reported to AQS if it can be proven that the
monitor drifts slightly over a 24-hour period. EPA's concern, and why it suggests not performing
a zero adjustment, is that one may not know which - the monitor or the zero check system - is
malfunctioning and a continuous zero adjustment could be masking a technical issue that should
be addressed and not continuously corrected.
OIG Technical Comment #7: We added the following language before Table 4: Our example
illustrates zero-check results where adjustment procedures caused the adjusted data to be
lower than the raw values recorded by the monitor. However, under different circumstances,
the inverse is also possible, and adjusted values could be higher than raw values.
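The arithmetic of the upward-adjustment case OAR describes above can be
sketched as follows; all values are hypothetical.

    # A zero check reading of -1 ppb suggests the monitor reads 1 ppb low,
    # so an upward adjustment raises every reported concentration by 1 ppb.
    zero_check_reading = -1.0          # ppb measured while sampling zero air
    adjustment = -zero_check_reading   # +1.0 ppb correction

    raw_8hr_average = 75.0             # ppb, already above the 70 ppb standard
    adjusted_8hr_average = raw_8hr_average + adjustment
    print(adjusted_8hr_average)        # 76.0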
Page 14: Add clarification.
"Based on our analysis, about 26 percent of the AOS hourly ozone data differedfi'om the
corresponding real-time data reported in AirNow
While OAR does not dispute this finding, we believe that it is important to include the following
clarification: "The 26 percent finding includes data records where a measurement value was
reported to AirNow but no value was reported to AOS, and data records where a measurement
value was reported to AOS but no value was reported to AirNow. These specific situations
accountedfor more than half of differences noted above
OIG Technical Comment #8: We added the following language from the 2017 OIG
Management Alert report: There are a number of reasons for such differences. For example,
monitoring agencies could find certain data reported in real time to AirNow to be invalid
and, therefore, would not report the data to the AQS. Further, monitoring agencies could
apply different conventions for rounding or truncating raw data before reporting to either
database.
Page 23: Revise Improvement Section.
"The EPA is also developing a national TSA document.... " Revise to: "The EPA has developed
and anticipates issuing..."
OAR believes the TSA guidance document addresses the two bullets recommended by OIG.
•	Critical criteria that meet EPA recommendations for data validation.
•	A process for documenting any adjustments made to raw data before submittal to the AQS.
OIG Technical Comment #9: We revised the final report language to the following: In
response to our evaluation, the EPA developed and issued a national TSA guidance document
in December 2017 to ensure that all EPA regions consistently implement the TSA process.
Appendix B
Distribution
The Administrator
Chief of Staff
Chief of Operations
Deputy Chief of Operations
Assistant Administrator for Air and Radiation
Regional Administrator, Region 4
Regional Administrator, Region 5
Regional Administrator, Region 9
Agency Follow-Up Official (the CFO)
Agency Follow-Up Coordinator
General Counsel
Associate Administrator for Congressional and Intergovernmental Relations
Associate Administrator for Public Affairs
Career Deputy Assistant Administrator for Air and Radiation
Director, Office of Air Quality Planning and Standards, Office of Air and Radiation
Director, Office of Regional Operations
Audit Follow-Up Coordinator, Office of the Administrator
Audit Follow-Up Coordinator, Office of Air and Radiation
Audit Follow-Up Coordinator, Region 4
Audit Follow-Up Coordinator, Region 5
Audit Follow-Up Coordinator, Region 9
Audit Follow-Up, Office of Air Quality Planning and Standards, Office of Air and Radiation