U.S. ENVIRONMENTAL PROTECTION AGENCY
OFFICE OF INSPECTOR GENERAL
Air Quality
Management Alert:
Certain State, Local and Tribal
Data Processing Practices Could
Impact Suitability of Data for
8-Hour Ozone Air Quality
Determinations
Report No. 17-P-0106 February 6, 2017
Report Contributors:
James Hatfield
Andrew Lavenburg
Jasprit Matta
Renee McGhee-Lenart
Anita Mooney
Geoffrey Pierce
Wendy Wierzbicki
Abbreviations
AQS Air Quality System
CFR Code of Federal Regulations
EPA U.S. Environmental Protection Agency
Georgia DNR Georgia Department of Natural Resources
NAAQS National Ambient Air Quality Standards
OAQPS Office of Air Quality Planning and Standards
OIG Office of Inspector General
ppb parts per billion
QAPP Quality Assurance Project Plan
South Carolina DHEC South Carolina Department of Health and Environmental Control
Cover photo: National map of the 8-hour ozone nonattainment areas (2008 standard),
as of September 22, 2016. (Source: EPA website)
Are you aware of fraud, waste or abuse in an
EPA program?
EPA Inspector General Hotline
1200 Pennsylvania Avenue, NW (2431T)
Washington, DC 20460
(888) 546-8740
(202) 566-2599 (fax)
OIG_Hotline@epa.gov
Learn more about our OIG Hotline.
EPA Office of Inspector General
1200 Pennsylvania Avenue, NW (2410T)
Washington, DC 20460
(202) 566-2391
www.epa.gov/oig
Subscribe to our Email Updates
Follow us on Twitter @EPAoig
Send us your Project Suggestions
U.S. Environmental Protection Agency
Office of Inspector General
Report No. 17-P-0106
February 6, 2017
At a Glance
Why We Did This Review
In the process of evaluating
whether selected ozone air
monitoring data meet the
criteria established by the
U.S. Environmental
Protection Agency (EPA), we
found two state monitoring
agencies that do not use
EPA-recommended data
processing practices. We are
issuing this report to alert the
EPA about these issues
before the agency starts
using the data to determine
whether air quality meets the
National Ambient Air Quality
Standard (NAAQS) for
ozone.
The EPA uses Air Quality
System (AQS) data to
determine whether an area's
air quality meets the NAAQS.
A nonattainment designation
means that an area's air
contains unhealthy levels of
pollution, and the state must
develop a plan to identify
enforceable measures to
improve air quality in that
area. The EPA plans to
designate areas for the new
ozone NAAQS in 2017.
This report addresses the
following EPA goal or
cross-agency strategy:
• Addressing climate
change and improving
air quality.
Send all inquiries to our public
affairs office at (202) 566-2391
or visit www.epa.gov/oig.
Listing of OIG reports.
Management Alert: Certain State, Local and Tribal Data
Processing Practices Could Impact Suitability of Data for
8-Hour Ozone Air Quality Determinations
There is a risk that multiple
air-monitoring agencies are not
always implementing the EPA's
recommended quality assurance
practices for ozone data. This
could lessen the quality of data
the agency uses to determine and
inform the public as to whether
the air is healthy to breathe.
What We Found
Air monitoring data the EPA received from
Georgia and South Carolina were not always
processed according to recommended
practices in the EPA's 2013 Quality
Assurance Handbook for Air Pollution
Measurement Systems (Quality Assurance
Handbook). Georgia and South Carolina
adjusted ozone data based on the results of
quality control checks known as "zero
checks" before reporting the data to the
AQS. According to the Quality Assurance Handbook, zero check adjustments,
although an accepted practice under certain conditions, should not be necessary
and may lead to more data quality uncertainty. While Georgia stopped adjusting its
data in 2015, South Carolina continued the practice.
Georgia and South Carolina were not implementing critical criteria as
recommended in Appendix D of the Quality Assurance Handbook. In Appendix D,
the EPA establishes three critical quality control checks ("zero," "one-point quality
control," and "span checks") to validate data. Georgia uses the three quality control
checks to validate its data, but the acceptance criteria that the state uses for these
checks are less stringent than what the EPA recommends. South Carolina does
not use zero checks to validate ozone data. South Carolina applies the one-point
quality control check to validate ozone data, but its acceptance criteria are less
stringent than the EPA's recommended critical criteria. South Carolina conducts
span checks, but does not follow EPA-recommended practices. Variation in the
use of acceptance criteria and critical quality control checks can impact the
integrity of data the EPA uses to make designation decisions.
We analyzed 2012-2014 ozone data across the country and determined that
about 26 percent of the hourly data reported in real time differed from the
corresponding data reported to the AQS. While not all of the differences are
indicative of data adjustment practices, there is a risk that other air-monitoring
agencies are improperly adjusting their data before reporting it to the AQS. These
adjustments could impact the quality of data the EPA plans to use to determine
whether ozone levels present an adverse health risk to the public (i.e., the
designation process). Designation determinations can have significant implications
for public health and an area's economy. Therefore, it is important that the EPA
has assurance that its designation decisions are based on data that have
undergone a known, consistent and accepted quality control process.
Pending completion of our ongoing work, we are making no recommendations. We
are alerting the EPA to a potential risk in the use of ozone data for its designations
in 2017, so that the agency can take steps to further assess and mitigate risks as
needed. The agency has initiated actions to assess these risks.
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
THE INSPECTOR GENERAL
February 6, 2017
MEMORANDUM
SUBJECT: Management Alert: Certain State, Local and Tribal Data Processing Practices
Could Impact Suitability of Data for 8-Hour Ozone Air Quality Determinations
Report No. 17-P-0106
FROM: Arthur A. Elkins Jr.
TO: Sarah Dunham, Acting Assistant Administrator
Office of Air and Radiation
During our evaluation of the U.S. Environmental Protection Agency's (EPA's) ambient air monitoring
data changes and gaps, we found that two states were not processing ozone data in accordance with the
EPA's recommended practices. We are issuing this management alert to inform the EPA about these
issues, and the potential impact these issues could have on data the agency uses to make National Ambient Air Quality Standards (NAAQS) ozone designation decisions in 2017. The project number for this evaluation was
OPE-FY16-0035.
The report represents the opinion of the Office of Inspector General (OIG) and does not necessarily
represent the final EPA position. Final determinations on matters in this report will be made by EPA
managers in accordance with established audit resolution procedures.
Action Required
Because this report contains no recommendations, you are not required to respond to this report.
However, if you submit a response, it will be posted on the OIG's public website, along with our
memorandum commenting on your response. Your response should be provided as an Adobe PDF file
that complies with the accessibility requirements of Section 508 of the Rehabilitation Act of 1973, as
amended. The final response should not contain data you do not want released to the public; if your
response contains such data, you should identify the data for redaction or removal along with
corresponding justification.
The report will be available at www.epa.gov/oig.
Management Alert: Certain State, Local and Tribal
Data Processing Practices Could Impact Suitability
of Data for 8-Hour Ozone Air Quality Determinations
17-P-0106
Table of Contents

Purpose
Background
Scope and Methodology
Ozone Data Were Not Always Processed According to EPA-Recommended Practices
   Data Adjustments
   Data Validations
Risk That Other Air-Monitoring Agencies Are Not Following EPA-Recommended Practices
Agency Actions Prompted by OIG Work
Conclusion

Appendix
   A  Distribution
Purpose
The U.S. Environmental Protection Agency's (EPA's) Office of Inspector
General (OIG) has an ongoing review to determine whether selected ozone air
monitoring data in the EPA's Air Quality System (AQS) meet the criteria
established by the agency. The purpose of this report is to alert the EPA that state,
local and tribal agencies may not be processing ozone ambient air monitoring data
in accordance with the EPA's recommended practices, based on findings from
two states we reviewed. When our work is complete, the OIG plans to issue a
final report.
Background
Air monitoring networks operated by state, local and tribal agencies provide the
data that the EPA uses to determine whether an area's air quality meets National
Ambient Air Quality Standards (NAAQS) set by the EPA. These standards are set
at a level to protect public health, including sensitive populations such as the
elderly, children and asthmatics, from the effects of air pollution. Table 1
identifies some of the health effects associated with ozone.
Table 1: Health effects of ozone

Short-term health effects:
• Shortness of breath and pain when taking a deep breath.
• Coughing and sore or scratchy throat.
• Inflamed and damaged airways.
• Increased frequency of asthma attacks.
• Increased susceptibility to lung infection.

Long-term health effects:
• Aggravation of asthma; likely to be one of many causes of asthma development.
• May be linked to permanent lung damage, such as abnormal lung development in children.
• May increase the risk of death from respiratory causes.

Source: OIG analysis of EPA websites describing the health effects of ozone.
In 2016, the EPA started the designation process to determine whether areas in the
nation meet the current 70 parts per billion (ppb) air quality standard. The EPA
will then make its designations in 2017. An EPA determination that an area's air
quality does not meet national standards can have significant consequences for
that area and the state. A "nonattainment" designation means that the state, local
agency or tribe must develop a plan that identifies enforceable measures for
reducing emissions to improve air quality in that area. These measures can
include more stringent permits, and additional emission controls for industry and
other sources within the nonattainment area.
Air Monitoring Databases
The EPA maintains ambient air
monitoring data in two databases—
the AQS and AirNow. Raw or real-time data are reported to AirNow on an
hourly basis for use in calculating air
quality indexes that inform the public
of current air quality conditions. After
the monitoring agency reviews and
validates the data, the hourly averages
are submitted electronically to the
AQS on a quarterly basis.

AirNow
• Collects hourly, real-time and forecasted air quality information to inform the public.
• Communicates air quality to the public via the air quality index.
• Data are considered preliminary and unofficial, and are not used for regulatory decisions.
• For more information, visit About AirNow.

Uses for AQS Data
• Assess air quality.
• Assist in attainment and nonattainment designations.
• Evaluate state implementation plans for nonattainment areas.
• Perform modeling for permit review analysis and other air quality management functions.
Monitoring agencies certify annually
that ambient air monitoring data are
accurate and are entered into the AQS
as required by the Code of Federal
Regulations (CFR) at 40 CFR
Part 58.15. The EPA uses air
monitoring data from the AQS to
compute design values each year for
monitors meeting EPA completeness
requirements. The ozone design value
is the annual fourth-highest daily
maximum 8-hour average concentration for a monitor, and is averaged over
3 years. These design values are used to make designation determinations and to
classify nonattainment areas based on the highest-reading monitor in an area.
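
To illustrate the calculation, the following sketch computes a design value from daily maximum 8-hour averages. The concentrations are invented, and the EPA's data completeness requirements and the full regulatory procedure are simplified to the truncated 3-year average described above.

    # Minimal sketch of the ozone design value calculation described above.
    # Concentrations are invented; completeness requirements are omitted.

    def annual_fourth_highest(daily_max_8hr_ppb):
        """Fourth-highest daily maximum 8-hour average for one year."""
        return sorted(daily_max_8hr_ppb, reverse=True)[3]

    def design_value(three_years_ppb):
        """3-year average of the annual fourth-highest values, truncated
        to a whole ppb (a simplification of the regulatory procedure)."""
        fourth_highs = [annual_fourth_highest(y) for y in three_years_ppb]
        return int(sum(fourth_highs) / 3)

    years = [
        [72, 71, 70, 69, 68, 65],  # hypothetical year 1: fourth-highest = 69
        [75, 73, 72, 71, 66, 64],  # hypothetical year 2: fourth-highest = 71
        [74, 70, 69, 68, 67, 63],  # hypothetical year 3: fourth-highest = 68
    ]
    print(design_value(years))    # 69 ppb, below the 70 ppb standard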
EPA Data Processing Requirements and Guidance
Title 40 CFR Part 58 requires each monitoring agency to establish a quality
system that includes data quality performance requirements for precision, bias and
completeness. The regulation specifically references the EPA's Quality Assurance
Handbook for Air Pollution Measurement Systems (Quality Assurance
Handbook)1 as guidance for developing a quality system for ambient air
monitoring programs.
The Quality Assurance Handbook states that "based upon validation criteria, the data is either reported as initially measured or invalidated." The handbook allows daily adjustments to monitors based on automated zero checks2, but these adjustments are allowed only under certain circumstances. Adjustments based on automated zero checks are not intended to correct data previously collected at the monitor, which would be considered post-processing the data.

1 Quality Assurance Handbook for Air Pollution Measurement Systems, Volume II: Ambient Air Quality Monitoring Program, EPA-454/B-13-003 (May 2013).
2 According to the Quality Assurance Handbook, some air monitoring analyzers are capable of conducting regularly scheduled zero and span calibrations, and automatically adjusting the monitor readings based on the results of those calibrations.
The Quality Assurance Handbook has
established three critical quality control
checks: "zero," "one-point quality
control," and "span." The three checks
are critical to maintaining the integrity of
the data collected by a monitor. The
checks involve comparing the response of
the monitor to known and certified ozone
concentrations. Differences between the
known ozone concentration and the
monitor response are compared to the
monitoring agency's acceptance criteria.

EPA's Critical Quality Control Checks
• The zero check measures the analyzer's response to zero ozone (0 ppb ozone).
• The one-point quality control check measures the analyzer's response to a typical ozone concentration at the site (10-100 ppb).
• The span check measures the analyzer's response to a concentration at the upper range of the analyzer's measurement capability (e.g., 500+ ppb).
According to the Quality Assurance Handbook, the data should be invalidated to
the last acceptable check if acceptance criteria are exceeded. When data are
invalidated, they are not reported to the AQS by the monitoring agency. Instead,
null codes are reported to the AQS that provide an explanation as to why the data
are missing. Data that are invalidated are not used to calculate ambient air
averages or design values.
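
As a rough illustration of this validation logic, the sketch below invalidates hourly data back to the last acceptable check when a quality control check exceeds its acceptance criterion. The threshold values and the null code are placeholders, not the handbook's actual Appendix D critical criteria.

    # Illustrative sketch only: thresholds and the null code are invented
    # placeholders, not the EPA's actual Appendix D critical criteria.
    ACCEPTANCE_PPB = {"zero": 3.0, "one_point_qc": 7.0, "span": 10.0}

    def validate(hourly, checks, null_code="AN"):
        """hourly: {timestamp: ppb}; checks: chronologically ordered list
        of (timestamp, check_type, difference_from_known_ppb). Hours
        recorded since the last acceptable check are replaced with a null
        code when a later check exceeds its acceptance criterion."""
        validated = dict(hourly)
        last_good = None
        for when, kind, diff in checks:
            if abs(diff) <= ACCEPTANCE_PPB[kind]:
                last_good = when       # check passed; data up to here stand
            else:
                for t in validated:    # invalidate back to last good check
                    if (last_good is None or t > last_good) and t <= when:
                        validated[t] = null_code
        return validated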
Scope and Methodology
We began our evaluation in January 2016, and our work is ongoing. We
conducted this performance audit in accordance with generally accepted
government auditing standards. Those standards require that we plan and perform
our work to obtain sufficient, appropriate evidence to provide a reasonable basis
for our findings and conclusions based on our audit objectives. We believe that
the evidence obtained provides a reasonable basis for the findings and conclusions
in this report based on our audit objectives.
We obtained historical AirNow data from the EPA's Office of Air Quality
Planning and Standards (OAQPS) for 2012, 2013 and 2014, as an indicator of the
initial values recorded by the ozone monitors. Those years were chosen because,
at the time we initiated our evaluation, they represented the most recent data in
the AQS that were certified as valid by monitoring agencies and the EPA. We
obtained AQS data for the same years from the EPA's public Air Data website.
Using CaseWare IDEA software, we compared hourly data from the AQS and
AirNow to identify values that did not match between the two databases. We then
compared the top 8-hour daily maximum averages from the AQS database to
those in AirNow at each site for 2012-2014. We wanted to identify where hourly
differences in the data could have impacted the design values. The results of this
comparison were used to identify states for review.
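
The OIG performed this matching in CaseWare IDEA; a minimal Python equivalent is sketched below, with hypothetical file and column names for hourly extracts keyed by site, date and hour.

    # Rough Python equivalent of the hourly AQS/AirNow comparison (the
    # OIG used CaseWare IDEA). File and column names are hypothetical.
    import csv

    def load_hourly(path):
        """Map (site, date, hour) -> reported ozone value."""
        with open(path, newline="") as f:
            return {(r["site_id"], r["date"], r["hour"]): r["ozone_ppb"]
                    for r in csv.DictReader(f)}

    aqs = load_hourly("aqs_hourly_2012_2014.csv")        # hypothetical
    airnow = load_hourly("airnow_hourly_2012_2014.csv")  # hypothetical

    common = aqs.keys() & airnow.keys()
    mismatches = {k for k in common if aqs[k] != airnow[k]}
    print(f"{len(mismatches) / len(common):.0%} of common hours differ")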
We selected two EPA Region 4 states, Georgia and South Carolina, for review.
We conducted site visits at the Georgia Department of Natural Resources (DNR)
and the South Carolina Department of Health and Environmental Control
(DHEC). We obtained raw monitoring data, quality assurance and control data,
and supporting documentation to explain data differences and gaps in a sample of
28 monitoring sites and 70 dates.3 We reviewed Georgia DNR's and South
Carolina DHEC's quality assurance project plans (QAPPs), standard operating
procedures, and AQS summary reports. We also reviewed Region 4's Technical
System Audit reports.
We interviewed Georgia DNR, South Carolina DHEC, the EPA's OAQPS, and
management and/or staff from EPA Regions 3, 4 and 9. The work of the OIG is
ongoing, and we plan to issue a report that fully addresses our assignment
objectives.
Ozone Data Were Not Always Processed According to
EPA-Recommended Practices
The air-monitoring agencies for Georgia and South Carolina did not always
process ozone data according to recommended practices in the EPA's Quality
Assurance Handbook. We found the following:
• Monitoring agencies in both Georgia and South Carolina adjusted their
raw ozone data based on the results of quality control checks known as
"zero checks."
• Georgia and South Carolina were not validating data in accordance with
recommended critical criteria found in Appendix D of the Quality
Assurance Handbook.
3 We reviewed 12 sites for 26 dates in Georgia, and 16 sites for 44 dates in South Carolina.
Data Adjustments
The Quality Assurance Handbook does not recommend adjustments of
monitoring data. The handbook states that frequent adjustments or calibrations4 to
monitors should not be necessary and may lead to more data quality uncertainty.
Although the handbook allows for automated adjustments to an ozone monitor
based on the results of daily zero checks, the handbook does not recommend
adjustments to the data after it has been recorded by the monitor (i.e., post-
processing of data). However, an OAQPS quality assurance expert stated that a daily zero-check adjustment to data already recorded by the monitor could be acceptable if the data are not adjusted retrospectively, because such an adjustment has essentially the same assessment value as an automated adjustment to the monitor, which EPA guidance considers reasonable.
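
To make the practice concrete, the sketch below applies a hypothetical daily zero-check result to hourly readings that a monitor has already recorded; all values are invented.

    # Invented values illustrating a zero-check adjustment applied to data
    # after recording. The analyzer's nonzero response to zero air (0 ppb
    # ozone) is treated as baseline drift and subtracted from each hour.
    zero_check_response_ppb = 1.8
    recorded_hourly_ppb = [52.0, 55.4, 61.2, 66.9]

    adjusted = [round(v - zero_check_response_ppb, 1)
                for v in recorded_hourly_ppb]
    print(adjusted)  # [50.2, 53.6, 59.4, 65.1]

Because the 8-hour averages, and ultimately the design value, are built from these hourly values, even a baseline shift of 1 to 2 ppb can matter for an area near the 70 ppb standard.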
Georgia and South Carolina applied zero-check adjustments to the hourly
averages we sampled.5 The adjustments to the hourly data we sampled resulted in
different 8-hour daily maximum averages than would have been calculated
without the adjustments. Since the ozone standard is based on an 8-hour average,
these adjustments could impact the EPA's design value calculations, which are
used to determine compliance with the ozone NAAQS. A manager from OAQPS
stated that the EPA would need to review the Georgia and South Carolina
adjustments in more detail to determine whether the states' practices meet the
intent of the guidance.
From 2012 through June 2015, Georgia applied a manual adjustment to its
monitoring data using results of weekly zero checks before reporting the data to
the AQS. Georgia stopped the practice in June 2015, based on Region 4's
recommendation that the state reconsider the practice.
South Carolina adjusted its ozone data based upon results of daily zero-check
procedures. Even if OAQPS were to interpret this adjustment as meeting the intent
of its guidance, EPA Region 4 had questioned whether South Carolina's monitors
were set up in a manner to provide reliable zero checks. In its 2015 Technical
Systems Audit, Region 4 identified that South Carolina's ozone monitors were not
configured according to manufacturer's operating instructions to provide zero
ozone concentrations for use during zero checks. However, Region 4 was not
aware that South Carolina was using daily zero checks to adjust data. The Quality
Assurance Handbook states that air monitors should be assembled and set up
according to instructions in the manufacturer's manual to generate quality data.
4 The EPA's Quality Assurance Handbook for Air Pollution Measurement Systems, Volume II, May 2013, defines calibration as "...the comparison of a measurement standard, instrument, or item with a standard or instrument of higher accuracy to detect and quantify inaccuracies and to report or eliminate those inaccuracies by adjustment." Therefore, zero checks, and any subsequent adjustments made in response to a zero check, meet the definition of calibration.
5 Not all adjusted ozone measurements were different than the unadjusted measurement data, because some zero
checks resulted in an "adjustment" of less than 1 ppb, or 0 ppb.
South Carolina disagreed with Region 4's finding and a recommendation to
reconfigure the monitors. South Carolina continues to adjust all of its hourly
ozone data based upon daily zero checks.
Data Validations
Georgia and South Carolina were not implementing critical criteria as
recommended in Appendix D of the Quality Assurance Handbook. In Appendix D,
the EPA provides measurement quality objectives and validation criteria for each
criteria pollutant, including critical validation criteria. According to the EPA,
critical criteria are vital for ensuring the integrity of the data. The handbook
provides acceptance criteria for each of the three critical quality control checks for
ozone monitoring (zero, one-point quality control, and span). The handbook also
states that observations that do not meet each and every critical criterion should be
invalidated, unless there are compelling justifications for not doing so.
Georgia applies the three quality control checks recommended in Appendix D of
the Quality Assurance Handbook to validate the state's data, but uses acceptance
criteria for the three checks that were less stringent than recommended by the
EPA. Staff at Georgia's air-monitoring agency stated that they interpreted
language in a prior section of the handbook to allow for zero and span drift
acceptance levels beyond what is recommended in the critical criteria table
provided in Appendix D of the handbook.
South Carolina does not use zero checks to validate ozone data. South Carolina
applies the one-point quality control check to validate ozone data, but its
acceptance criteria for this check are less stringent than the EPA's recommended
critical criteria. South Carolina also conducts span checks, but averages the result
with the result of the one-point quality control check to validate the state's ozone
data. Region 4 noted in its 2015 Technical Systems Audit of South Carolina that
the state's ozone validation criteria did not conform to the Quality Assurance
Handbook. However, South Carolina continues to use critical criteria for data
validation that are less stringent than the EPA's recommended critical criteria.
Variation in the use of acceptance criteria and critical quality control checks by
monitoring agencies to validate data reported to the EPA can impact the integrity
of the data used to make decisions regarding compliance with NAAQS.
Risk That Other Air-Monitoring Agencies Are Not Following
EPA-Recommended Practices
During our review, we found data indicating a risk that other monitoring agencies
are not implementing EPA-recommended data processing practices. We found
differences between data reported to the AQS and real-time data reported to
AirNow, and we identified QAPPs that had not been approved since the 2013
version of the Quality Assurance Handbook was issued.
Based on our analysis, about 26 percent of the AQS hourly ozone data differed
from the corresponding real-time data reported in AirNow. There are a number of
reasons for such differences. For example, monitoring agencies can find certain
data reported in real time to AirNow to be invalid and, therefore, not report the
data to the AQS. Further, monitoring agencies could apply different conventions
for rounding or truncating raw data before reporting to either database. However,
because we confirmed that at least some of these differences were due to data
adjustment practices, there is a risk that other monitoring agencies may have
made adjustments to the raw monitoring data before they were reported to the
AQS. These adjustments can impact the EPA's ability to assess data quality, and
determine whether the data are sufficient and comparable for making designation
decisions.
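
The rounding-versus-truncation explanation above is easy to illustrate with an invented raw reading:

    # An invented raw hourly reading shows how two reporting conventions
    # can legitimately yield different stored values for the same hour.
    raw_ppb = 70.6
    print(round(raw_ppb))  # 71 under a rounding convention
    print(int(raw_ppb))    # 70 under a truncation convention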
Air-monitoring agencies develop QAPPs that should identify their quality control
procedures and data-validation criteria. The EPA's critical criteria for zero checks
changed significantly in 2013. Thus, there is a risk that QAPPs that have not been
approved in the last 5 years have not been updated to include the EPA's revised
criteria. Based on data in the AQS, about 38 percent of monitoring agencies do
not have ozone monitoring QAPPs that have been approved within the last
5 years.
Agency Actions Prompted by OIG Work
In November 2016, OAQPS began conducting a review of hourly ozone data in
AirNow and the AQS from 2012-2015, to determine the risk of any data
adjustments potentially impacting data that could be used in the EPA's upcoming
designation determinations. Preliminary results from the OAQPS analysis
identified some differences between data in the AQS and AirNow for monitoring
locations in some states. OAQPS intends to expand its analysis to look at design
values and potential impacts on designation determinations. OAQPS is also in the
process of polling its regional offices to develop a list of monitoring agencies that
perform zero adjustments to ozone-monitoring data.
Conclusion
Ozone data in Georgia and South Carolina were not always processed according
to EPA-recommended practices, and there is a risk that other monitoring agencies
do not always process data in accordance with EPA-recommended practices. As a
result, ozone data that the EPA plans to use in determining whether ozone levels
present a health risk to the public (i.e., designation decisions) could have been
processed in a manner that does not achieve the data quality expected by the EPA,
or in a manner that was not comparable across different monitoring agencies.
Designation determinations can have significant implications for public health
and an area's economy. It is important that the EPA has assurance that its
designation decisions are based on data that have been processed using comparable
quality control procedures that meet the criteria in the Quality Assurance
Handbook.
Pending completion of our ongoing work, we are making no recommendations.
We are alerting the EPA to a potential risk in the use of ozone data for its 2017
designation process, so that the agency can take appropriate steps to further assess
and mitigate risks. Based on the discussion of our initial results with the EPA, the
agency has started to take actions to assess the risk that any ozone data
adjustments could pose to the designation process.
Distribution
The Administrator
Assistant Administrator for Air and Radiation
Agency Follow-Up Official (the CFO)
Agency Follow-Up Coordinator
General Counsel
Associate Administrator for Congressional and Intergovernmental Relations
Associate Administrator for Public Affairs
Regional Administrator, Region 4
Deputy Assistant Administrator for Air and Radiation
Director, Office of Regional Operations
Audit Follow-Up Coordinator, Office of the Administrator
Audit Follow-Up Coordinator, Office of Air and Radiation
Audit Follow-Up Coordinator, Region 4