U.S. ENVIRONMENTAL PROTECTION AGENCY
OFFICE OF INSPECTOR GENERAL
Improving Air Quality
EPA Effectively Screens
Air Emissions Data from
Continuous Monitoring Systems
but Could Enhance Verification
of System Performance
Report No. 19-P-0207
June 27, 2019

-------
Report Contributors:	Andrew Lavenburg
Bao Chuong
Jim Hatfield
Erica Hauck
Abbreviations
ARP	Acid Rain Program
CAMD	Clean Air Markets Division
CEMS	Continuous Emissions Monitoring System
CFR	Code of Federal Regulations
CSAPR	Cross-State Air Pollution Rule
ECMPS	Emissions Collection and Monitoring Plan System
EGU	Electric Generating Unit
EPA	U.S. Environmental Protection Agency
FACT	Field Audit Checklist Tool
NOx	Nitrogen oxides
OIG	Office of Inspector General
QA	Quality Assurance
QC	Quality Control
RATA	Relative Accuracy Test Audit
SO2	Sulfur Dioxide
Cover photos: Clockwise, from top left: Typical coal-fired power plant, continuous emissions
monitoring analyzers, continuous emissions monitor and stack flow monitor.
(EPA photos)
Are you aware of fraud, waste or abuse in an
EPA program?
EPA Inspector General Hotline
1200 Pennsylvania Avenue, NW (2431T)
Washington, DC 20460
(888) 546-8740
(202) 566-2599 (fax)
OIG_Hotline@epa.gov
Learn more about our OIG Hotline.
EPA Office of Inspector General
1200 Pennsylvania Avenue, NW (2410T)
Washington, DC 20460
(202) 566-2391
www.epa.gov/oig
Subscribe to our Email Updates
Follow us on Twitter @EPAoig
Send us your Project Suggestions

-------
U.S. Environmental Protection Agency
Office of Inspector General
At a Glance
19-P-0207
June 27, 2019
Why We Did This Project
We conducted this audit to
determine whether selected
continuous emissions monitoring
data meet applicable quality
assurance and quality control
criteria.
Continuous emissions monitoring
systems (CEMSs) are required
under some U.S. Environmental
Protection Agency (EPA)
regulations and programs to
continuously monitor actual
emissions from stationary
sources. Two programs that
require the use of CEMSs are the
Acid Rain Program (ARP) and
the Cross-State Air Pollution
Rule (CSAPR), which are
intended to reduce emissions of
sulfur dioxide and nitrogen
oxides. These emissions trading
programs are designed to
improve air quality by setting
emissions limits and monitoring
emissions from power plants. It is
important that the CEMS data
reported to the EPA be accurate
and meet regulatory
requirements because these data
are used to assess compliance
with trading program emission
limits and progress toward
environmental goals.
This report addresses the
following:
• Improving air quality.
EPA Effectively Screens Air Emissions Data
from Continuous Monitoring Systems but Could
Enhance Verification of System Performance
What We Found
The EPA's automated screening of facility-reported
CEMS data worked as intended and was effective in
verifying the quality of the reported data. However,
we found a small number of inaccuracies and
inconsistencies in the reported data. While these
instances had no impact on whether the data met
quality assurance requirements, the inaccurate data
could have a negative impact on data users by
providing inaccurate or misleading information. The
EPA can prevent these problems by adding specific
screening checks to its existing reporting software.
Although the EPA's automated screening process was effective, the validity of
the reported data can only be fully established when that process is
supplemented with on-site field audits to verify that CEMS monitoring
requirements were met. However, we found that the EPA and state agencies
conducted a limited number of these audits. Out of over 1,000 facilities subject
to ARP and/or CSAPR requirements, the EPA conducted field audits at only
16 facilities between 2016 and the end of June 2018. In addition, nine of the
10 state agencies we contacted were not conducting field audits. In response to
our work, the EPA initiated a process to develop a streamlined CEMS field audit
approach that state and local agencies can use when conducting other on-site
visits at facilities.
Recommendations and Planned Agency Corrective Actions
We recommend that the Assistant Administrator for Air and Radiation develop
and implement electronic checks to retroactively evaluate CEMS data where
monitoring plan changes have occurred, and develop and distribute to state and
local agencies a streamlined field audit process. The EPA agreed with our
recommendations and provided acceptable corrective actions and completion
dates. All recommendations are considered resolved with corrective actions
pending.
Data from CEMS are
used to determine
whether sources, such
as power plants,
comply with emissions
limits designed to
improve air quality and
achieve environmental
and public health goals.
Address inquiries to our public
affairs office at (202) 566-2391 or
OIG_WEBCOMMENTS@epa.gov.
List of OIG reports.

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF
INSPECTOR GENERAL
June 27, 2019
MEMORANDUM
SUBJECT: EPA Effectively Screens Air Emissions Data from Continuous Monitoring Systems
but Could Enhance Verification of System Performance
Report No. 19-P-0207
FROM:	Charles J. Sheehan, Deputy Inspector General

TO:	William Wehrum, Assistant Administrator
	Office of Air and Radiation

This is our report on the subject audit conducted by the Office of Inspector General (OIG) of the
U.S. Environmental Protection Agency (EPA). The project number for this audit was
OA&E-FY18-0181. This report contains findings that describe the problems the OIG has identified and
corrective actions the OIG recommends. This report represents the opinion of the OIG and does not
necessarily represent the final EPA position. Final determinations on matters in this report will be made
by EPA managers in accordance with established audit resolution procedures.

The Office of Air and Radiation's Clean Air Markets Division is the office responsible for the issues
discussed in this report.

In accordance with EPA Manual 2750, your office provided acceptable corrective actions and milestone
dates in response to OIG recommendations. All recommendations are resolved and no final response to
this report is required. However, if you submit a response, it will be posted on the OIG's website, along
with our memorandum commenting on your response. Your response should be provided as an Adobe
PDF file that complies with the accessibility requirements of Section 508 of the Rehabilitation Act of
1973, as amended. The final response should not contain data that you do not want to be released to the
public; if your response contains such data, you should identify the data for redaction or removal along
with corresponding justification.
We will post this report to our website at www.epa.gov/oig.

-------
EPA Effectively Screens Air Emissions Data
from Continuous Monitoring Systems but Could
Enhance Verification of System Performance
Table of Contents
Chapters
1	Introduction		1
Purpose		1
Background		1
Responsible Office		4
Scope and Methodology		5
Prior OIG Report		6
2	EPA Automated Screening of CEMS Data Is Effective
but Could Be Enhanced to Reduce Minor Inaccuracies		7
CEMS Data Electronically Reported and Screened		7
Electronic Data Quality Checks on Reported Data
Worked as Intended		7
EPA Can Enhance Its Data Quality Checks to Reduce Risks of
Inaccurate or Inconsistent CEMS Data		10
Conclusions		12
Recommendation		13
Agency Response and OIG Evaluation		13
3	EPA Should Develop a Streamlined On-Site Verification Approach
to Maximize State Participation		14
Field Audits and On-Site Verification of CEMS Intended to Verify
Performance of CEMS		14
EPA and State Agencies Conducted a Limited Number of Field Audits		15
CAMD Targets Audits Based on Several Risk-Based Factors and
Has Taken Steps to Better Document Its Selection Procedures		15
CAMD Should Develop a Streamlined Approach for
On-Site Verification		16
Field Audits Can Identify Problems Not Otherwise Detected and
Verify that Facilities Submit Valid Data to EPA		17
Conclusions		17
Recommendation		18
Agency Response and OIG Evaluation		18
Status of Recommendations and Potential Monetary Benefits		19
Appendices
A Agency's Response to Draft Report	 20
B Distribution	 22

-------
Chapter 1
Introduction
Purpose
The Office of Inspector General (OIG) for the U.S. Environmental Protection
Agency (EPA) conducted this audit to determine whether selected continuous
emissions monitoring data meet applicable quality assurance (QA) and quality
control (QC) criteria.
Background
Continuous emissions monitoring involves sampling emissions at pollution
sources on an ongoing, or continuous, basis. A continuous emissions monitoring
system (CEMS) measures actual emissions levels from a stationary source and
includes all equipment required to continuously sample, analyze and provide a
permanent record of stack emissions. CEMSs are required under some EPA
regulations and programs for either continual compliance determinations or
determinations of exceedances of the emissions standards. Two EPA programs
that require continuous emissions monitoring are the Acid Rain Program (ARP)
and the Cross-State Air Pollution Rule (CSAPR).
EPA Acid Rain Program and Cross-State Air Pollution Rule
The ARP and CSAPR are emissions trading programs designed to reduce
emissions of sulfur dioxide (SO2) and nitrogen oxides (NOx). Both programs
apply to large electric generating units (EGUs)
that burn fossil fuels to generate electricity for
sale (i.e., power plants).
The ARP, established under Title IV of the 1990
Clean Air Act Amendments, requires major
emissions reductions of SO2 and NOx—the
primary precursors of acid rain—from power
plants.
CSAPR requires certain states in the eastern half
of the United States to improve air quality by
reducing SO2 and NOx power plant emissions that
cross state lines and contribute to pollution in
downwind states. These improvements help downwind areas attain and maintain EPA health-based
air quality standards, known as National Ambient Air Quality Standards.

Affected units under ARP and CSAPR in 2015
ARP
•	3,520 EGUs at 1,226 facilities subject to SO2 requirements.
•	795 EGUs at 336 facilities subject to NOx requirements.
CSAPR
•	2,820 affected EGUs at 864 facilities in SO2 program and NOx annual program.
•	3,228 affected EGUs at 946 facilities in NOx ozone season program.
Source: 2015 Program Progress - Cross-State Air Pollution Rule and Acid Rain Program.
Thousands of sources nationwide are subject to
ARP and/or CSAPR requirements. SO2 and NOx
emissions from these sources can contribute to the
formation of acid rain, fine particulate matter and
ozone, which can negatively impact a person's
respiratory system. Fine particulate matter
emissions can also negatively impact people with
heart disease and are a main cause of reduced
visibility (haze) in many parts of the United States.
Both fine particulate matter and acid rain harm
sensitive ecosystems such as lakes and forests.
Both the ARP and CSAPR incorporate the use of
emissions allowances. Allowances authorize a
certain amount of pollution to be emitted by a source
and can be bought and sold among sources subject to
the programs ("allowance trading"). Emissions must
be monitored continuously during the compliance
period because emissions allowances are based on
the total mass of a pollutant emitted over a certain time period. Complete and
accurate monitoring, reporting and auditing of emissions are key to the EPA's
ability to ensure that the ARP and CSAPR programs function as intended.
Continuous Emissions Monitoring Requirements per 40 CFR Part 75
Sources subject to the ARP or CSAPR must follow the monitoring regulations
in 40 CFR Part 75, which requires continuous
monitoring and reporting of SO2, carbon
dioxide and NOx emissions. Most of these
emissions are measured with CEMSs, which
monitor important information such as the
amount of pollution emitted from a smokestack
and how fast the emissions occur. Included in
40 CFR Part 75 are requirements intended to:
•	Ensure that the emissions from all
sources are consistently and accurately
measured and reported.
•	Produce a complete record of emissions
data for each unit subject to the ARP or
CSAPR and also subject to Part 75
requirements.
Emissions trading programs
Emissions trading, sometimes referred to
as "cap and trade" or "allowance trading," is
an approach to reducing pollution.
Emissions trading programs work by first
setting a national or regional limit on the
overall amount of pollution that sources can
emit to the environment. Affected sources
included in the trading program, such as
power plants, then receive allowances that
authorize a certain amount of pollution.
For example, in the ARP, each allowance
authorizes a source to emit one ton of SO2.
A source can decide whether to use an
allowance for compliance, sell it to another
source, or save the allowance for
compliance in the future.
To be in compliance, a source must hold
enough allowances at the end of a
compliance period to account for the
amount of pollution it emitted. If all sources
are collectively in compliance, total
emissions will be at or below the overall
emissions limit.
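The compliance test described in the box above reduces to a simple comparison of allowances held against tons emitted. The sketch below is illustrative only; the allowance and emissions figures are hypothetical, not drawn from any actual source.

```python
def in_compliance(allowances_held: int, tons_emitted: int) -> bool:
    """In the ARP, one allowance authorizes one ton of SO2 emissions.
    A source is in compliance if, at the end of the compliance period,
    it holds at least as many allowances as tons it emitted."""
    return allowances_held >= tons_emitted

# Hypothetical figures for illustration.
print(in_compliance(allowances_held=1200, tons_emitted=1150))  # True
print(in_compliance(allowances_held=1200, tons_emitted=1300))  # False
```

A source projected to fail this test must either reduce emissions or buy additional allowances from another source before the compliance deadline.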
Typical coal-fired power plant; such a
facility may have multiple units subject
to the ARP, CSAPR and 40 CFR
Part 75 monitoring requirements.
(EPA photo)

-------
•	Ensure that emissions are not underestimated.
•	Verify that emissions caps are not exceeded.
Further, 40 CFR Part 75 requires several key ongoing QA/QC tests for CEMSs to
ensure the continued accuracy of the emissions data. Three of the tests that are used
for CEMSs that measure SO2 and NOx include:
1.	Calibration error tests compare CEMS data to known reference gas
concentrations to determine whether the amount of error in the CEMS data
is within acceptable limits established by the EPA. These tests are required
to be conducted daily at two reference gas concentrations.
2.	Linearity checks also compare CEMS data to known reference gas
concentrations but do so at three different reference gas concentrations
along the full scale of the CEMS (low, mid and high reference gas
concentrations). Linearity tests are required to be conducted once each
calendar quarter.
3.	Relative accuracy test audits (RATAs) compare CEMS data to data from
independent, EPA-approved emissions monitoring methods (referred to as
reference methods). These tests are required to be conducted semiannually
or annually.
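The daily calibration error comparison in item 1 can be sketched as follows. The formula (absolute difference between the reference gas concentration and the monitor reading, expressed as a percentage of the monitor span) follows the Part 75 convention, but the span, gas values and the 2.5 percent limit shown here are illustrative assumptions rather than values taken from this audit.

```python
def calibration_error_pct(reference_ppm: float, measured_ppm: float,
                          span_ppm: float) -> float:
    """Calibration error as a percentage of monitor span: |R - A| / S * 100."""
    return abs(reference_ppm - measured_ppm) / span_ppm * 100.0

# Hypothetical example: a monitor with a 500 ppm span reads 294.0 ppm
# against a 300.0 ppm reference gas.
error = calibration_error_pct(300.0, 294.0, 500.0)
print(round(error, 2))   # 1.2 (percent of span)
print(error <= 2.5)      # True: within an illustrative 2.5% performance limit
```

A linearity check applies the same kind of comparison at three reference gas concentrations (low, mid and high) instead of two.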
Facilities are required to report electronically to the
EPA their monitoring-related data, including a
monitoring plan, and results of required QA/QC
tests. Facilities report this information to the EPA
using an electronic reporting system called the
Emissions Collection and Monitoring Plan System
(ECMPS). It is important for reported Part 75 CEMS
data to be accurate and meet regulatory requirements
because these data are used to assess compliance
with trading program emissions limits and progress
toward environmental goals. Accurate data are also
important to verify the integrity of the allowances
that are bought and sold under the cap and trade
programs. EPA staff told us that the agency places a high priority on accounting
for all emissions and has developed a "comprehensive, holistic" approach to
overseeing the quality of Part 75 data.
EPA Process for Verifying CEMS Data Quality
The data quality process for a CEMS includes several activities spanning from the
operation of the system at the source facility to the reporting of data to the EPA.
These include proper maintenance and operation of the CEMS, required QA/QC
tests to verify the accuracy of the monitors, recording and storing electroni c
Stack testers performing a RATA. (EPA photo)
19-P-0207
3

-------
monitoring and operating data, and reporting CEMS data to the EPA. The
integrity of the emissions trading programs can break down anywhere along the
QA chain of activities, and thus the EPA uses a combination of electronic and
field auditing to verify the overall integrity of the emissions monitoring data. The
EPA's Clean Air Markets Division (CAMD), which administers the ARP and
CSAPR programs, undertakes several types of activities to oversee the quality of
facility-reported CEMS data, including:
•	Requiring that affected sources report complete data using the detailed
electronic formatting reports in the ECMPS.
•	Automated screening of facility-reported CEMS data, with electronic
QA checks that are programmed into the ECMPS.
•	Statistical analyses, ad-hoc QA checks and desk audits performed by
CAMD staff on the reported data from the ECMPS.
•	Field audits, which are conducted on-site to verify a facility's CEMS
performance and compliance with monitoring requirements.
•	Training and technical assistance for facilities and EPA regional and
state/local agency personnel.
We focused our work primarily on the automated screening checks in the ECMPS
and on-site field audits. The EPA uses automated screening checks to verify data
quality once the data from the CEMS have been recorded and/or reported to the
EPA, while field audits are used to verify on-site conditions and performance of
the CEMS. Figure 1 provides an overview of where in the process the EPA uses
automated screening checks and field audits to oversee CEMS data quality.
Figure 1: Areas where EPA uses automated checks and field audits to oversee the quality of CEMS data
•	CEMS routine operation and maintenance: field audits.
•	Ongoing QA/QC tests to verify the accuracy of monitors: field audits and electronic screening checks.
•	Collection and storage of data from the CEMS: field audits and electronic screening checks.
•	Transfer of CEMS data to ECMPS, and reporting to EPA: electronic screening checks.
Source: OIG analysis.
Responsible Office
CAMD, within the Office of Air and Radiation, manages programs that reduce air
pollution from power plants to address several environmental problems. These
include programs to address acid rain, ozone and particle pollution, and the
movement of air pollution across state lines. Programs that CAMD is responsible
for include the ARP and CSAPR. As such, CAMD is also responsible for assuring
the quality of monitoring data reported under these programs.

-------
Scope and Methodology
We conducted our audit from April 2018 through May 2019 in accordance with
generally accepted government auditing standards. Those standards require that we
plan and perform the audit to obtain sufficient, appropriate evidence to provide a
reasonable basis for our findings and conclusions based on our objective. We
believe that the evidence obtained provides a reasonable basis for our findings and
conclusions based on our objective.
To determine whether selected CEMS data meet applicable QA/QC criteria, we
evaluated both the automated screening and the field audit aspects of the EPA's
QA process through a review of monitoring data, field audit reports, and requests
for information from EPA regions and state agencies.
To evaluate the automated data screening process, we selected a sample of units
subject to ARP and/or CSAPR that had CEMSs in place to monitor both NOx and
SO2. The team identified a universe of 725 affected units subject to the EPA's
ARP or CSAPR that used CEMSs to monitor for both SO2 and NOx emissions
under 40 CFR Part 75 monitoring requirements. From this universe, we reviewed
77 randomly selected units.1 We then reviewed data reported to the ECMPS for
the CEMSs in our sample to determine whether the CEMSs were meeting key QA
requirements and the data were consistent with selected EPA reporting
instructions.
For the units in our sample, we obtained emissions monitoring and applicable QA
data that were collected and reported to the EPA between January 1, 2016, and
March 31, 2018. Most data were obtained from the EPA's Field Audit Checklist
Tool (FACT). FACT is a publicly available Windows desktop application that
allows users to easily view monitoring plans, and QA and emissions data that are
reported to the ECMPS by sources subject to Part 75 monitoring requirements.
Data for linearity checks and RATAs were provided to the OIG by CAMD
directly from the ECMPS.
We evaluated the data to determine whether the CEMSs operating on units in our
sample were meeting certain QA requirements for relative accuracy, quarterly
linearity checks and daily calibration.2 Where monitors did not meet required
performance specifications for these elements, we reviewed monitoring data to
determine whether the data were properly characterized to reflect periods where
CEMSs were not meeting QA requirements. Additionally, we verified reported
test results against the supporting data associated with the test (i.e., a test labeled
1	We randomly selected 85 units for review but found that eight of those units were no longer operating. These units
were removed from our sample, and we reviewed the remaining 77 units.
2	40 CFR Part 75 includes requirements for six main QA tests: calibration error tests, interference checks, flow-to-
load ratio, leak checks, linearity checks and RATAs. We chose not to focus on interference checks, flow-to-load
ratio or leak checks because these checks are used to test flow monitors. We were primarily focused on QA tests for
SO2 and NOx concentration monitors.

-------
as passing included results to support that characterization), checked to see
whether appropriate values were reported with test results, and verified certain
calculations used to determine compliance with QA performance specifications.
To evaluate the field audit component of the EPA's oversight process for
reviewing CEMS data quality, we requested that the EPA provide to us all field
audits conducted by CAMD or its contractor from January 1, 2016, through
March 31, 2018. We reviewed these audit reports to identify the types of findings
and recommendations being made in the audits. Additionally, we obtained results
for CAMD's Targeting Tool for Field Audits3 for each quarter from January 1,
2016, through March 31, 2018. Based on the Targeting Tool for Field Audits
results, we identified a sample of 12 facilities and contacted CAMD, EPA regions
and state agencies to determine whether any facilities in our sample had been
audited.
Prior OIG Report
Our office has not previously conducted any audits that directly addressed
whether CEMS data were meeting QA/QC requirements. However, we reported
in September 20094 that the EPA did not have reasonable assurance that the gases
used to calibrate emissions monitors for the ARP and continuous ambient
monitors for the nation's air monitoring network were accurate. We
recommended that the Office of Air and Radiation implement oversight programs
to assure the quality of the EPA protocol gases used to calibrate CEMSs and also
that the EPA's Office of Research and Development update and maintain the
protocol gas procedures. In response to the report, the Office of Air and Radiation
promulgated a final rule establishing a largely self-supported Protocol
Verification Gas Program5 and implemented a plan to have laboratories conduct
routine protocol gas verification activities and communicate results to the EPA.
3	The Targeting Tool for Field Audits, developed by CAMD and its contractor, identifies potential candidates for
field audits based on eight data-quality-related factors.
4	Report No. 09-P-0235. EPA Needs an Oversight Program for Protocol Gases, issued September 16, 2009.
5	Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing, 76 Fed.
Reg. 17288 (Mar. 28, 2011).

-------
Chapter 2
EPA Automated Screening of CEMS Data Is Effective
but Could Be Enhanced to Reduce Minor Inaccuracies
The EPA's automated process for screening CEMS data reported to the EPA
worked as intended and was effective in verifying the quality of reported data.
However, we identified minor inaccuracies in some of the reported data. While
these inaccuracies had no impact on whether the data met QA requirements, the
inaccurate data could have negative impacts on data users. For example, users
could use inaccurate data in independent calculations or could be unable to
accurately query the database. The EPA can prevent the inaccuracies by adding
specific screening checks to its existing reporting software.
CEMS Data Electronically Reported and Screened
The EPA's electronic reporting software for CEMS data—ECMPS—and the
built-in QA checks in the software are significant elements of the agency's process
for verifying the quality of data that facilities report to the EPA. CAMD provides
the ECMPS software for facilities to submit monitoring plans, QA test results, and
emissions and operations data. The software includes thousands of automated QA
checks designed to verify that the reported data are complete, properly formatted,
mathematically correct, consistent with program requirements, and in accordance
with the methods and systems specified in the monitoring plan. For example, for
each of the CEMS QA/QC tests we reviewed, the owner/operator reports data from
that test along with a test result stating whether the CEMS "passed" or "failed."
The automated ECMPS checks are intended to evaluate whether the QA/QC data
reported for the test ("passed" or "failed") were accurate.
When a facility enters CEMS data into the ECMPS, the ECMPS completes a QA
assessment of the data files and generates a feedback report identifying any errors.
According to the EPA, errors deemed "critical" by the ECMPS checks must be
corrected before the ECMPS allows the data to be submitted to the EPA.
Electronic Data Quality Checks on Reported Data Worked as Intended
Based on our analyses of data for three key QA/QC tests (daily calibration error
checks, quarterly linearity checks and RATAs), we believe the automated
screening checks the EPA had in place in the ECMPS were effective in verifying
that the reported data met QA requirements. Specifically, we found that:
• All facility-reported test results ("passed" or "failed") were supported by
the underlying QA/QC data.

-------
•	Data reported in the ECMPS showed that RATA and linearity checks in
our sample were conducted within the time frames required by 40 CFR
Part 75.
•	Reference gas concentrations for daily calibration error checks and
quarterly linearity checks were within required ranges.
Cumulatively, these findings demonstrated that the EPA's electronic checks were
working as intended and were effective in verifying that reported data met key
program requirements.
Test Results Supported by Underlying Test Data in ECMPS
We reviewed data in our sample against performance criteria for three key, ongoing
QA tests on NOx and SO2 CEMSs that are required by 40 CFR Part 75: daily
calibration error checks, quarterly linearity checks and semiannual or annual
RATAs. For each of these tests, the EPA identifies performance specifications6
used to evaluate the acceptability of the CEMSs. CEMSs must meet the
performance specifications for valid emissions monitoring data to be reported from
the CEMSs. For each test, the EPA provides an alternate performance specification
that can satisfy the QA requirements in cases where the primary, or standard,
performance specification is not met. If either the standard or alternate performance
specification is met, the CEMS is considered to have met the QA requirements and
passed the test. Table 1 shows the QA/QC test results for the 77 CEMS units in our
sample as they were reported to the ECMPS.
Table 1: Reported QA/QC test pass/fail rates for CEMS units in our sample

Test result
reported to ECMPS	Daily calibration	Quarterly linearity	Annual/semiannual RATA
Passed	228,779 (98.98%)	2,208 (98.97%)	881 (99.32%)
Failed or aborted	2,353 (1.02%)	23 (1.03%)	6 (0.68%)
Total	231,132 (100%)	2,231 (100%)	887 (100%)

Source: OIG analysis of CEMS data provided by CAMD and/or obtained via EPA's FACT database.
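As described above, a test result counts as passing if the CEMS meets either the standard performance specification or the alternate specification. A minimal sketch of that pass/fail logic follows; the limit values shown (2.5 percent of span and 5.0 ppm absolute difference) are illustrative defaults, since the actual limits under 40 CFR Part 75 vary by test and monitor type.

```python
def meets_qa_spec(error_pct_of_span: float, abs_difference_ppm: float,
                  standard_limit_pct: float = 2.5,
                  alternate_limit_ppm: float = 5.0) -> bool:
    """A QA test passes if either the standard specification (error as a
    percentage of span) or the alternate specification (absolute difference
    in concentration units) is satisfied. Limit values are illustrative."""
    return (error_pct_of_span <= standard_limit_pct
            or abs_difference_ppm <= alternate_limit_ppm)

# A low-span monitor can show a large percent-of-span error while the
# absolute measurement error stays tiny, so it still passes on the
# alternate specification.
print(meets_qa_spec(error_pct_of_span=4.0, abs_difference_ppm=2.0))  # True
print(meets_qa_spec(error_pct_of_span=4.0, abs_difference_ppm=6.0))  # False
```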
As shown in Table 1, most of the CEMS QA/QC test results for the units in our
sample were reported as passing the performance specifications for the three key
QA/QC tests examined. To evaluate whether the ECMPS checks were effective in
verifying that these test results were correctly reported by facilities, we used the
reported data for each QA/QC test to independently calculate whether the tests
met performance specifications and test results were correctly characterized by
the reporting facility as either passing or failing. Our review included 231,132
daily calibration error tests, 2,231 linearity tests and 887 RATAs from the
6 These are thresholds identified by the EPA in 40 CFR Part 75 that define the amount of CEMS measurement error
permitted for each QA/QC test.

-------
77 units in our sample. We verified that 100 percent of the reported test results in
our sample were supported by the QA/QC test data reported.
Frequency of RATA and Linearity Tests Complied with Required
Time Frames
In addition to the performance specifications required for each QA/QC test, the
EPA requires that the tests be conducted at certain intervals or within specific
time frames as part of its QA requirements. For the data in our sample, we found
semiannual/annual RATA and quarterly linearity check tests were conducted
within time frames required by 40 CFR Part 75 in nearly all cases.7 In rare
instances where tests did not occur within required time frames, facilities
followed applicable reporting requirements in accordance with 40 CFR Part 75.
Reference Gas Concentrations for Daily Calibration Error Checks
and Quarterly Linearity Checks Were Within Required Ranges
The EPA requires that CEMSs be
tested with certified reference gases at
certain concentration ranges,
depending on the span8 of the monitor,
for both daily calibration error checks
and quarterly linearity checks. We
reviewed the reported test result data
for daily calibration error checks and
linearity checks to determine whether
the reference gas concentrations for
each test met the requirements in
40 CFR Part 75. We found that the
reference gas concentrations used for
these tests were within the required
ranges. However, we found some
instances where incorrect span data were displayed in the FACT database. The
data were reported correctly in the ECMPS, and we were also able to verify the
correct values in facility monitoring plans. Therefore, these issues did not affect
the validity of the data. As a result of our work, CAMD corrected the FACT
display issues in an updated version of FACT released on December 17, 2018.
Calibration gas cylinders. (EPA photo)
7 We evaluated the time frames for semiannual/annual RATA and quarterly linearity checks in our sample but did
not evaluate this aspect of the daily calibration error checks.
8 Span means the highest pollutant or diluent concentration or flow rate that a monitor component is required to be
capable of measuring under Part 75. 40 CFR § 72.2.

-------
EPA Can Enhance Its Data Quality Checks to Reduce Risks of
Inaccurate or Inconsistent CEMS Data
Although the automated screening checks the EPA had in place were effective in
verifying that reported data were consistent with key program requirements, we
found a small number of inaccuracies and inconsistencies in the reported data that
could be improved with enhanced ECMPS checks. In less than 1 percent of the
records we reviewed, we found situations where monitor spans reported in the
ECMPS did not match the span in the applicable monitoring plan. Also, for
approximately 2.4 percent of the QA test records we reviewed, facilities did not
accurately report which performance standard a CEMS passed during a required
QA test. In both types of situations, the EPA's ECMPS software did not have
screening checks in place at the time of our data review that were designed to
identify these types of issues. However, CAMD has started implementing
corrective actions to address these issues.
Monitoring Plan Changes Were Not Accurately Reflected in a
Small Number of Reported Daily Calibration Error Checks
We found three facilities where a small percentage of reported daily calibration
error values were not consistent with independently calculated values—that is, the
daily calibration error values reported by these facilities did not match those that
the OIG independently calculated based on the monitor span and mean difference
values (reference concentration minus measured concentration) in the ECMPS. All three
facilities reported monitoring data successfully using one set of monitoring plan
span records. Span values for each monitor are important because they are used to
calculate calibration error. However, through subsequent monitoring plan
submissions, the facilities changed the underlying span records that applied to
previously reported data. This resulted in inaccurate (old) span values appearing
in the ECMPS that did not reflect the updated monitoring plans. When the OIG
used the span values in the ECMPS to independently calculate calibration errors,
our values did not match the reported values for some daily calibration error
results. Figure 2 summarizes the type of information included in facility
monitoring plans and why changes to monitoring systems should be updated in
monitoring plans and reported to the ECMPS.
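The OIG's cross-check can be sketched in a few lines. This is a hypothetical illustration, not actual OIG or ECMPS code: field names are invented, and the formula assumes the common Part 75 convention of expressing daily calibration error as the absolute difference between reference and measured concentrations as a percentage of span.

```python
# Illustrative sketch (assumed field names) of recomputing a daily
# calibration error from span and comparing it with the reported value.

def calibration_error_pct(reference: float, measured: float, span: float) -> float:
    """Calibration error expressed as a percentage of the monitor span."""
    return abs(reference - measured) / span * 100.0

def check_record(record: dict, tolerance: float = 0.01) -> bool:
    """Return True when the reported error matches the recalculated value."""
    recalculated = calibration_error_pct(
        record["reference_conc"], record["measured_conc"], record["span"]
    )
    return abs(recalculated - record["reported_error_pct"]) <= tolerance

# A record whose span record was later changed retroactively would fail
# this check when recalculated against the span value now in the ECMPS.
record = {"reference_conc": 50.0, "measured_conc": 49.0,
          "span": 100.0, "reported_error_pct": 1.0}
print(check_record(record))  # True: |50 - 49| / 100 * 100 = 1.0 percent
```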
Figure 2: Incorporating monitoring plan changes into ECMPS
•	The monitoring plan describes how a facility monitors its emissions.
•	Monitoring plan data define relationships between stacks, pipes and units; specify
locations at a facility from which emissions are monitored; and identify systems of
monitoring equipment by detailing the individual system components.
•	The monitoring plan is a "living" document in that it must be continuously updated to
reflect changes to the monitoring systems over time.
•	As technology advances, the monitors originally described in the monitoring plan may
be replaced or the monitoring methodology changed. Also, facility operations may
change and necessitate the use of additional monitors or alternative placement of
existing monitors.
•	For any modification, replacement or other change to an approved monitoring system
or monitoring methodology, the monitoring plan must be updated using the ECMPS
Client Tool.
•	Some elements included in monitoring plans (e.g., monitor span and range values) are
used to determine compliance with 40 CFR Part 75 QA requirements. Therefore, it is
important the ECMPS includes appropriate monitoring plan changes.
Source: OIG analysis.
CAMD stated that because the span changes in the monitoring plan submissions
were made after the evaluation and submission of the emissions file in the
ECMPS, it was difficult for the current version of the ECMPS to identify those
errors. We found this situation only in a very small number of daily calibration
error results that we reviewed (8 out of 231,132, or 0.003 percent). However,
because the ECMPS did not identify these types of situations, there is a risk of
more data points being subject to this type of error, particularly in a situation
where monitoring plan changes applied to more days in a calendar quarter than
the specific instances we saw. If the ECMPS is not able to reconcile monitoring
plan changes retroactively to applicable data that had been previously submitted,
there is a potential risk that the EPA's automated screening process would not
identify certain critical QA and data quality issues.
Based on the OIG's review, CAMD contacted the facilities to resolve the
discrepancies with their reported data and monitoring plans and had them
resubmit the applicable data. As of February 2019, all three facilities had
resubmitted data to the ECMPS to address the issue. In March 2019, CAMD
began implementing a multistep process to identify monitoring plan changes that
could affect previously reported data. According to the Chief of CAMD's
Emissions Monitoring Branch, in the long-term, CAMD plans to implement an
additional ECMPS check that forces retroactive monitoring plan changes to
require the reevaluation and resubmission of any affected QA/QC tests and hourly
emissions data. We believe that adding this type of check to the ECMPS should
result in the detection of monitoring plan changes (e.g., monitor span values) that
will address the inaccuracies we found.
For a Small Percentage of QA/QC Tests, Facilities Incorrectly
Reported Which Performance Standard Was Used to Pass the Test
A small percentage of QA/QC tests for which the monitors met required
performance specifications nonetheless were not accurately labeled in the ECMPS
as meeting either the primary or alternate performance specification. As noted
above, for each of the three QA/QC tests assessed, the EPA identifies both a
standard and alternate performance specification that the CEMS must meet to
produce valid data. According to the EPA's ECMPS reporting instructions, a test
result of "PASSED" should be reported when the test was passed using the
standard performance specification, and a test result of "PASSAPS" should be
reported when the test was passed using the alternate performance specification.
Although this was accurately reported for most test results we reviewed, a small
percentage of results were reported incorrectly, as shown in Table 2.
Table 2: QA/QC test results that did not correctly distinguish between passing the
standard or alternate performance specification

  QA/QC test                Total test results   Reported "PASSED"     Reported "PASSAPS"
                            reviewed             but should have       but should have
                                                 reported "PASSAPS"    reported "PASSED"
  Daily Calibration Error   231,132              5,720 (2.47%)         0 (0.00%)
  Linearity Checks          2,231                0 (0.00%)             1 (0.04%)
  RATA                      887                  5 (0.56%)             3 (0.34%)
  Total                     234,250              5,725 (2.44%)         4 (0.002%)

Source: OIG analysis of CEMS data provided by CAMD and/or obtained via EPA's FACT database.
While these situations do not impact the validity of data from the CEMS, they
could affect data users who seek to distinguish between the CEMS meeting either
the standard or alternate performance standards. As a result of our findings, in
March 2019, CAMD implemented a new ECMPS check to address this issue.
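The labeling logic the new check enforces can be sketched as follows. This is a hypothetical illustration, not CAMD's actual code: field names are invented, and the mapping simply mirrors the reporting instructions quoted above (standard specification met yields "PASSED"; only the alternate specification met yields "PASSAPS").

```python
# Illustrative sketch (assumed record layout) of deriving the result code
# a facility should have reported and flagging mislabeled test results.

def expected_result_code(met_standard_spec: bool, met_alternate_spec: bool) -> str:
    """Map the performance specification actually met to the result code."""
    if met_standard_spec:
        return "PASSED"    # passed using the standard specification
    if met_alternate_spec:
        return "PASSAPS"   # passed only under the alternate specification
    return "FAILED"

def flag_mislabels(records):
    """Return records whose reported code disagrees with the derived code."""
    return [r for r in records
            if r["reported"] != expected_result_code(r["met_standard"],
                                                     r["met_alternate"])]

records = [
    {"reported": "PASSED",  "met_standard": False, "met_alternate": True},  # mislabeled
    {"reported": "PASSAPS", "met_standard": False, "met_alternate": True},  # correct
]
print(len(flag_mislabels(records)))  # 1
```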
Conclusions
The EPA's existing electronic checks worked as intended and were effective in
verifying that data as reported to the EPA met minimum quality requirements.
However, we found a small number of inaccuracies and inconsistencies in the
reported data that, while having no impact on the validity of the data, could
provide data users with inaccurate or misleading information. The EPA has taken
steps to correct these issues but should finalize a long-term fix to add a check in
the ECMPS that forces retroactive monitoring plan changes to require reporting
entities to reevaluate and resubmit any affected QA/QC tests and hourly
emissions data.
Recommendation
We recommend that the Assistant Administrator for Air and Radiation:
1. Develop and implement electronic checks in the EPA's Emissions
Collection and Monitoring Plan System or through an alternative
mechanism to retroactively evaluate emissions and quality assurance data
in instances where monitoring plan changes are submitted after the
emissions and quality assurance data have already been accepted by the
EPA.
Agency Response and OIG Evaluation
The agency concurred with the recommendation and provided an acceptable
planned corrective action and completion date. CAMD began implementing a
multistep process to identify monitoring plan changes that could affect previously
reported data. As a longer-term corrective action, CAMD plans to implement an
automated check in the ECMPS requiring facilities to reevaluate and resubmit
affected data when facilities make retroactive span record changes.
Recommendation 1 is considered resolved with corrective actions pending.
Appendix A contains the agency's response to the draft report.
Chapter 3
EPA Should Develop a Streamlined On-Site
Verification Approach to Maximize State Participation
Although the EPA has an effective system for screening data that facilities report
to the EPA on the proper performance of monitoring systems, the EPA and states
conducted few field audits and on-site verifications to verify the integrity of that
data. The field audit process is critical in verifying proper performance of
monitoring systems at facilities subject to 40 CFR Part 75 requirements and
identifying problems that could lead to inaccurate emissions reporting. The EPA
has limited resources to conduct field audits, and most state agencies contacted
were not directly involved in conducting the types of comprehensive field audits
identified in the EPA's Part 75 CEMS Field Audit Manual.
Field Audits and On-Site Verification of CEMS Intended to Verify
Performance of CEMS
Field audits consist of activities primarily conducted on-site at a facility to verify
that a facility's CEMS is performing properly. The EPA considers field audits a
critical part of the process for verifying the quality of facility-reported CEMS
data. While the automated screening process described in Chapter 2 focuses on
data reported by a facility, a field audit is aimed
at evaluating the monitoring process to verify
whether it is performing in an optimal manner
to produce quality data. The EPA's Part 75
CEMS Field Audit Manual provides
recommended procedures and activities to be
conducted during an on-site audit. Some of
these activities include visually inspecting the
monitoring equipment, observing calibration
error tests, reviewing physical records including
a facility's QA/QC plan, and interviewing
facility personnel involved in monitoring.
There are no requirements in 40 CFR Part 75 for the EPA or state/local air
agencies to conduct Part 75 CEMS field audits, but the EPA expects state and
local agencies to play an integral role. For example, the EPA's Part 75 CEMS
Field Audit Manual states that the "EPA relies on State and local agencies to
conduct field audits of monitoring systems to assess the systems performance and
a source's compliance with monitoring requirements." Additionally, the Office of
Air and Radiation's 2018 National Program Manager guidance states that the
EPA expects state and local agencies to "[p]erform electronic and field audits of
monitor certifications, Part 75 continuous emissions monitoring systems (CEMS),
and emissions reporting by sources. States and locals should perform Part 75
CEMS field audits in accordance with the field audit manual."

According to the EPA's Part 75 CEMS Field Audit Manual, the integrity of the
emissions trading programs can break down anywhere along the QA chain of
activities, and thus the EPA uses a combination of electronic and field auditing
to verify the overall integrity of the emissions monitoring data.
EPA and State Agencies Conducted a Limited Number of Field Audits
From the start of 2016 to the end of June 2018, CAMD or its contractor conducted
Part 75 CEMS field audits at 16 facilities. In 2015, over 1,200 facilities were
subject to ARP and Part 75 CEMS requirements. CAMD has allocated limited
resources to conduct such audits. In 2016 and 2017, CAMD spent approximately
$60,000 per year to conduct eight and six audits, respectively, and
approximately $69,000 to conduct six audits in 2018. According to CAMD's
Chief of the Emissions Monitoring Branch, CAMD expects the amount of funding
for field audits to decrease in 2019 and future years.
Despite the EPA's expectation that state and local agencies play an integral role in
conducting field audits, only one of the 10 states we contacted (Michigan) told us
it conducts Part 75 field audits. However, even Michigan has not conducted any
Part 75 field audits recently; staff from the Michigan Department of
Environmental Quality said they have been focused on other requirements in
recent years. A manager within the Air Resources Division of another state (New
Hampshire) told us that while his staff do not conduct Part 75 audits per se, they
conduct onsite activities and verification that are equivalent to (or go beyond)
such audits every year at all six affected facilities in the state.
According to CAMD, key reasons why states do not conduct Part 75 field audits
are that there are no specific requirements for them to do so and because states
face competing priorities. Although CAMD told us that nothing precludes state or
local agencies from using Clean Air Act Section 105 grant funds9 to conduct
Part 75 field audits, such audits are not currently included in states' Section 105
grant commitments with the EPA. According to CAMD, Section 105 grant work
plans used to include state and local agency commitments to conduct Part 75 field
audits at 10 percent of the applicable facilities in their jurisdictions. However,
these commitments were removed sometime between 2004 and 2010.
CAMD Targets Audits Based on Several Risk-Based Factors and
Has Taken Steps to Better Document Its Selection Procedures
Due to the limited resources available to conduct field audits, CAMD told us it
selects facilities to audit based on several factors. These factors include facilities'
total emissions, operating history, monitoring methodology, control equipment,
anticipated retirement date and type of fuel combusted with priority given to coal-
burning facilities. CAMD also considers the interest of EPA regions or state/local
agencies in a facility, ECMPS errors, and ad-hoc audit results. In addition, CAMD
9 Section 105 of the Clean Air Act provides the EPA authority to administer grants to state and local air pollution
control agencies to support implementation of Clean Air Act activities.
uses results from its Targeting Tool for Field Audits, which identifies potential
candidates for field audits based on eight data-quality-related factors. However, at
the time of our fieldwork, the process was not documented in a standard
procedure.
We reviewed data from the EPA's Emissions & Generation Resource Integrated
Database (known as "eGRID")10 for the facilities subject to the 16 field audits
conducted by CAMD since 2016. We confirmed that these facilities were among
those with high electric generating capacity and high annual NOx and SO2
emissions, which CAMD told us are important factors in targeting facilities for
audits.
CAMD personnel told us that it would be difficult, given the number of factors
considered, to create a standard operating procedure with clear-cut criteria for
audit candidate selection. However, in response to our work, CAMD updated its
standard operating procedures to include guidance for selecting audit candidates,
as well as specific directions for CAMD analysts to document their assessment of
the candidate facilities (i.e., explanation for why a facility is or is not a good
candidate for a field audit) and provide comments and/or recommendations to the
field audit coordinator. We believe it is important to document factors considered
and any justifications for choosing an audit candidate. This documentation could
help inform future audit candidate selections, particularly in cases where certain
factors used in the justification of one audit candidate become linked to specific
risks or problems once audits are completed.
CAMD Should Develop a Streamlined Approach for On-Site Verification
While nine out of 10 states we contacted do not conduct full Part 75 field audits,
seven states told us that they conduct at least some CEMS-related activities
recommended in the EPA's CEMS Field Audit Manual during site visits to
conduct Clean Air Act full compliance evaluations. Some states also told us that
they review excess emissions and RATA reports and/or observe stack testing or
RATAs at facilities. We believe there is an opportunity for CAMD to coordinate
with the states to develop guidance and tools to conduct streamlined reviews
focusing on the highest-priority activities from the EPA's Part 75 field audit
manual. States can then apply a streamlined Part 75 CEMS review process during
full compliance evaluations or other onsite visits.
In response to our audit, as of March 2019, CAMD was developing procedures
for streamlined or focused audits to be included in the Part 75 Field Audit
Manual. The streamlined procedures highlight certain areas of Part 75 CEMSs to
review when a comprehensive CEMS audit is not possible. CAMD was in the
process of working with states to obtain feedback from the state agencies on the
new guidance. We believe CAMD should complete this process of consulting
10 The Emissions & Generation Resource Integrated Database is a comprehensive source of data on the
environmental characteristics of almost all electric power generated in the United States.
with the states to best assess what activities are the highest priority and the most
feasible to include in such a streamlined audit process. In developing this
streamlined review process, CAMD should also assess findings and
recommendations from its recent field audits to identify any common problem
areas at facilities that can be included in the review.
Field Audits Can Identify Problems Not Otherwise Detected and
Verify that Facilities Submit Valid Data to EPA
Although limited in number, field audits conducted by CAMD appeared valuable
in identifying on-site conditions to improve Part 75 CEMS QA. The 16 field
audits CAMD conducted between 2016 and June 2018 resulted in
50 recommendations for facilities to improve their Part 75 monitoring programs.
Nearly all these recommendations addressed conditions that would not have been
identified without on-site audits. Most findings and recommendations were
directed toward updating monitoring and/or QA/QC plans, recording events in
maintenance logs, or using proper substitute data procedures.11
An Environmental Engineer at CAMD told us that on-site review of facilities'
QA/QC plans is an important aspect of field audits. That individual said that
although Part 75 requires facilities to develop a QA/QC plan for Part 75 CEMS,
these plans are not required to be electronically submitted to the EPA. Therefore,
a field audit allows the EPA to verify that QA/QC plans are complete and that the
CEMS data reported electronically to the EPA are valid. When on-site audits and
verification of CEMS performance are lacking, the EPA does not have adequate
confirmation that the CEMSs are being operated in accordance with EPA
requirements and generating accurate data.
Conclusions
On-site audits of CEMS implementation and performance are important parts of
the QA process for verifying the quality of CEMS data reported to the EPA.
However, the EPA conducts a limited number of CEMS field audits, and most
state agencies we contacted were not directly involved in conducting
comprehensive Part 75 field audits. As a result of our findings, the EPA had taken
steps that we believe will help maximize its resources for conducting on-site
CEMS audits. These actions included developing documented procedures to
improve its processes for (1) tracking field audit recommendations and resulting
corrective actions and (2) choosing audit candidates. The EPA could encourage
more on-site review and verification of CEMSs by state agencies by providing
additional guidance so that states can incorporate streamlined on-site reviews of
Part 75 CEMSs into their existing on-site visits to facilities.
11 Although these field audits were successful in identifying recommendations, the EPA did not have an effective
system in place for tracking these recommendations and resulting corrective actions. In December 2018, CAMD
updated its process for tracking audit recommendations and corrective actions based on the OIG's audit. Tracking
recommendations and corrective actions could increase the effectiveness, and allow the EPA to better assess the
impacts, of the audits that CAMD conducts.
Recommendation
We recommend that the Assistant Administrator for Air and Radiation:
2. Develop and distribute to state and local agencies a streamlined field audit
process that agencies can use during full compliance evaluations or other
onsite visits at facilities.
Agency Response and OIG Evaluation
The agency concurred with the recommendation and provided an acceptable
planned corrective action and completion date. CAMD plans to develop a
streamlined audit procedure including a pre-audit tool to help state and local
agency personnel prepare for an audit. Recommendation 2 is considered resolved
with corrective actions pending. Appendix A contains the agency's response to
the draft report.
Status of Recommendations and
Potential Monetary Benefits

RECOMMENDATIONS

Rec. No. 1 (Page 13)
Subject: Develop and implement electronic checks in the EPA's Emissions Collection and
Monitoring Plan System or through an alternative mechanism to retroactively evaluate
emissions and quality assurance data in instances where monitoring plan changes are
submitted after the emissions and quality assurance data have already been accepted by
the EPA.
Status: R
Action Official: Assistant Administrator for Air and Radiation
Planned Completion Date: 3/31/25
Potential Monetary Benefits (in $000s): None

Rec. No. 2 (Page 18)
Subject: Develop and distribute to state and local agencies a streamlined field audit
process that agencies can use during full compliance evaluations or other onsite visits
at facilities.
Status: R
Action Official: Assistant Administrator for Air and Radiation
Planned Completion Date: 9/30/19
Potential Monetary Benefits (in $000s): None

C = Corrective action completed.
R = Recommendation resolved with corrective action pending.
U = Recommendation unresolved with resolution efforts in progress.
Appendix A
Agency's Response to Draft Report
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, DC 20460
May 24, 2019
OFFICE OF
AIR AND RADIATION
MEMORANDUM
SUBJECT: Response to Office of Inspector General Management Project No. OA&E-FY18-
0181, "EPA Effectively Screens Air Emissions Data from Continuous Monitoring
Systems but Could Enhance Verification of System Performance"
FROM: William L. Wehrum, Assistant Administrator
Thank you for the opportunity to review and comment on the Office of Inspector General's
(OIG's) report EPA Effectively Screens Air Emissions Data from Continuous Monitoring Systems
but Could Enhance Verification of System Performance. We appreciate the effort that the OIG has
made to alert the Office of Air and Radiation (OAR) to opportunities to enhance the quality of data
from continuous emissions monitoring systems (CEMS). We agree with the findings and
recommendations identified in the report and are grateful for OIG's engagement and review, as it
helped the Clean Air Markets Division (CAMD) make multiple improvements to its systems.
OIG noted that "EPA's electronic checks were working as intended and were effective in
verifying that reported data met key program requirements." OIG also noted that CEMS met
performance standards approximately 99 percent of the time for the three quality assurance tests
that underwent review but that there were some minor inaccuracies in the reported data. OIG
acknowledged that these inaccuracies did not affect the validity of the data but could impact data
users. CAMD has already made changes to its systems and procedures based on discussions with
OIG, and OIG has acknowledged these actions in its report, including:
•	Correcting the display of span data in EPA's Field Audit Checklist Tool (FACT) database
as of December 2018;
•	Adding a new Emissions Collection and Monitoring Plan System (ECMPS) check to ensure
that the correct labels are applied to quality assurance tests based on whether the test was
passed under the primary specification or alternate performance specification as of March
2019; and
•	Updating CAMD standard operating procedures (SOPs) to include general criteria for
selecting candidate facilities for field audit as of March 2019.
TO: James L. Hatfield
Director, Air Directorate
Office of Audit and Evaluation
CAMD looks forward to implementing additional actions in response to the two
recommendations listed in OIG's report. Below are OAR's responses to OIG's specific
recommendations.

Recommendation 1: Develop and implement electronic checks in the EPA's ECMPS or
through an alternative mechanism to retroactively evaluate emissions and quality assurance
data in instances where monitoring plan changes are submitted after the emissions and
quality assurance data have already been accepted by EPA.

Response 1: The Office of Air and Radiation agrees with this recommendation. As OIG
acknowledged in its report, CAMD has already addressed this issue by implementing a
post-submission data check that is run at the end of each reporting period. The new check
identifies any monitoring plan submissions containing changes to monitoring span records
that occur prior to the current emissions reporting period. If any changes were made, the
check recalculates quality assurance tests that were submitted prior to the span change and
verifies the pass/fail status of each test. If the status of any test changes, CAMD analysts
will contact the affected facility and request the correction and resubmission of the
impacted data. As of February 2019, CAMD had ensured that the discrepancies in the data
used in OIG's review were resolved and resubmitted.

In the long term, CAMD will implement an additional check in the ECMPS forcing
retroactive span record changes to require the reevaluation and resubmission of any affected
quality assurance tests and hourly emissions records. CAMD has initiated the process of re-
engineering ECMPS. In order to minimize additional expenditures on the current version of
ECMPS, CAMD will focus on adding the check to the new version of ECMPS.

Planned Completion Date: The post-submission ad-hoc data check will be in operation by
the end of Q2 2019. The new ECMPS with the check will be complete by Q1 2025.

Recommendation 2: Develop and distribute to state and local agencies a streamlined field
audit process that agencies can use during full compliance evaluations or other onsite visits
at facilities.

Response 2: The Office of Air and Radiation agrees with this recommendation. Field audits
are an important component of the CEMS quality assurance process. In consultation with
the states, CAMD has developed a streamlined audit procedure that is included in the
revised Field Audit Manual. In addition, CAMD has developed an easy-to-use spreadsheet
tool that can be populated with data reported by the facility. This tool will help auditors
prepare for an audit and help them quickly identify potential areas for inquiry. The
streamlined audit procedure and spreadsheet tool are currently going through peer review
by the states.

Planned Completion Date: Both the revised Field Audit Manual with the streamlined audit
procedure and the audit spreadsheet tool will be published by the end of Q3 2019.

If you have any questions regarding this response, please contact Jeremy Schreifels,
CAMD, at (202) 343-9127.
Distribution
The Administrator
Associate Deputy Administrator and Chief of Operations
Chief of Staff
Deputy Chief of Staff
Agency Follow-Up Official (the CFO)
Agency Follow-Up Coordinator
General Counsel
Associate Administrator for Congressional and Intergovernmental Relations
Associate Administrator for Public Affairs
Director, Office of Continuous Improvement, Office of the Administrator
Assistant Administrator for Air and Radiation
Principal Deputy Assistant Administrator for Air and Radiation
Deputy Assistant Administrator for Air and Radiation
Senior Advisor to the Assistant Administrator for Air and Radiation
Director, Office of Atmospheric Programs, Office of Air and Radiation
Audit Follow-Up Coordinator, Office of the Administrator
Audit Follow-Up Coordinator, Office of Air and Radiation