Guidance on Vehicle Inspection and
Maintenance (I/M) Test Data Statistics
as Part of Annual I/M Reporting
Requirements
United States
Environmental Protection Agency

Transportation and Climate Division
Office of Transportation and Air Quality
U.S. Environmental Protection Agency
EPA-420-B-20-033
May 2020

1. Purpose of Guidance
The Environmental Protection Agency (EPA) is providing this guidance to help states and local
agencies develop annual reports for vehicle emission inspection and maintenance (I/M)
programs. Annual reports are required for all federally-mandated I/M programs, and these state
reports cover many aspects of I/M program(s), including the number of vehicles tested and the
number and percentage of passed, failed, and/or waived vehicles by model year. These reports
contain data that can be useful in assessing the effectiveness of an I/M program.
This guidance provides clarifications and examples for general and specific annual I/M reporting
requirements, and it reflects the current best practices in use by I/M programs across the United
States. In addition, this guidance was written as a result of a 2018 audit by the EPA Office of
Inspector General (OIG) regarding EPA oversight.1 This guidance is not intended to cover all of
the requirements that need to be met in the annual I/M reports, but rather to focus on those where
additional clarification could improve implementation. Greater consistency in the accounting
and reporting of annual I/M data will aid I/M program managers and EPA in monitoring and
evaluating I/M program effectiveness as well as identifying local, state, and national trends.
2. What are the Clean Air Act and regulatory requirements?
The 1990 Amendments to the Clean Air Act (CAA) required I/M programs for certain areas
across the country based upon various criteria, such as air quality status, population, and/or
geographic location. The CAA established two performance levels of I/M programs: "basic" I/M
for ozone nonattainment areas classified as moderate, and "enhanced" I/M. Pursuant to CAA
sections 182, 184 and 187, enhanced I/M programs are mandated in the following areas:
•	All serious or worse ozone nonattainment areas that had a 1980 urban population of
200,000 or more;
•	Metropolitan statistical areas with a 1990 population of 100,000 or more in the Ozone
Transport Region (regardless of their air quality classification); and
•	All moderate or worse CO nonattainment areas with a design value greater than 12.7 parts
per million (ppm) at the time of classification that had a 1980 urban population of 200,000
or more.
EPA promulgated the original I/M rule in 1992, and EPA has since amended the rule several
times. The I/M rule establishes the technical, procedural and administrative requirements to be
met by basic and enhanced I/M programs.
1 "Collecting Additional Performance Data from States Could Help EPA Better Assess the Effectiveness of Vehicle
Inspection and Maintenance Programs" (Report No. OPE-FY17-0018, September 25, 2018). Complete report
available at: www.epa.gov/office-inspector-general/report-collecting-additional-performance-data-states-would-help-epa-better.
States with basic and/or enhanced I/M programs are required to submit reports every July to their
EPA Regional Office to satisfy reporting requirements of the I/M rule, 40 CFR 51 Subpart S.
The introductory paragraph to Section 51.366 - Data analysis and reporting includes the
following introductory sentence:
Data analysis and reporting are required to allow for monitoring and evaluation of the
program by program management and EPA, and shall provide information regarding the
types of program activities performed and their final outcomes, including summary
statistics and effectiveness evaluations of the enforcement mechanism, the quality
assurance system, the quality control program, and the testing element.
The annual I/M reporting requirement set forth in the I/M rule calls for four types of reports:
1)	Test Data Report (section 51.366(a));
2)	Quality Assurance Report (section 51.366(b));
3)	Quality Control Report (section 51.366(c)); and
4)	Enforcement Report (section 51.366(d)).
Typically, these four reports are combined into one annual I/M report that is developed and
submitted to EPA by the state air agency. As a result, the annual I/M report covers a wide range
of statistics from the operating program, including the number of vehicles tested, the number and
percentage of passed, failed, and/or waived vehicles by model year, the number of stations and
inspectors operating in the program area, the number of audits conducted, and enforcement
actions taken. This guidance provides general information about the annual I/M reporting
requirements as well as specific clarifications on certain terms and statistics relevant to the Test
Data Report section of the annual report. As noted above, this guidance is not intended to cover
all of the requirements that need to be met in the annual I/M reports, but rather to focus on those
that need additional clarification.
3. What OIG recommendation is addressed by this guidance?
In 2006, the EPA OIG released a report of an audit regarding EPA's oversight of I/M Programs
nationally.2 One of the findings of this report was that EPA's efforts to oversee I/M programs
were hampered by the failure of many states to submit the I/M summary data reports required
under 40 CFR 51.366. As a result of the report, EPA's Office of Transportation and Air Quality
(OTAQ) increased coordination with the Regional Offices through monthly conference calls and
an annual workshop to facilitate the I/M programs' annual reporting process and to evaluate
national I/M trend data.
As a follow-up, the OIG conducted another audit in 2018. The final report entitled, Collecting
Additional Performance Data from States Could Help EPA Better Assess the Effectiveness of
2 "EPA's Oversight of the Vehicle Inspection and Maintenance Program Needs Improvement" (Report No. 2007-P-00001, October 5, 2006). Complete report available at: www.epa.gov/sites/production/files/2015-11/documents/20081005-2007-p-00001.pdf.
Vehicle Inspection and Maintenance Programs (OPE-FY17-0018)3 concluded that progress in
the annual I/M reporting process had been made but that some of the test statistics were being
reported inconsistently:
"The agency strengthened its oversight of these annual reports since we issued our
report on the vehicle inspection and maintenance program in 2006. However, further
improvements should be made. "4
The OIG audit made several recommendations to EPA's Office of Air and Radiation (OAR) for
assuring consistent and effective implementation of I/M programs. The complete list of the
OIG's recommendations may be found in Appendix A of the OIG report. Recommendation #5
of the OIG's 2018 report addressed consistent and accurate reporting of annual I/M program test
data and statistics, and OAR responded that guidance would be issued:
Recommendation 5: Develop and implement guidance on the calculation of individual
test statistics in state reports, in order to provide consistency in state reports across
regions.
Response 5: OAR agrees with this recommendation and will respond by directing OTAQ
to issue guidance clarifying how program statistics such as the rates of vehicle failures,
waivers, and disappearing vehicles should be calculated.
To satisfy this recommendation, OTAQ worked closely with the EPA Regional Offices to
develop this guidance document so that states and areas that submit annual I/M reports can do so
in an efficient and consistent manner.
4. General Guidance for I/M Annual Reports
This section provides a general overview for state and local I/M program agencies to use when
developing the Test Data section of their annual I/M report.
a. I/M Test Types
Some of the test data reporting of section 51.366(a) requires the statistics to be differentiated by
test type. The requirements and procedures for the various tests applicable to I/M programs may
be found in 40 CFR 51.357 Test procedures and standards and in the Appendices of 40 CFR 51
Subpart S. For the purposes of reporting these statistics, I/M test types can be generally
classified into three categories:
o Tailpipe Tests - This category includes all types of I/M testing that use an emissions
gas analyzer connected to the vehicle's tailpipe to analyze the exhaust emissions
under certain defined engine load conditions to determine the pass/fail outcome based
on established thresholds or cutpoints. Tailpipe testing encompasses both idle,
3	This report was released on September 25, 2018 and is available at: www.epa.gov/office-inspector-general/report-
collecting-additional-performance-data-states-would-help-epa-better.
4	Ibid., 'At a Glance' preface.
steady-state and transient testing. Examples include Acceleration Simulation Mode
(ASM), Two Speed Idle (TSI), IM240, and BAR-90/97 tests. With only a few
program exceptions, tailpipe tests are generally performed on model year 1995 and
older gasoline vehicles.
o OBD Tests - This category of I/M tests applies to the on-board diagnostic (OBD)
system checks on model year 1996 and newer gasoline vehicles.
o Alternative Tests - This category includes other tests that are performed as part of an
I/M program including gas cap pressure tests, and visual/anti-tampering inspections.
b.	Reporting Period
EPA regulations require that annual I/M reports are submitted to the I/M program's
corresponding EPA Regional Office by July of the year following the calendar year in which the
I/M tests were performed pursuant to sections 51.366(a), (b), (c), and (d) of the I/M rule. For
these reporting purposes, a vehicle's calendar year test cycle may extend up to four months into
the next calendar year based on the vehicle's compliance deadline. For example, if a given
vehicle requires I/M program compliance by the end of November, the outcome of that vehicle,
via tests or waivers, may need to be tracked through the end of March of the following calendar
year.
The compliance-tracking timeframe of four months beyond a vehicle's applicable compliance
deadline is a relevant and necessary performance indicator that is consistent with the timeframe
established for I/M programs with a computer matching enforcement mechanism pursuant to 40
CFR 51.366(d)(3)(i) and (iii).
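As a rough illustration of this tracking window, the following sketch (a hypothetical helper, not part of any program's inspection software) computes the last date a vehicle's outcome would be tracked, four months after its compliance deadline:

```python
from datetime import date

def tracking_end(compliance_deadline: date, months: int = 4) -> date:
    """Last date a vehicle's outcome is tracked for annual reporting.

    Simple month arithmetic; assumes the deadline's day-of-month exists
    in the target month (a deadline on the 31st may need adjustment).
    """
    month_index = compliance_deadline.month - 1 + months
    year = compliance_deadline.year + month_index // 12
    month = month_index % 12 + 1
    return date(year, month, compliance_deadline.day)

# A vehicle due by the end of November is tracked into late March of the
# following calendar year, as in the example above.
print(tracking_end(date(2023, 11, 30)))  # 2024-03-30
```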
c.	Separate Statistic Tracking (for different testing schemes and frequencies)
If an I/M program includes subsets of vehicles that are tested at a different frequency than other
vehicles in the program (for example, fleet vehicles tested quarterly vs. non-fleet vehicles tested
annually) the corresponding statistics should be broken out separately in the annual report. In
addition, if there are two or more I/M program areas in a given state that are substantively
different from one another (i.e., they cover different vehicle classes or model years, use different
tests, or test at different frequencies), each area's statistics should also be reported separately.
Similarly, any I/M program that conducts a remote-sensing vehicle emitter profiling program
such as Clean Screening that excepts vehicles from their normal periodic emission test as a result
of an in-use remote-sensing survey should track and report these excepted vehicles separately.5
5 'Excepted vehicles' is used here distinctly for vehicles that comply via an I/M program's Clean Screen testing
scheme to distinguish these vehicles from exempted vehicles which are non-subject vehicles or rather, those vehicles
that do not require an emissions inspection due to any of the I/M program's exemption criteria, such as fuel type,
vehicle age or Gross Vehicle Weight Rating (GVWR).
5. Specific Guidance for Selected I/M Test Data Statistics
This section of the guidance covers specific terms that OTAQ and the Regional Offices have
identified as needing further clarification, based on EPA's reviews of state annual I/M reports.
This section also outlines best practices for deriving selected Test Data section statistics under
40 CFR 51.366(a)(1) and (2). Test data in section 51.366(a)(2) are required to be broken out by
model year and vehicle type, and reported as both the number and percentage of vehicles.
Figure 1 is provided below as a general reference for the remainder of this section. This figure
is intended to illustrate the general process and possible results for an I/M tested vehicle:
[Figure 1 (flowchart): A subject vehicle proceeds to a test attempt, which can result in a pass, a
fail, or an inconclusive result (rejected, voided). Failed and inconclusive attempts can lead to a
retest (return to test attempt). Final outcomes are a compliance certificate (pass), a waiver, no
longer subject (scrapped or sold outside area), or no known final outcome. Exempt vehicles fall
outside the tested population.]
Figure 1 - Overview of Possible I/M Test Results for an I/M Program
Although Figure 1 does not include the details for implementing I/M testing, it does provide
context for the relationship between the possible outcomes of I/M testing and the test data
statistics discussed below.
a. Number of vehicles tested by model year and vehicle type
Section 51.366(a)(1) requires the annual I/M report to include the number of vehicles tested by
model year and vehicle type. This statistic does not represent total tests performed but rather
counts the number of individual vehicles tested. Vehicles tested multiple times (or by more than
one test type) are to be counted only once for this statistic. In other words, every vehicle
included in this tally should have a unique Vehicle Identification Number (VIN).
Consider, for example, a car which undergoes I/M testing for registration purposes in March. A
few months later that same car is sold and tested again for change-of-ownership purposes in
December of that same year. This vehicle is counted only once toward the number of vehicles
tested.
Unique VIN accounting prevents reporting errors due to the double-counting of a vehicle
undergoing more than one test during the reporting period. A VIN occurring in this dataset more
than once for a given calendar year is an indication of double-counting and should be corrected
prior to including the statistic in the annual I/M report.
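The unique-VIN accounting described above can be sketched as follows (the record fields and VINs are illustrative assumptions, not an actual inspection database layout):

```python
# Count vehicles tested by keeping only one record per unique VIN for the
# reporting year, regardless of how many times each vehicle was tested.
test_records = [
    {"vin": "1HGCM82633A004352", "test_date": "2023-03-10", "reason": "registration"},
    {"vin": "1HGCM82633A004352", "test_date": "2023-12-02", "reason": "ownership change"},
    {"vin": "2T1BURHE5JC123456", "test_date": "2023-06-21", "reason": "registration"},
]

unique_vins = {rec["vin"] for rec in test_records}
print(len(unique_vins))  # 2 vehicles tested, even though 3 tests occurred
```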
In addition, every subject vehicle presented for testing and undergoing a test attempt should be
counted toward this tally even if the vehicle's only test attempt is incomplete or yields an
inconclusive result. For example, a vehicle that does not return for a retest after being rejected
(e.g., because the vehicle's OBD system is not yet meeting the program's readiness monitor
criteria) should be counted as a tested vehicle. Incomplete or inconclusive test attempts include
voided, aborted or rejected attempts, which are discussed in greater detail next.
b. Number of vehicles failing initially, per test type
Section 51.366(a)(2)(i) requires the annual I/M report to include the number of vehicles failing
initially per test type by model year and vehicle type. This statistic equals the number of subject
vehicles failing their first completed I/M test of that particular test type during the reporting
period. For these vehicles that fail the initial inspection, only the initial inspection of that test
type is to be included in the final tally; retests are not to be included in this statistic.
Vehicles tested multiple times can be counted no more than once per test type. Thus, every
vehicle included in this tally for a given test type should have a unique VIN. For purposes of
compliance with an I/M program, a vehicle will typically only receive either an OBD or tailpipe
test (but not both) based on the vehicle's model year.
Only a complete test performed in its entirety6 and providing a conclusive failing result should
be considered as an initially failing test for these reporting purposes. Voided,7 aborted8 or test
attempts which end due to invalid conditions9 should not be counted as initially failing tests.
6	However, pursuant to 40 CFR 51.357(a)(3) and (11), a tailpipe test in which the drive cycle is ended early as the
result of an approved fast pass or fast fail algorithm is considered an official completed test.
7	For tailpipe testing, 'Void test conditions' are described at sections (I)(3), (II)(3), (III)(3), (IV)(3), (V)(3) and
(VI)(3) of Appendix B to 40 CFR 51 Subpart S.
8	The term 'aborted test attempt' encompasses a variety of situations in which an I/M test was started or logged into
the vehicle inspection software, but no pass/fail result was recorded for whatever reason. Examples of aborted test
attempts include the software timing-out (e.g., due to lack of inspector activity) or data-entry errors preventing
proper VIN-decoding or a valid compliance certificate/sticker from being issued.
9	40 CFR 51.357(a)(3).
An initial OBD test attempt that does not meet the required criteria for completed readiness
monitors is considered a rejection.10 In addition, vehicles shall be rejected from testing if the
exhaust system is missing or leaking, or if the vehicle is in an unsafe condition for testing.11
Vehicles rejected from testing are not to be counted as having completed (and thus failed) that
test type with one exception: If upon returning for a follow-up OBD test attempt, the vehicle still
does not meet the readiness criteria, this second attempt is considered an initial test and the
vehicle shall be failed.12 The number of vehicles with incomplete readiness status for any
module supported by OBD systems is reported separately in the annual report pursuant to section
51.366(a)(2)(xxiii).
In other words, although inspection station software and I/M program vehicle inspection
databases should tally all attempted I/M tests, a vehicle that is initially rejected, or with an
inconclusive result, should not be counted as initially failing. However, that vehicle is counted
as a 'tested vehicle' as explained above in Section 5.a. Number of vehicles tested by model year
and vehicle type.
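A minimal sketch of these counting rules, assuming a simplified chronological record layout (the field names and result codes are illustrative, not from any actual inspection software):

```python
def initial_failures(records):
    """Tally initial failures from chronological test-attempt records.

    Only the first *completed* test of each (vin, test_type) pair counts;
    voided, aborted, and rejected attempts are inconclusive, except that a
    second OBD readiness rejection is treated as an initial failure, per
    40 CFR 85.2222(c)(1).
    """
    seen = set()        # (vin, test_type) pairs already resolved
    rejections = {}     # OBD readiness rejections counted per VIN
    failures = []
    for rec in records:
        key = (rec["vin"], rec["test_type"])
        if key in seen:
            continue                      # retests never count here
        result = rec["result"]
        if rec["test_type"] == "OBD" and result == "rejected":
            rejections[rec["vin"]] = rejections.get(rec["vin"], 0) + 1
            if rejections[rec["vin"]] < 2:
                continue                  # first rejection: not a completed test
            result = "fail"               # second rejection counts as initial fail
        if result in ("voided", "aborted", "rejected"):
            continue                      # inconclusive attempts never count
        seen.add(key)
        if result == "fail":
            failures.append(key)
    return failures

records = [
    {"vin": "VIN1", "test_type": "OBD", "result": "rejected"},
    {"vin": "VIN1", "test_type": "OBD", "result": "rejected"},   # 2nd rejection
    {"vin": "VIN2", "test_type": "tailpipe", "result": "fail"},
    {"vin": "VIN2", "test_type": "tailpipe", "result": "pass"},  # retest ignored
]
print(initial_failures(records))  # [('VIN1', 'OBD'), ('VIN2', 'tailpipe')]
```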
But what constitutes a failing vehicle for the purposes of annual I/M reporting?
•	For tailpipe testing, a failing vehicle is one that completed the corresponding test
procedure but was not in compliance because its emission levels were above the
program's established cutpoints pursuant to the applicable test procedures and pass/fail
criteria found in Appendices B and C of 40 CFR 51 Subpart S.
•	For OBD testing, the pass/fail criteria may be found at 40 CFR 85.2222 Onboard
diagnostic test procedures and 40 CFR 85.2207 Onboard diagnostic test standards.
Examples of failing conditions for OBD tests include — a missing, tampered, or
inoperable vehicle connector, a Diagnostic Trouble Code (DTC) count of one or more
with a Malfunction Indicator Light (MIL) commanded on, and a failed MIL bulb-check.13
There are four subsets of initially failing vehicles:
•	Vehicles passing upon retest;
•	vehicles receiving a waiver;
•	vehicles with no known final outcome; and
•	vehicles no longer subject (e.g., scrapped or sold out of the area).
c. Number of initially failed vehicles receiving a waiver
Section 51.366(a)(2)(v) requires the annual I/M report to include, by model year and vehicle
type, the number of initially failed vehicles receiving a waiver. This waiver count should equal
the number of vehicles in the subset of vehicles failing either the tailpipe or OBD test, that do not
pass either inspection upon one or more retests but instead finish the inspection process by
receiving a waiver. The criteria for issuing a waiver can be found at 40 CFR 51.360 Waivers and
10	40 CFR 51.357(a)(5).
11	Ibid.
12	40 CFR 85.2222(c)(1).
13	Often referred to as key-on/engine-off test.
compliance via diagnostic inspection. This statistic is not influenced in any way by the number
of retests that occurred prior to the issuance of the waiver.14 Vehicles tested multiple times (or by
more than one test type) are to be counted no more than once for this statistic. Thus, every
vehicle included in this tally should have a unique VIN.
For example, consider a vehicle that undergoes and fails an IM240 test. For these reporting
purposes, that vehicle now counts as a tested vehicle and as an initially failing vehicle under the
tailpipe test type. The motorist then performs some maintenance on the vehicle and returns for a
retest. The vehicle fails the retest. Now the motorist takes it to a recognized repair technician
and spends more than the I/M program's waiver cost threshold on qualified repairs.15 The
vehicle then fails its third test, and after the other applicable conditions for a waiver are met, a
waiver is issued. In this example, this vehicle is counted as both initially failing the tailpipe test
type as well as in the number of failed vehicles receiving a waiver.
The number of initially failing vehicles receiving a waiver should only include vehicles that
initially failed for either the tailpipe or OBD test. It should not include vehicles that only fail an
alternative test like a tampering inspection or gas cap check (though vehicles which fail one or
more of these alternative tests in addition to a tailpipe or OBD test should be included if the
waiver is issued for qualifying repairs completed for the cause of the OBD or tailpipe test
failure). The reason for excluding vehicles that only fail an alternative test (such as a tampering
or gas cap inspection) from the waiver figure is because vehicles that only fail for these reasons
should not qualify for a waiver. Specifically, the cost to correct vehicle tampering cannot be
counted toward meeting the applicable waiver cost limit,16 while the cost of replacing a gas cap
should not come anywhere close to exceeding even the lowest waiver cost threshold.
Per 40 CFR 51.360(a)(9), a time extension, not to exceed the period of the inspection frequency,
may be granted to obtain needed repairs on a vehicle in the case of economic hardship when
waiver requirements have not been met. For these reporting purposes, the number of vehicles
receiving a waiver should include the number of vehicles receiving a time extension. However,
time extensions are to be tracked separately and are also reported annually as part of the
Enforcement Report.17
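The waiver tally described above can be sketched with set arithmetic (the VINs and record layout here are assumptions for illustration): each waivered VIN is counted once, and only when it appears among the vehicles that initially failed an OBD or tailpipe test.

```python
# VINs of vehicles that initially failed an OBD or tailpipe test this year.
initial_failed_obd_or_tailpipe = {"VIN1", "VIN2", "VIN3"}

waiver_records = [
    {"vin": "VIN1"},  # failed IM240, repaired above cost threshold, waived
    {"vin": "VIN1"},  # duplicate record for the same vehicle: counted once
    {"vin": "VIN9"},  # failed only a gas cap check: does not qualify
]

# Unique waivered VINs, restricted to OBD/tailpipe initial failures.
waivered = {w["vin"] for w in waiver_records} & initial_failed_obd_or_tailpipe
print(len(waivered))  # 1
```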
d. Percentage of initially failed vehicles receiving a waiver (i.e., waiver rate)
Section 51.366(a)(2)(v) also requires the reporting of the percentage of initially failed vehicles
receiving a waiver by model year and vehicle type. This percentage is generally referred to as
the waiver rate. It is important to note that some of the confusion in calculating the waiver rate is
due to the use of total test counts as the denominator, instead of the counts of vehicles initially
failing either tailpipe or OBD testing. In other words, the waiver rate should be calculated as the
number of initially failed (OBD or tailpipe tested) vehicles receiving a waiver (as covered above
in Section 5.c. Number of initially failed vehicles receiving a waiver) divided by the total number
14	An initially failing vehicle must always undergo a retest prior to the issuance of a waiver (40 CFR 51.360(a)(1)).
15	The minimum cost threshold for waivers in basic and enhanced I/M programs may be found at 40 CFR
51.360(a)(6) and (7) respectively.
16	40 CFR 51.360(a)(3).
17	40 CFR 51.366(d)(1)(v).
of vehicles initially failing the tailpipe or OBD test types (i.e., the sum of the tailpipe and OBD
test figures obtained from Section 5.b. Number of vehicles failing initially, per test type). The
waiver rate can therefore be calculated by the equation:
Waiver Rate (%) = [# Waivered Vehicles / (# Initial OBD Failed Veh. + # Initial Tailpipe Failed Veh.)] × 100%
For this calculation, the number of initially failing vehicles should not include vehicles failing
alternative tests.
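The waiver rate formula above can be expressed directly in code (the counts in the usage example are purely illustrative):

```python
def waiver_rate(waivered, initial_obd_failed, initial_tailpipe_failed):
    """Waiver rate: waivered vehicles over the sum of initial OBD and
    tailpipe failures, as a percentage. Alternative-test failures are
    excluded from both numerator and denominator."""
    denominator = initial_obd_failed + initial_tailpipe_failed
    if denominator == 0:
        return 0.0  # no initial failures: rate is trivially zero
    return 100.0 * waivered / denominator

print(waiver_rate(waivered=150, initial_obd_failed=4000,
                  initial_tailpipe_failed=1000))  # 3.0 (percent)
```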
e. Number of vehicles with no known final outcome
Section 51.366(a)(2)(vi) requires the annual I/M report to include the number of vehicles with no
known final outcome (regardless of reason). This reported statistic should equal the sum of the
number of subject vehicles in the subsets of vehicles initially failing tailpipe or OBD tests that:
do not pass a retest; do not receive a waiver; get sold or move outside the program area; are
scrapped; or are otherwise prevented from operating within the program area. Note that here
again, for consistency, and because alternative testing varies greatly or is not conducted in many
jurisdictions, the no known final outcome figure should not include vehicles failing alternative
tests.
For these reporting purposes, a vehicle may be tracked up to four months beyond its compliance
deadline in determining whether it is a no known final outcome vehicle. Similarly, if a given
vehicle's initial failing test occurred before the end of the I/M report's calendar year, the four-
month timeframe for compliance may extend into the next calendar year as discussed above in
Section 4.b. Reporting Period. However, a specific unique VIN should not occur more than
once in this statistic.
I/M program agencies are encouraged to use registration/title audits, residency verifications,
third-party vehicle history data and other means to determine if a vehicle has legitimately been
moved/sold outside the I/M program area(s) or has been scrapped. Such vehicles should not be
included in the no known final outcome vehicle tally. For example, a motorist may have sold the
vehicle to a party outside the I/M program area after receiving a failing OBD or tailpipe test but
before four months have passed since the vehicle's registration (or emission sticker) expired. If
this vehicle's new registration can be tracked by the I/M reporting agency, the vehicle should not
be included in the no known final outcome vehicle statistic. However, it should be assumed that
failing vehicles with expired registrations (or emission stickers) beyond the four-month
compliance timeframe are no known final outcome vehicles unless the I/M reporting agency can
verify that it is no longer subject to the I/M program. Likewise, every effort should be made to
remove from this count exempt (or non-subject) vehicles that may have failed a test for whatever
reason despite not needing a passing I/M test for compliance in the corresponding area (due to
any of the program's exemption criteria, such as fuel type, vehicle age or GVWR). The same
applies to vehicles failing reciprocity tests; they should also be excluded from this calculation.18
18 Reciprocity tests are I/M inspections that some programs conduct as a courtesy to motorists from other states
requiring an emissions test who may be away from their home state for an extended period (e.g., out-of-state active
military or college students). See 40 CFR 51.356(a)(3).
f. Percentage of vehicles with no known final outcome (i.e., no known final outcome
rate)
Section 51.366(a)(2)(vi) also requires the reporting of the percentage of vehicles with no known
final outcome. This term is also referred to as the no known final outcome rate which can be
expressed as:
NKFO Rate (%) = [# NKFO Vehicles / (# Initial OBD Failed Veh. + # Initial Tailpipe Failed Veh.)] × 100%
This rate should be calculated as the number of initially failed vehicles with no known final
outcome (as covered above in Section 5.e. Number of vehicles with no known final outcome)
divided by the total number of vehicles initially failing the tailpipe or OBD test (i.e., the sum of
the tailpipe and OBD test figures obtained from Section 5.b. Number of vehicles failing initially,
per test type).
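Putting the pieces together, the rate can be computed from the counts described above; this sketch also removes vehicles verified as no longer subject before forming the numerator (all counts here are purely illustrative):

```python
def nkfo_rate(initial_failed, passed_retest, waivered, verified_no_longer_subject):
    """No-known-final-outcome rate as a percentage of initial OBD/tailpipe
    failures. Vehicles verified as sold/moved out of area or scrapped are
    excluded from the NKFO count, per the guidance above."""
    nkfo = initial_failed - passed_retest - waivered - verified_no_longer_subject
    if initial_failed == 0:
        return 0.0
    return 100.0 * nkfo / initial_failed

print(nkfo_rate(initial_failed=5000, passed_retest=4200,
                waivered=150, verified_no_longer_subject=250))  # 8.0 (percent)
```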
6.	Who can I contact for more information on this guidance?
For questions concerning a particular state or I/M program area, contact the I/M coordinator
through the primary Mobile Source Contact at your EPA Regional Office. A listing of Regional
Mobile Source Contacts is available at: www.epa.gov/transportation-air-pollution-and-climate-change/office-transportation-and-air-quality-contacts-topic.
General questions about this guidance can be directed to Joe Winkelmann at EPA's Office of
Transportation and Air Quality: winkelmann.joseph@epa.gov.
Additional information regarding vehicle emission I/M programs can be found on EPA's website
at: www.epa.gov/state-and-local-transportation/vehicle-emissions-inspection-and-maintenance.
Another good resource for I/M programs is the EPA-supported National OBD Clearinghouse:
www.obdclearinghouse.com/.
7.	Does this guidance create any new requirements?
No. This guidance is based on CAA requirements and existing associated regulations, and does not
create any new requirements. The CAA and EPA's I/M rule at 40 CFR Part 51, Subpart S
contain legally binding requirements. This document is not a substitute for those provisions or
regulations, nor is it a regulation itself. Thus, it does not impose legally binding requirements on
EPA, states, or the regulated community, and may not apply to a particular situation based upon
the circumstances. EPA retains the discretion to consider and adopt approaches on a case-by-
case basis that may differ from this guidance but still comply with the statute and applicable
regulations. This guidance may be revised periodically without an opportunity for public
comment. This guidance is for I/M annual reporting and statistical purposes only and is not
intended to suggest a change to the testing regimen, or regulatory compliance criteria of an I/M
program.