
Recommended Procedures for Development of
Emissions Factors and Use of the WebFIRE
Database


EPA-453/B-24-001
August 2024

Recommended Procedures for Development of Emissions Factors and Use of the WebFIRE Database

U.S. Environmental Protection Agency
Office of Air Quality Planning and Standards
Sector Policies and Programs Division
Research Triangle Park, NC


Disclaimer

This document has been approved for publication by the Office of Air Quality Planning and
Standards (OAQPS), U.S. Environmental Protection Agency. Mention of trade names or
commercial products in this document does not constitute endorsement by the agency.

This document provides guidance on how the EPA intends to develop emissions factors and
how regulated parties and the general public can use the WebFIRE database. This document is
not intended, nor can it be relied upon, to affect legal rights or obligations. The EPA may decide
to follow the guidance provided in this document, or to act at variance with the guidance based
on its analysis of the specific facts presented.

TABLE OF CONTENTS

Section	Page No.

Section 1.0	What is the Purpose of This Document?	1-1

Section 2.0	What is an Emissions Factor?	2-1

2.1	Emissions Data	2-2

2.2	Activity Data	2-3

Section 3.0	How Have We Historically Developed Emissions Factors?	3-1

Section 4.0	How are Emissions Factors Used?	4-1

Section 5.0	What are EPA's Revised Procedures for Developing Emissions Factors?	5-1

5.1	Data Collection	5-1

5.2	Test Data Evaluation	5-4

5.3	Detection Limit Procedures for Calculating Test Run Averages	5-5

5.4	Grouping of Candidate Data and Identification of Outliers	5-6

5.5	Emissions Factor Derivation and Quality Assessment	5-7

Section 6.0	EPA's Interactive Database for the Emissions Factors Program - What is WebFIRE?	6-1

6.1	What is WebFIRE?	6-1

6.2	How is WebFIRE Used?	6-1

6.3	Who Uses WebFIRE?	6-3

6.4	How Does WebFIRE Improve Emissions Factor Identification and Development?	6-4

Section 7.0	How Do I Find an Emissions Factor?	7-1

7.1	How Do I Identify and Retrieve an Emissions Factor from WebFIRE?	7-1

7.2	How Do I Obtain Background Information for My Selected Emissions Factor?	7-4

Section 8.0	What Parameters Should I Consider When Using or Deriving an Emissions Factor?	8-1

8.1	Source Category and Process Considerations	8-1

8.2	Control Device Considerations	8-3

8.3	Pollutant Test Method Considerations	8-3

Section 9.0	How Do I Develop a User-Defined Emissions Factor?	9-1

9.1	How Do I Use WebFIRE to Create a User-Defined Emissions Factor?	9-1

9.2	What are the Potential Impacts Associated with Applying a User-defined Emissions Factor?	9-4

Section 10.0 How Do I Submit Data to WebFIRE?	10-1

10.1	How Are Emissions Tests Documented?	10-3

10.2	What are the CDX and CEDRI and What are their Roles in Submitting Data to WebFIRE?	10-6

Section 11.0	What is the Data Review and Public Participation Process for Emissions Factor Development?	11-1

Appendix A - Procedures for Determining Individual Test Report Quality Ratings
Appendix B - Procedures for Handling Test Data That are Below the Method Detection Limits
Appendix C - Procedures for Determining Statistical Outliers

Appendix D - Emissions Factor Development and Data Quality Characterization Procedures

Appendix E - Statistical Procedures for Determining Valid Data Combinations

Appendix F - Source Classification Codes for Source Categories Containing 15 or Fewer Sources

LIST OF FIGURES

Figures	Page No.

Figure 5-1. EPA's Revised Procedures for Developing Emissions Factors	5-2

Figure 6-1. WebFIRE Overview	6-2

Figure 7-1. Procedures for Retrieving Emissions Factors from WebFIRE	7-3

Figure 9-1. Emissions Factor Derivation in WebFIRE	9-3

Figure 10-1. Typical Work Flow When Using the ERT	10-5

Figure 11-1. Overview of the WebFIRE Public Participation and Emissions Factor Development Process	11-2

Figure C-1. Procedures to Identify Data Outliers in a Candidate Data Set	C-2

Figure D-1. Emissions Factor Representativeness Areas for Source Categories Containing More Than 15 Sources	D-4

Figure D-2. Emissions Factor Representativeness Areas for Source Categories Containing 15 or Fewer Sources	D-5

Figure D-3. Plot of CTR and N Data from Table D-3	D-9

Figure D-4. Plot of Selected Data from Table D-6	D-11

LIST OF TABLES

Tables	Page No.

Table 7-1. Data Fields Reported by WebFIRE Emissions Factor Search	7-4

Table A-1. Test Report Quality Rating Tool	A-5

Table B-1. Summary of WebFIRE Procedures for Handling ADL, BDL, and DLL Test Data in Calculating a Test Average from Run-Level Data	B-2

Table D-1. FQI and Boundary Line Equations	D-2

Table D-2. Individual Test Data and Various Characteristics	D-7

Table D-3. Individual Test Data Values Selected for Developing an Emissions Factor for a Source Category Containing 15 or Fewer Sources	D-9

Table D-4. Individual Test Data and Various Characteristics for a Source Category with 15 or Fewer Sources	D-10

Table E-1. Emissions Factor Characteristics for Group A and B	E-2

Table E-2. Emissions Factor Characteristics for Group C and D	E-3


LIST OF ACRONYMS

Acronym	Term

ADL	Above Detection Limit
AFS	Air Facility System
AFSEF	AIRS Facility Subsystem Emission Factor
AIRS	Aerometric Information Retrieval System
AP-42	Compilation of Air Pollutant Emission Factors, Volume I: Stationary Point and Area Sources
ASCII	American Standard Code for Information Interchange
ASTM	American Society for Testing and Materials
BDL	Below Minimum Detection Limit
CAA	Clean Air Act
CARB	California Air Resources Board
CAS	Chemical Abstracts Service
CBI	Confidential Business Information
CDX	Central Data Exchange
CEDRI	Compliance and Emissions Data Reporting Interface
CEMS	Continuous Emissions Monitoring System
CO	Carbon Monoxide
CO2	Carbon Dioxide
CROMERR	Cross-Media Electronic Reporting Regulation
CSV	Comma Separated Values
CTR	Composite Test Rating
DGM	Dry Gas Meter
DLL	Detection Level Limited
EIS	Emission Inventory System
EMC	Emission Measurement Center
EPA	Environmental Protection Agency
ERT	Electronic Reporting Tool
ESA	Electronic Signature Agreement
ESP	Electrostatic Precipitator
EST	Eastern Standard Time
FAQs	Frequently Asked Questions
FIRE	Factor Information Retrieval
FQI	Factor Quality Index
HAP	Hazardous Air Pollutants
HTML	Hypertext Markup Language
ITR	Individual Test Rating
L&E	Locating and Estimating
MACT	Maximum Achievable Control Technology
MDL	Minimum Detection Limit
MLE	Maximum Likelihood Estimator
NEI	National Emissions Inventory
NELAP	National Environmental Laboratory Accreditation Program
NESHAP	National Emission Standard for Hazardous Air Pollutants
NOx	Oxides of Nitrogen
NSPS	New Source Performance Standard
OAQPS	Office of Air Quality Planning and Standards
PDF	Portable Document Format
PDS	Project Data Set
PM	Particulate Matter
PM10	Particulate Matter with an aerodynamic diameter of 10 microns or less
QA	Quality Assurance
QA/QC	Quality Assurance/Quality Control
QI	Qualified Individual
RATA	Relative Accuracy Test Audit
SCC	Source Classification Code
SES	Source Evaluation Society
SO2	Sulfur Dioxide
STAC	Stack Testing Accreditation Council
THC	Total Hydrocarbons
TRI	Toxics Release Inventory
XATEF	Crosswalk/Air Toxics Emission Factor System
XML	Extensible Markup Language


Section 1.0
What is the Purpose of This Document?

This guidance document describes the procedures, data evaluation criteria, and
associated tools and data management systems that the U.S. Environmental Protection Agency
(EPA) recommends for developing air pollutant emissions factors for stationary emissions units
or processes. This document supersedes the previous EPA guidance document for emissions
factor development (Procedures for Preparing Emission Factor Documents (EPA-454/R-95-015,
November 1997)).

This document presents an introduction to emissions factors and provides the historical
background for how and why the EPA has developed emissions factors for stationary emissions
units or processes. This document also describes the approach and procedures recommended
by the EPA for developing new or revising existing emissions factors.

This document provides an overview of the EPA's WebFIRE - an online data storage and
emissions factor retrieval and development tool. Also discussed are the EPA's Electronic
Reporting Tool (ERT) and WebFIRE template spreadsheet that facilitate the development and
documentation of emissions test reports. In addition, this document presents procedures that
may be followed by individuals and entities submitting emissions data and related process data
to WebFIRE. Finally, this document provides an overview of the data review and public
participation process that the EPA follows when developing new or revised emissions factors.

This document is organized as follows:

Section	Contents

2.0	An overview of the characteristics that define an emissions factor.

3.0	A brief summary of the EPA's historical procedures used to develop emissions factors and the various support programs prepared by the agency.

4.0	A discussion of the various uses and limitations of emissions factors.

5.0	An overview of the agency's revised approach for developing EPA emissions factors.

6.0	An overview of WebFIRE, the EPA's online application for storage, retrieval and development of emissions factors.

7.0	The steps users can follow to identify and retrieve emissions factors from WebFIRE.

8.0	Considerations that should be evaluated when using or deriving emissions factors.

9.0	The procedures users can follow to develop a user-defined emissions factor from a collection of related data contained in WebFIRE.

10.0	The steps to follow to submit emissions and related process data to WebFIRE.

11.0	The process by which the public can participate in the periodic development of EPA's emissions factors.

This document also contains the following appendices:

Appendix	Contents

A	Procedures for determining individual test report quality ratings

B	Procedures for handling test data that are below the method detection limits

C	Procedures for determining statistical outliers

D	Emissions factor development and data quality characterization procedures

E	Statistical procedures for determining valid data combinations

F	Source classification codes for source categories containing 15 or fewer sources

Section 2.0
What is an Emissions Factor?

An emissions factor is used to estimate air pollutant emissions from a normally-
operating process or activity (e.g., fuel combustion, chemical production). An emissions factor
relates the quantity of pollutants released to the atmosphere from a process to a specific
activity associated with generating those emissions. For most application purposes, users
typically assume that an emissions factor represents the average emissions for all emitting
processes of similar design and characteristics (i.e., the emissions factor represents a
population average).

The simplest form of an emissions factor is a ratio of the mass of pollutant emitted per
unit of activity generating the emissions (e.g., pounds of particulate matter (PM) emitted per
ton of coal burned). Typically, emissions factors are used to estimate process emissions as
follows:

E = A x EF x [1 - (ER/100)]

Where:

E = emissions estimate,

A = activity rate,

EF = emissions factor, and

ER = overall emissions reduction achieved by controls (%).
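
For illustration, the sketch below applies this equation in Python. The activity rate, emissions factor, and control efficiency shown are hypothetical values, not values drawn from WebFIRE or AP-42.

    def estimate_emissions(activity_rate, emissions_factor, control_efficiency_pct=0.0):
        # E = A x EF x [1 - (ER/100)]
        return activity_rate * emissions_factor * (1.0 - control_efficiency_pct / 100.0)

    # Hypothetical example: 10,000 tons of coal burned per year, an emissions
    # factor of 10 lb PM per ton of coal, and controls achieving a 99% overall
    # PM reduction give 1,000 lb PM per year.
    pm_lb_per_year = estimate_emissions(10_000, 10.0, 99.0)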

Emissions factors for more complex processes or activities (e.g., paved and unpaved roads,
organic liquid storage tanks) are typically expressed using empirical equations. The empirical
equation relates independent variables to the source emissions and typically provides for
improved predictive accuracy when compared to a simple emissions factor. For example, the
following emissions factor for vehicles traveling on unpaved surfaces at industrial sites was
taken from the EPA's Compilation of Air Pollutant Emission Factors, Volume I: Stationary Point
and Area Sources (AP-42) (Fifth Edition, Section 13.2.2):

E = k (s/12)^a (W/3)^b

Where:

E =	particle size-specific emissions factor (pound/vehicle miles traveled),

k =	particle size multiplier (pound/vehicle miles traveled),

s =	surface material silt content (%),

a, b =	particle size-specific empirical constants, and

W =	mean vehicle weight (tons).
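
The sketch below evaluates this empirical equation. The constants k, a and b are tabulated in AP-42 by particle size; the numeric inputs shown are placeholders for illustration, not the tabulated AP-42 values.

    def unpaved_road_ef(k, s, w, a, b):
        # E = k * (s/12)^a * (W/3)^b, in pounds per vehicle mile traveled
        return k * (s / 12.0) ** a * (w / 3.0) ** b

    # Placeholder inputs: 8% surface silt content, 20-ton mean vehicle weight.
    ef_lb_per_vmt = unpaved_road_ef(k=1.5, s=8.0, w=20.0, a=0.9, b=0.45)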

2.1 Emissions Data

Typically, emissions data are obtained through direct measurement of releases from a
process or activity (i.e., a sample of the process emissions is collected and analyzed). The
emissions rate for the source, expressed in terms of mass of pollutant emitted per time unit
(e.g., pounds of PM per hour), is calculated as the arithmetic average of the available, quality-
assured test data. Depending on the sampling location and configuration of the process and
associated control devices (if any), emissions data can reflect controlled or uncontrolled
emissions.

Direct measurements of facility or process emissions are conducted for a variety of
reasons, such as to:

•	Characterize process emissions and/or control device performance,

•	Assess the effects of changes in process or control device operation on emissions, and

•	Demonstrate compliance with federal, state, local, or tribal air regulations.

Emissions testing may also be conducted for purposes such as conducting relative accuracy test
audits (RATAs), linearity checks (i.e., checks of an instrument's ability to provide consistent
sensitivity throughout its measurement range) and routine calibrations of continuous emissions
monitoring system (CEMS) equipment.

The emissions rate for a specific process can also be determined by using a mass
balance approach. In general, mass balances are appropriate for use in situations where the
mass of all the materials entering and exiting a process can be quantified. Using this mass
balance approach, pollutant emissions are calculated as the difference in process inputs and

outputs. For certain processes, a mass balance provides an easier and less expensive estimate
of emissions than would be obtained by direct measurement. For example, carbon dioxide
(CO2) emitted from a fuel combustion process can be estimated from the stoichiometric
relationship of the chemical reactants (i.e., carbon contained in the fuel and oxygen in the
combustion air), the amount of each reactant that is consumed in the combustion process and
the amount of carbon remaining in any residual material (e.g., ash). Although a mass balance
approach may be suitable for certain processes, this approach may not be appropriate to
estimate emissions from a process or activity in which the accuracy or uncertainty of the
quantities of input and output materials is a concern.
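
A minimal sketch of this mass balance, assuming for illustration that all fuel carbon not retained in the ash is oxidized to CO2 (the input quantities are hypothetical):

    CO2_PER_CARBON = 44.0 / 12.0  # stoichiometric ratio, lb CO2 per lb carbon

    def co2_from_combustion(fuel_lb, carbon_fraction, carbon_in_ash_lb):
        # Carbon in = carbon out: fuel carbon minus carbon remaining in the
        # ash is assumed to leave the stack as CO2.
        carbon_oxidized_lb = fuel_lb * carbon_fraction - carbon_in_ash_lb
        return carbon_oxidized_lb * CO2_PER_CARBON

    # Hypothetical example: 2,000 lb of fuel at 70% carbon with 15 lb of
    # carbon retained in the ash yields roughly 5,080 lb of CO2.
    co2_lb = co2_from_combustion(2000.0, 0.70, 15.0)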

2.2 Activity Data

The composition and magnitude of emissions generated by a process unit are affected
by a variety of process parameters such as raw materials and fuels used; process operating
conditions; equipment configuration and age; and the skill and experience of process operators.
Activity data for use in developing emissions factors are the parameter(s) that directly influence
the quality and quantity of emissions from a process unit. Generally, activity data are collected
during an emissions test to verify that the process is operating at the desired production level
(e.g., to satisfy an operating permit emissions limit). Activity data are typically expressed either
in terms of a process input or output per time unit (e.g., gallons of oil burned per hour, tons of
cement produced per day). For example, the activity data for a PM emissions factor for
plywood manufacturing processes could be expressed in terms of the square feet of plywood
produced per day. For an emissions rate determined using a material balance approach, the
activity data would typically include one or more process parameters used in the material
balance.

Section 3.0

How Have We Historically Developed Emissions Factors?

To assist the EPA in carrying out its responsibilities under the Clean Air Act, the Agency
has developed methods with which to characterize and quantify air pollutant emissions from
processes and activities on a nationwide basis. Because of the large number and diversity of
emissions sources, developing national estimates based upon site-by-site emissions testing was
not feasible. Consequently, we developed criteria and non-criteria pollutant emissions factors
for certain industrial processes or source categories for use in preparing emissions inventories.
These emissions factors were based upon emissions test data, material balance calculations,
modeling and engineering judgment.

In 1972, the EPA's Office of Air Quality Planning and Standards (OAQPS) published the
first document containing the EPA's emissions factors and supporting documentation
(Compilation of Air Pollutant Emission Factors, Volume I: Stationary Point and Area Sources (AP-
42)). As an aid to end users, OAQPS developed relative quality ratings for the AP-42 emissions
factors, based upon the EPA's analysis of the quality of the underlying test data values and how
representative the emissions factor was for the particular source category for which it was
developed. The letter-grade ratings (e.g., A for excellent, E for poor) were based primarily on
engineering judgment and did not incorporate statistical error bounds or confidence intervals.

Since its initial publication, we have periodically revised and updated AP-42 to
incorporate new data and emissions-estimating methodologies. The last hard-copy version of
AP-42 (fifth edition) was published in 1995, although we released six supplements
(Supplement A through Supplement F) through 2000. After 2000, updates to AP-42 were
provided only electronically. Currently, the fifth edition of AP-42, the supplements and related
information are available at: www.epa.gov/air-emissions-factors-and-quantification/ap-42-
compilation-air-emissions-factors.

In addition to AP-42, we developed several other compilations of available emissions
factors. To provide the user community with additional emissions factor information for air

toxic pollutants beyond what was available in AP-42 at the time, we initiated the Locating &
Estimating (L&E) document series in 1984. Unlike AP-42, which is organized by source category,
the majority of the L&E documents focused on a specific pollutant (e.g., arsenic, benzene) or
related group of pollutants (e.g., polycyclic organic matter). The L&E documents made use of
AP-42 emissions factors, where available; however, in some cases, the AP-42 emissions factors
were revised or supplemented to present the most complete assessment of the emissions for
the specific air pollutant. A total of 36 individual L&E documents were produced through 1998.

We also compiled the Aerometric Information Retrieval System (AIRS) Facility
Subsystem Emission Factors (AFSEF) and the Crosswalk/Air Toxics Emission Factors (XATEF)
databases in 1990. The AFSEF database documented all emissions factors for criteria pollutants
that existed in the AIRS mainframe look-up tables as of March 1990. The XATEF database
contained emissions factors for toxic air pollutants that were developed based upon data
available to the EPA through October 1990. Ultimately, the EPA retired the AFSEF and XATEF
databases and created the Factor Information Retrieval (FIRE) Data System. The FIRE database
contains emissions factors from all AP-42 sections posted by September 1, 2004, the L&E
document series and the retired AFSEF and XATEF databases.

Other specialized studies have produced documents containing average emissions rates
for various processes, which have been posted on the CHIEF web page and which may still
represent the most current estimation tools available for those processes.

In 1997, we provided guidance materials (Procedures for Preparing Emission Factor
Documents, EPA-454/R-95-015, November 1997) that described the procedures, technical
criteria and standards and specifications for developing and reporting air pollutant emissions
factors for publication in either AP-42 or the L&E document series. This 1997 guidance
document covered the compilation, review and analyses of new data and information and
preparation of supporting documentation for emissions factor development.

Although OAQPS has focused significant effort and resources on developing emissions
factors, the procedures and guidance we have historically followed (documented in the EPA's

Procedures for Preparing Emission Factor Documents, November 1997) have not kept pace with
the increased volume of available emissions data or advances in information technology. For
example, although AP-42 is available online, the format is analogous to a hard-copy document,
which is not conducive to incorporating new data, making corrections to data, or conducting
data analyses. Also, because of their complex and somewhat subjective nature, the past
emissions factor development procedures were slow to incorporate new emissions test data
and did not encourage active public participation. To address these shortcomings, we have
revised our approach for developing emissions factors to be more responsive and transparent.
Section 5.0 discusses our revised approach to developing and documenting emissions factors.

Section 4.0
How are Emissions Factors Used?

Emissions factors are used to develop emissions estimates for processes and activities in
cases where direct measurements are unavailable. Emissions factors are typically developed to
represent long-term (e.g., annual) average emissions and, accordingly, the data used for developing
the emissions factors are usually based on emissions tests conducted during normal process
operating conditions. Short-term emissions from a particular process will vary significantly over
time (i.e., within-process variability) because of fluctuations in normal process operating
conditions, control device operating conditions, raw materials, ambient conditions and other
factors. Because of the relatively short duration of emissions tests and the limited range of
conditions they represent, the available emissions and process data used to develop an
emissions factor are not sufficient to account for these short-term emissions fluctuations.

Historically, emissions factors developed by the EPA were intended to be available for
use in preparing regional and national emissions inventories when valid site-specific
information (including material balances or other engineering calculations) was not available.
These inventories are typically the first part of the development of a regional or national
control strategy to reduce area-wide emissions. These inventories are important tools in air
quality management because they are used to estimate ambient pollutant concentrations; to
model pollutant dispersion and transport in the atmosphere; and to develop and assess control
strategies. Despite their original purpose, we are aware that emissions factors have been
applied by other entities (e.g., federal, state, tribal and local agencies; consultants; industries)
for purposes beyond the intended use of supporting national and regional emissions inventory
programs.

We remain concerned that emissions factors have been applied to these non-emissions
inventory uses without consideration of the limitations inherent in the use of emissions factors
(e.g., factors are not particularly suitable to developing short-term or site-specific emissions
estimates). Users of emissions factors should consider the impact of the reliability of emissions

factors on their non-inventory programs (e.g., apply statistical procedures to account for
variability). Such creators and users of emissions factors may wish to conduct periodic retesting
to confirm or revise, as necessary, the emissions factor.


Section 5.0

What are EPA's Revised Procedures for Developing Emissions Factors?

Beginning in 2003, OAQPS, the National Academy of Sciences and the EPA's Office of
Inspector General conducted a review of the agency's emissions factors program. Based upon
the feedback received from stakeholders (e.g., industry, state/local/tribal entities, the EPA's
program offices, environmental organizations), we revised our historical approach to
developing emissions factors to reduce the level of subjectivity involved in the emissions factor
development process. Our revised approach also improves the transparency and
responsiveness of the process and encourages meaningful public participation. Figure 5-1
provides an overview of our revised approach for developing new emissions factors and for
revising existing factors using test data submitted electronically to WebFIRE. The following
sections describe the key revisions that we implemented in our approach regarding the
collection of emissions data and supporting documentation, the evaluation of data, and the
development and assessment of emissions factors.

5.1 Data Collection

Based upon the review of our emissions factor program, we found that most emissions
testing information and associated data are currently generated electronically. To take
advantage of advances in information technology and the more widespread availability of
electronic data production, our revised approach focuses on collecting new emissions data
available in an electronic format.

To aid facilities in planning and reporting the results of emissions tests, we developed
the ERT and the WebFIRE template spreadsheet (see Section 10.1). The ERT replaces time-
intensive manual methods for test planning, test data compilation and reporting and data
quality assurance evaluations. The WebFIRE template spreadsheet is an alternative that allows
data collected using test methods not supported by the ERT, and data collected prior to
January 1, 2012 (the inception date of the ERT), to be submitted to the EPA. Because of the
prevalence of electronic data, we believe that our transition from the use of predominantly

hard-copy resources (e.g., test reports, technical publications) for emissions factor
development to the use of data in an electronic format will be relatively easy. The use of an
electronic format will facilitate the ongoing collection, incorporation and analysis of new test
data and supporting documentation. Also, use of the ERT and the WebFIRE template
spreadsheet will enable us to streamline the emissions factor development process through
more rapid data handling and quality assurance checks.

Figure 5-1. EPA's Revised Procedures for Developing Emissions Factors

5.2 Test Data Evaluation

Historically, the EPA's quality ratings of emissions test data and test reports were largely
subjective because each test program presented different issues (i.e., no two facilities, their
operation or the tests conducted at those facilities are exactly alike). Typically, the EPA
developed letter-grade quality ratings (A through E) for test reports based upon the agency's
review of the following criteria areas:

•	Process operation,

•	Test method and sampling procedures,

•	Process information, and

•	Analysis and calculations.

To reduce the subjectivity of our qualitative assessment of the emissions, process and control
device data collected during an emissions test, we have developed a more objective rating
system for test reports (see Appendix A). The rating system is intended to produce unbiased
and consistent assessments of the information included in test reports which, in turn, will help
us to better characterize the process and the quality of emissions values.

The rating system consists of a set of objective review questions developed for the EPA's
manual and instrumental test methods that assess the quality of the process, control device
and measurement data collected during an emissions test in the following criteria areas:

•	General information,

•	Process and control device information,

•	Sampling locations,

•	Test methods and reporting requirements,

•	Sampling equipment calibrations,

•	Sample recovery,

•	Laboratory analysis, and

•	Documentation.

The Individual Test Rating (ITR) is a numeric score determined for each test report as the
prorated sum of the individual scores assigned to each review question based upon the
answers provided (see Appendix A).
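
One way such a prorated score could be computed is sketched below; the actual review questions, point assignments, and proration rules are those defined in Appendix A, and this fragment is illustrative only.

    def individual_test_rating(responses):
        # responses: (points_earned, points_possible) for each review question
        # that applies to the test report; inapplicable questions are omitted
        # so the score is prorated over the applicable questions only.
        possible = sum(p for _, p in responses)
        if possible == 0:
            return None
        earned = sum(e for e, _ in responses)
        return 100.0 * earned / possible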

Our rating system is designed to allow for potential increases in the ITR value through
independent review by a regulatory agency. In cases where a regulatory reviewer affirms the
original responses provided to the review questions, additional points are assigned to the ITR
value originally assigned by the ERT or the WebFIRE template spreadsheet when the
measurement data were initially recorded by the testing contractor. If the regulatory reviewer
determines that the initial review points were incorrectly assigned, the points originally
assigned to a particular review question are deducted from the ITR.

5.3 Detection Limit Procedures for Calculating Test Run Averages

The EPA defines the minimum detection limit (MDL) as "the minimum concentration of
a substance that can be measured and reported with 99 percent confidence that the analyte
concentration is greater than zero and is determined from an analysis of a sample in a given
matrix containing the analyte." Essentially, the MDL is the smallest amount of a substance whose
instrument signal an analytical method can reliably distinguish, at a specified confidence level,
from the signal produced by a blank sample. The ERT and the WebFIRE template
spreadsheet provide emissions data at the test run level and provide a flag for each test run
that designates whether the emissions value is above the MDL of the test method (ADL), below
the MDL (BDL), or detection level limited (DLL) for cases where some of the measurement data
used to derive the test run value are BDL.

For each ERT or WebFIRE template spreadsheet submission, we calculate the average of
the test run values and assign a detection limit flag to the average as follows:

1.	If all test runs are ADL, we calculate the average of the test run values and assign an
ADL flag.

2.	If the test runs are a mix of ADL and DLL (or all DLL) data, we calculate the average of
the test run values and assign a DLL flag.

3.	If all test runs are BDL, we calculate the average using the BDL/2 test run values and
assign a BDL flag.

4. If the test runs are a mix of ADL (or DLL) and BDL data, we calculate the average
value excluding the BDL/2 test run value(s) that are greater than the highest ADL (or
DLL) value and assign a DLL flag.

Appendix B contains a more detailed discussion of the procedures that we follow for handling
detection limit flags.
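
To make the four rules concrete, the following sketch restates them in Python. It is an illustration of the logic described above, not the WebFIRE implementation; it assumes each run is reported as a (value, flag) pair in which a BDL run's value is the detection limit itself, and Appendix B governs any cases the sketch does not capture.

    from statistics import mean

    def test_average(runs):
        # runs: list of (value, flag) pairs with flag in {"ADL", "BDL", "DLL"}
        flags = {flag for _, flag in runs}
        if flags == {"ADL"}:                        # Rule 1: all runs detected
            return mean(v for v, _ in runs), "ADL"
        if "BDL" not in flags:                      # Rule 2: ADL/DLL mix or all DLL
            return mean(v for v, _ in runs), "DLL"
        if flags == {"BDL"}:                        # Rule 3: all BDL, average BDL/2
            return mean(v / 2.0 for v, _ in runs), "BDL"
        # Rule 4: mix of detected (ADL/DLL) and BDL runs. Halve each BDL value
        # and exclude any halved value greater than the highest detected value.
        detected = [v for v, f in runs if f != "BDL"]
        cap = max(detected)
        kept = detected + [v / 2.0 for v, f in runs
                           if f == "BDL" and v / 2.0 <= cap]
        return mean(kept), "DLL"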

5.4 Grouping of Candidate Data and Identification of Outliers

To assemble data sets for calculating candidate emissions factors, we group the
calculated average emissions values (and the underlying data for existing AP-42 emissions
factors, if available) by unique combinations of Source Classification Code (SCC), pollutant,
control device configuration and units (i.e., mass of emissions per activity). We then subject the
candidate data set to statistical outlier tests to determine if we should eliminate any average
emissions values from the emissions factor calculations. A statistical outlier refers to one or
more values that do not conform to the statistical pattern established by other values under
consideration for the same process. These outlier values can be caused by an unusual process
condition or circumstance that produced an unexpected and unrepresentative variation in the
process emissions.

For the purposes of identifying outliers, our revised approach for developing emissions
factors uses the Dixon Q test or the Rosner test, depending on the number of average emission
values in the candidate data set. If there are fewer than three values in the subject data set,
WebFIRE does not conduct the outlier analysis or calculate the candidate emissions factor.
Appendix C contains a detailed discussion of the procedures we use to determine outliers. Our
procedure omits outliers when calculating the value of the candidate emissions factor.
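
As an illustration of the Dixon Q screen, the sketch below tests the most extreme value in a small candidate data set. The critical value q_crit is tabulated by sample size and confidence level and is passed in here as an assumption; Appendix C defines the exact procedures and critical values that WebFIRE applies.

    def dixon_q_outlier(values, q_crit):
        # Screens the most extreme observation using Q = gap / range and
        # returns it if Q exceeds the critical value, otherwise None.
        xs = sorted(values)
        if len(xs) < 3:
            return None                    # too few values to screen
        data_range = xs[-1] - xs[0]
        if data_range == 0:
            return None                    # identical values, no outlier
        q_low = (xs[1] - xs[0]) / data_range      # suspect lowest value
        q_high = (xs[-1] - xs[-2]) / data_range   # suspect highest value
        if q_high >= q_low:
            return xs[-1] if q_high > q_crit else None
        return xs[0] if q_low > q_crit else None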

Although values identified as outliers by the Dixon Q or Rosner tests are not included when
calculating candidate emissions factors, we do not remove them from the WebFIRE database.
Facilities routinely submit emissions data to WebFIRE and, as the data population in the unique
groupings increases over time, the statistical characteristics (e.g., average, minimum, and
maximum values) of the data grouping that determine whether a value is an outlier also change

(i.e., a value identified as an outlier today may not be identified as an outlier tomorrow when
the data population contains more values).

5.5 Emissions Factor Derivation and Quality Assessment

After evaluating the candidate data set for outlier values, we follow a step-wise
procedure to: (1) calculate an emissions factor value using the average emissions values that
result in the highest quality rating and the most representative factor for the source category of
interest, and (2) assign the quality rating of the resulting emissions factor. The procedures for
calculating the emissions factor value and assessing factor quality are based upon an evaluation
of the number of individual sources in the source category for which we are developing the
emissions factor, the individual test report quality rating (ITR) and the number of individual test
data values used to calculate the emissions factor. WebFIRE does not calculate a candidate
emissions factor value in cases where all the data in a unique grouping are BDL. Appendix D
contains a detailed description of the emissions factor development and data quality
characterization procedures.

Section 6.0

EPA's Interactive Database for the Emissions Factors Program - What is WebFIRE?

6.1	What is WebFIRE?

WebFIRE is the EPA's online emissions factors repository, retrieval, and development
tool. The WebFIRE database contains the EPA's emissions factors for criteria and hazardous air
pollutants (HAP) for industrial and non-industrial processes. In addition, WebFIRE contains the
individual test data values, where available, and supporting documentation used to develop the
factors and other data submitted to the EPA by federal, state, tribal, and local agencies;
consultants; and industries. For each emissions factor and individual test data value, WebFIRE
contains descriptive information such as industry and source category type, control device
information, the pollutants emitted and supporting documentation. The home page for
WebFIRE and links to Frequently Asked Questions (FAQs) and background information on data
contained in the WebFIRE system can be found at: www.cfpub.epa.gov/webfire/.

6.2	How is WebFIRE Used?

Currently, the primary function of WebFIRE for the public is to provide storage and
retrieval of emissions factors and emissions test data. The EPA also expanded WebFIRE to
provide public users tools for calculating and assessing the representativeness of emissions
factors derived from a set of individual test data values selected by the public user. Figure 6-1
provides an overview of WebFIRE and its basic functionality for the public user.

Figure 6-1. WebFIRE Overview

To retrieve an EPA emissions factor, WebFIRE allows users to specify one or more of the
following search criteria: SCC, pollutant, control device, and AP-42 section. WebFIRE allows
users to include revoked emissions factors in the search results. The emissions factor search
also provides a link that presents the data values used to derive the selected emissions factor
and the data values excluded from the emissions factor calculation. Section 7.0 provides a more
detailed discussion of the WebFIRE emissions factors search and retrieval tools.

The EPA also added to WebFIRE a tool that allows a user to calculate an emissions factor
from a set of individual test data values stored in WebFIRE. These WebFIRE tools incorporate
our revised approach for developing emissions factors (see Section 5.0). In general, the user
selects the individual test data values to be used in developing an emissions factor and
WebFIRE evaluates the data set to identify outlier values and confirms that the data set does
not consist of only BDL values. Following the outlier value analyses and BDL assessment,
WebFIRE calculates an emissions factor value from the data set that best represents the
process of interest. WebFIRE also assigns a relative quality rating to the user-defined emissions
factor. Section 9.0 discusses the WebFIRE emissions factor development tools in more detail.
Appendices B through D contain the BDL and outlier analyses and the calculations and
procedures for deriving an emissions factor.

6.3 Who Uses WebFIRE?

The data storage, retrieval, and emissions factor development capabilities of WebFIRE
are available online to all public and private entities. Examples of WebFIRE users include, but
are not limited to:

•	Federal, state, local, or tribal air pollution control and regulatory agency personnel
(example uses include: emissions inventory development, preparation of emissions
estimates for dispersion modeling, comparison of a site-specific emissions factor to
an EPA emissions factor for a given process).

•	Environmental staff at industrial facilities (example uses include: emissions and
process data submittal; comparison of process emissions to an EPA emissions factor
or other related data).

•	Environmental organizations (example uses include: air emissions oversight).

•	Engineering consultants, university researchers, and international air agencies.

Periodically, the EPA will use the test data and development tools contained in WebFIRE to
revise existing and derive new emissions factors as discussed in Section 11.0. The EPA also
anticipates using the test data submitted to WebFIRE to inform our air rule development efforts
under the Clean Air Act.

6.4	How Does WebFIRE Improve Emissions Factor Identification and Development?

The emissions factor repository, retrieval, and development tools in WebFIRE facilitate
the EPA's progress towards our goal of developing an interactive emissions factors program
that will incorporate new data as they become available and produce high-quality emissions
factors in a timely manner. We also believe that the benefits of online data access and
electronic data submittal provided by WebFIRE will provide for easier, more effective
involvement by the public interested in developing and improving emissions factors.

WebFIRE also allows the EPA to shift the role of OAQPS from that of sole developer of
emissions factors to that of a facilitator. Because of this shift, we can focus more resources on
overseeing the emissions factor program and ensuring that the program develops high-quality
emissions factors using a consistent and transparent approach.

Section 7.0
How Do I Find an Emissions Factor?

7.1 How Do I Identify and Retrieve an Emissions Factor from WebFIRE?

The WebFIRE emissions factor search allows you to focus the factor retrieval process by
entering multiple search criteria including:

•	SCC,1

•	Control device type,

•	Pollutant or pollutant group type, and

•	Specific AP-42 section.

WebFIRE also allows you to expand your search to include emissions factors that have been
revoked by EPA. Figure 7-1 provides an overview of the factor retrieval process.

For each of the search criteria, WebFIRE provides a dropdown menu presenting the
available selections in the database. For the SCC, control device, and pollutant criteria, WebFIRE
also provides a filter box that allows you to limit the dropdown selections to only those that
contain the filter text (e.g., entering "refine" in the SCC text box reduces the dropdown menu
selections to only those SCCs that contain "refine" in the code or text description, such as
"refinery"). Table 7-1 lists the data fields WebFIRE provides for each emissions factor record.

Depending on the search criteria you enter, WebFIRE will return one or more emissions
factors. At this stage of the search, you have the option of: (1) creating a summary report of the
information shown on the results page (Figure 7-1, Option A), or (2) obtaining additional
background information for the emissions factor that you selected (see Section 7.2). To
accommodate various end uses of the retrieved data (e.g., emissions calculations, incorporation
into a text file), WebFIRE provides you with the following reporting formats:

1 The EPA uses the SCCs to organize data for anthropogenic air pollutant sources that have similar production and
emissions characteristics (e.g., gasoline storage tanks, polymer manufacturing facilities) into related groups or
source categories. Section 8.1 provides an overview of the SCC system.

•	Comma Separated Values (CSV) format (for importation into a spreadsheet or
database),

•	Extensible Markup Language (XML) format (for importation into XML parsing
applications),

•	American Standard Code for Information Interchange (ASCII) format (for importation
into other applications), and

•	Hypertext Markup Language (HTML) format (for printing).

Figure 7-1. Procedures for Retrieving Emissions Factors from WebFIRE

Table 7-1. Data Fields Reported by WebFIRE Emissions Factor Search

Emissions Factor Record Data Element	Description

Emissions factor	Numerical value and units of the emissions factor

SCC	Source Classification Code

Pollutant name	Chemical name of the pollutant

NEI pollutant code	National Emissions Inventory (NEI) code assigned to the pollutant

CAS number	Chemical Abstracts Service (CAS) number assigned to the pollutant

Pollutant code	Identification number assigned to the pollutant in the National Emissions Inventory (NEI)

Quality score	Composite Test Rating (CTR) for EPA factors

Emissions factor representativeness	Qualitative characterization of how well an emissions factor statistically represents the population of similar facilities in a source category

Primary control device	The first control device applied to the process

Second control device	The second control device applied to the process

Third control device	The third control device applied to the process

Fourth control device	The fourth control device applied to the process

Fifth control device	The fifth control device applied to the process

Status	Identifies emissions factors as individual test data value, EPA factor, or draft emissions factor undergoing review

Data source type	Refers to the original document(s) from which factors were obtained

Restriction type	Refers to caveats or special considerations prior to use of the emissions factor

References	Test report or citation from which the factor was derived

AP-42 section	Identifies the specific AP-42 section where the process data can be found

Formula	Empirical equation used to express an emissions factor

Date	Represents the date the emissions factor was developed/revised

Notes	Additional information to assist the user in understanding and applying an emissions factor

7.2 How Do I Obtain Background Information for My Selected Emissions Factor?

At the search results page, WebFIRE provides you the option of retrieving additional
detailed information for the emissions factor that you selected (Figure 7-1, Option B) by clicking
on the "Details" link located at the right-hand side of the search results page. This detailed

information is intended to give you a better understanding of your specific factor so you can
make better decisions regarding its applicability.

For the AP-42 factors we developed using our historical approach discussed in Section
3.0, the Details link provides you with information such as the citation for the data; the
applicable AP-42 section; formulas and equations that are applicable to the factor; and
information on process configurations, operating conditions, control device configurations, and
test conditions relevant to the emissions factor that you selected. For the factors we develop
using our revised approach discussed in Section 5.0, the Details link provides you with three
tables. Table 1 summarizes the emissions factor and Tables 2 and 3 present the data EPA
included and excluded from the calculation, respectively. Tables 2 and 3 also provide links to
the electronic submissions of the individual emissions test reports.

Section 8.0

What Parameters Should I Consider When Using or Deriving an Emissions Factor?

When you are selecting or deriving an emissions factor for use in developing an
emissions estimate for a particular process or activity, the primary considerations are generally:

•	How well the emissions factor represents the process for which the emissions
estimate is being developed,

•	The effect on emissions due to the presence (or absence) of a control device or
technique, and

•	The underlying test method used to measure the pollutant(s) represented by the
emissions factor.

8.1 Source Category and Process Considerations

The EPA uses SCCs to classify different types of anthropogenic emissions activities. Each
SCC represents a unique, source category-specific process or function that emits an air
pollutant. The SCCs are used as a primary identifying data element in EPA's WebFIRE, the NEI
and other EPA databases. The SCCs are also used by many regional, state, local, and tribal
agency emissions data systems.

There are two types of SCCs: 8-digit and 10-digit. The 8-digit SCCs follow the pattern
1-22-333-44 and the 10-digit SCCs follow the pattern 11-22-333-444. The codes use a
hierarchical system in which the definition of the emissions process becomes increasingly more
specific as you move from left to right. The first level provides the most general
information on the category of emissions. The fourth level is the most detailed and
describes the specific emitting process. Point source SCCs have historically had only 8 digits;
however, there are numerous 10-digit SCCs that characterize point source processes such as
aircraft emissions and ground support equipment emissions at airport facilities. Ten-digit SCCs
primarily represent nonpoint and mobile source emissions.
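
The hierarchy can be illustrated with a short sketch that splits a code into its four levels; the example code shown is for illustration only.

    def split_scc(scc):
        # 8-digit SCCs follow the pattern 1-22-333-44; 10-digit SCCs follow
        # 11-22-333-444. The leftmost level is the most general and the
        # rightmost identifies the specific emitting process.
        digits = scc.replace("-", "")
        if len(digits) == 8:
            return digits[0], digits[1:3], digits[3:6], digits[6:8]
        if len(digits) == 10:
            return digits[:2], digits[2:4], digits[4:7], digits[7:10]
        raise ValueError("an SCC must contain 8 or 10 digits")

    # Example: split_scc("1-02-002-02") returns ("1", "02", "002", "02")
    level1, level2, level3, level4 = split_scc("1-02-002-02")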

You can download the current list of SCCs and their descriptions from the EPA's
Emission Inventory System (EIS) website:
https://sor-scc-api.epa.gov/sccwebservices/sccsearch/. At this website, you can search for SCCs by entering

keywords, sectors, or partial numeric codes in the filter box, or explore the SCC list by
selecting one or more preset filter options (e.g., include retired SCCs).

The EPA periodically updates and improves the SCCs. As technologies have changed over
the years, the EPA has recognized the need to remove outdated SCCs and add SCCs for new
emissions processes. A review of existing SCCs has shown several instances of duplicate SCCs
for the same process. The EPA is retiring duplicate SCCs to help ensure that each emissions
process has a unique SCC. In addition, the EPA is working to assign SCCs to emissions sources
that are currently regulated, but do not have SCCs. We are also making other changes to ensure
that the assignment of an SCC is consistent with the descriptions associated with the hierarchy
of digits that comprise each SCC.

The SCC revisions improve the overall organization of the SCC list by reducing the
likelihood of a user choosing an incorrect SCC for their particular process. The EPA designed the
SCCs to categorize processes that create emissions; therefore, one objective of revising the
SCCs is to remove control device descriptions from the current SCC list. Another objective of the
SCC revision process is to reduce the use of miscellaneous SCCs, such as those including "99"
codes. Often these are labeled in the SCC list as "other not classified," "specify in comments
field," or "miscellaneous." These types of labels are not sufficient to classify emissions
processes. Therefore, the EPA is removing these SCCs from WebFIRE. The EPA's new approach
will allow entities submitting test data to WebFIRE to propose new SCC(s) for their emissions
processes in an effective and logical way. Upon receipt of a request to establish a new SCC, the
EPA will perform an analysis to determine if the proposed SCC is unique or if an existing SCC
should be used. We will base our analysis upon the uniqueness of the emissions profile of the
process and other relevant considerations.

It is important to note that the revisions that we are currently making to the SCC
process do not change the fundamental role that SCCs play in the emissions factor program or
the way that users will be able to search for specific emissions factors. These revisions should
improve the overall data quality of the emissions factors by ensuring that the data upon which
the emissions factors are based are grouped in the appropriate SCC. In addition, the WebFIRE

emissions factor search function automatically applies EPA's updated SCCs to the search results.
For example, if a user searches for an emissions factor using an SCC that EPA has retired or
mapped to a new SCC, the WebFIRE search results alert the user to the change in SCC status
and provide emissions factors for the new SCC, if available, and for the SCC the user originally
selected when specifying the search criteria. The SCC bulk download available from the EPA EIS
website provides a cross-walk so that you can identify revised SCCs by their old SCC number.

8.2	Control Device Considerations

In addition to assessing the production process or activity for which you are selecting or
developing an emissions factor, you should have a clear understanding of the operation and
performance characteristics of any control techniques or technologies that are used to reduce
emissions from the process. When you are selecting or developing a controlled emissions
factor, you should determine if the control device reflected in the emissions factor record is
comparable to the type and configuration of any control device that is applied to the process
for which you are developing the emissions estimate. You may also want to assess whether the
pollutant of interest is reduced or eliminated by a particular type of control device, or
determine whether a piece of equipment functions as an integral part of the process (e.g., a
cyclone that separates product from a pneumatic conveying system, cooling coils in a vapor
degreaser that reduce solvent loss) or whether it is a control device (e.g., a cyclone that reduces
PM emissions from a wood sawmill, a thermal oxidizer that reduces organic emissions from a
process vent). You may also find that a clear understanding of control device operation is useful
when assessing the performance of control devices that are operated in series (WebFIRE
accommodates up to five control devices for a single emissions factor record).

8.3	Pollutant Test Method Considerations

The selection of a test method and how the method is applied to measure emissions
from the process can affect the representativeness of the emissions data and the resulting
emissions factor developed from the data. The majority of the emissions factors contained in
WebFIRE are based upon direct emissions measurements. In most cases, these measurements
were obtained using the EPA's reference test methods that were created to support

development, implementation and compliance with federal standards (e.g., New Source
Performance Standards (NSPS), National Emission Standards for Hazardous Air Pollutants
(NESHAP)). In addition, some emissions factors are based upon data collected using non-EPA
test methods (e.g., methods developed by the California Air Resources Board (CARB)).

The EPA's reference test methods provide direct measurement of specific chemical
species (e.g., carbon monoxide (CO), sulfur dioxide (SO2)) emitted from a process or control
device. The EPA's reference test methods for measuring PM or total hydrocarbons (THC)
measure the emissions of a group or class of pollutants rather than an individual compound or
chemical species. In these cases, for example, the term "filterable PM" is considered to apply to
the material that is captured upstream of and on the sampling train filter maintained at a specific
temperature. Consequently, the temperature at which the sampling train is operated affects
the amount of "filterable" material collected (e.g., operating the sampling train at a lower
temperature would tend to capture more compounds that have high vapor pressures).

When you are considering an emissions factor developed from PM or THC data, you
should be aware of the underlying test method and conditions under which the test was
conducted to determine if the emissions factor is appropriate for the pollutant for which you
are using or preparing the emissions estimate. Often, an understanding of how the method is
conducted can overcome confusion related to applying the data and to comparing emissions
from different facilities.

Section 9.0

How Do I Develop a User-Defined Emissions Factor?

9.1 How Do I Use WebFIRE to Create a User-Defined Emissions Factor?

WebFIRE allows the public to develop a user-defined emissions factor using the same
procedures that the EPA follows to develop new or to revise existing emissions factors (see
Section 5.0). Figure 9-1 shows the steps for developing a user-defined emissions factor.

First, obtain all the individual test data values contained in WebFIRE that are related
to the emissions process of interest by specifying the appropriate search criteria at the
emissions factor development page in WebFIRE. After you have obtained the list of individual
test data values that match your search criteria, you can select the candidate data set
containing the values that you want to use to develop the user-defined emissions factor by
highlighting the check box next to each data record of interest. WebFIRE calculates the
emissions factor value from this candidate data set using the outlier, BDL, factor derivation and
quality assessment tools discussed in Section 5.0. This development tool is not applicable to the
emissions factors that are expressed as empirical equations because they contain more than
one variable.

After WebFIRE calculates the user-defined emissions factor, you can generate a report
to provide documentation of the emissions factor development. The report provides a
summary of the user-defined emissions factor, the test data values used to derive the factor,
the corresponding SCC for the emissions factor, applicable control devices, the Composite Test
Rating (CTR) for the factor (see Appendix D), and how well the emissions factor represents air
emissions from the process associated with the SCC. The report also shows the values and
supporting information for the individual test data values that were used to derive the
emissions factor. Because the WebFIRE database will not retain user-defined emissions factors
after they are created, we recommend preparing a report for any user-defined emissions factor
that you develop.

Figure 9-1. Emissions Factor Derivation in WebFIRE

9.2 What are the Potential Impacts Associated with Applying a User-defined Emissions Factor?

Applying a user-defined emissions factor may affect whether you conclude that your
source is subject to certain regulations. For example, applying a user-defined emissions factor
to a site-specific emissions estimate could show that a facility is not subject to a particular
emissions standard where the previous use of an emissions factor indicated that the emissions
standard was applicable. For this reason, we encourage you to be judicious and responsible in
your application of a user-defined emissions factor. We also encourage you to create and
maintain the WebFIRE report (see Section 9.1) that documents the development of the user-
defined emissions factor.



-------
Section 10.0
How Do I Submit Data to WebFIRE?

To ensure consistency of data submittals from many different facilities and entities, we
provide two ways to submit data. For the EPA to develop new or revise existing emissions factors, it is
important to submit test results to WebFIRE in the format of the EPA's ERT or an
ERT-compatible XML schema. If you have data that you would like submitted to WebFIRE but that were
collected using methods not supported by the ERT, you can voluntarily use the WebFIRE template
spreadsheet, available at www.epa.gov/electronic-reporting-air-emissions/electronic-
reporting-tool-ert, and send the file to the EPA by mail or email. The ERT (see Section 10.1) is an
electronic alternative to submitting paper test reports and supporting documentation. The
WebFIRE template spreadsheet allows for electronic submission of data collected using test
methods not supported by the ERT, or test data collected prior to January 1, 2012.

The Compliance and Emissions Data Reporting Interface (CEDRI) at the EPA's Central
Data Exchange (CDX) is the data upload application for submitting ERT files. The CDX (see
Section 10.2) is part of the Environmental Information Exchange Network and provides industry
an easy and secure reporting service.

If you have an existing CDX account (e.g., you submit reports for the EPA's Toxics
Release Inventory (TRI) Program), you can use your current user ID and password to log in to
CDX by navigating to https://cdx.epa.gov/, entering your user ID and password, and then
selecting the "Log In" button in the header of the page. After you log in, you will be able to
select the "Add Program Service" button on the MyCDX Services page to add CEDRI to your list
of CDX applications. You will then be able to follow the instructions provided on the subsequent
pages to complete the identity verification process to obtain approval from EPA to access
CEDRI.

If you do not have an existing account with the CDX, you can complete the online
registration process by navigating to the CDX home page (https://cdx.epa.gov/) and clicking the
"Register with CDX" button in the header of the page. After completion of the user registration
component, you will be able to follow the instructions provided on the subsequent pages to
complete the identity verification process in order to obtain approval to access the CEDRI data
upload program. During the registration process, you have the option of registering as a
"preparer" or as a "certifier." If you are preparing reports for signature and subsequent
submission by an authorized representative of a facility, you should register as a preparer. The
certifier is the duly authorized representative of the source or more commonly referred to as
the "owner or operator" of the facility. The certifier is authorized to modify the package a
preparer has assembled, sign and submit the package to the CDX. Contractors are prohibited
from registering as a certifier. Contractors are permitted to register as a preparer and may
assemble submission packages, such as the ERT, for the certifier's approval and signature.

If you are the signature authority for the facility (i.e., certifier), you may either use the
LexisNexis electronic identity validation service or the paper-based Electronic Signature
Agreement (ESA) validation process to register as a certifier. We strongly encourage certifiers to
use the electronic identity validation process, as the paper-based approval of the ESA typically
takes 5 to 10 business days. If you choose to use the paper-based validation process, you will be
required to mail your signed ESA to the CDX Reporting Center. The CDX Reporting Center will
request the phone number of the signature authority's employer/authorizing official to verify
employment.

For any questions regarding the CDX, the CDX Help Desk (https://cdx.epa.gov/Help) is
available for data submission technical support between the hours of 8:00 am and 6:00 pm
(Eastern Standard Time (EST)) at 1-888-890-1995 or helpdesk@epacdx.net. The CDX Help Desk
can also be reached at 970-494-5500.

The WebFIRE template spreadsheet should not be submitted through the CEDRI upload
application. The template should be emailed to the EPA's Info CHIEF mailbox at
Info.CHIEF@epa.gov or mailed on electronic media to the EPA at the following address:

U.S. EPA
Group Leader
Measurement Policy Group, OAQPS
Mail Code D243-05
RTP, NC 27711

10.1 How Are Emissions Tests Documented?

There are two approaches to documenting emissions tests electronically: the ERT, and
the WebFIRE template spreadsheet. The ERT is a database application developed by EPA to aid
facilities in planning and reporting the results of emissions tests. The ERT replaces time-
intensive manual test planning, test data compilation and reporting, and data quality assurance
evaluations. When utilized to its fullest potential, the ERT can also facilitate coordination
among the facility, the testing contractor and the regulatory agency (e.g., for compliance
demonstrations) in planning and preparing for the emissions test. The current version of the
ERT, a list of the EPA test methods that are currently supported by the ERT and guidance on the
use of the ERT can be found at: www.epa.gov/electronic-reporting-air-emissions/electronic-
reporting-tool-ert. The EPA's Emission Measurement Center (EMC) provides information
regarding the EPA's test methods and can be found at: www.epa.gov/emc/.

The ERT documents the following key information, some of which is required by the EPA
reference test methods for stationary sources:

•	SCC specification,

•	Process data from existing air permits (e.g., process throughput rates),

•	Process rate levels during actual testing,

•	Descriptions of the source, unit process and control devices associated with the test,

•	Process upsets or malfunctions during testing,

•	Process flow diagram,

•	Sampling locations,

•	Test methods used,

•	Deviations made to any test method, and

•	Output flow rates and pollutant concentrations.
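
As a rough illustration, the key information above can be viewed as a single structured record per test. The sketch below is hypothetical: the field names are illustrative and do not reflect the ERT's actual internal schema.

```python
# Hypothetical record structure for the key information documented by
# the ERT; field names are illustrative only, not the ERT's schema.
from dataclasses import dataclass, field

@dataclass
class ErtTestRecord:
    scc: str                          # source classification code
    permit_process_data: dict         # e.g., permitted throughput rates
    process_rates_during_test: dict   # process rate levels during actual testing
    source_description: str           # source, unit process and control devices
    upsets_or_malfunctions: str       # process upsets during testing, if any
    sampling_locations: list
    test_methods: list
    method_deviations: list = field(default_factory=list)
    flow_rates_and_concentrations: dict = field(default_factory=dict)
```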

Figure 10-1 shows the typical steps followed when using the ERT. The ERT consists of: (1) a
database application, (2) the project data set (PDS), and (3) a data upload spreadsheet. The
database application contains all of the data input screens, reports, calculations and other
items necessary to create and distribute a test report. The application also incorporates our
evaluation system (see Section 5.2 and Appendix A) so that each test is assigned a numeric
score (the ITR) that assesses the quality of the measurement data and associated information
collected during an emissions test. The PDS database contains the measurement data for a
single test report. This file is exchanged between the source test contractor, the client and the
regulatory agency, if necessary (e.g., for a compliance test). To provide flexibility to ERT users,
the Microsoft Excel spreadsheet can be used to upload the sampling hardware and field
measurement data recorded during a test into the PDS rather than entering the data directly
into the PDS through the application.

Figure 10-1. Typical Work Flow When Using the ERT

Upon completion, the ERT contains all of the emissions data and supporting information
(e.g., equipment calibration documentation) prepared and collected for the test. In addition,
testers can attach supporting information, or optionally an electronic copy (PDF) of the entire
report, to the ERT and create a submission package.

When creating the submission package, the ERT automatically creates an XML export
file for the WebFIRE emissions factor database. The format of this ERT output file is specifically
designed to provide inputs for the data fields contained in WebFIRE (e.g., emissions value and
units, SCC, ITR). To facilitate incorporation of the data into WebFIRE, the output file is

10-5


-------
Section 10.0

How Do I Submit Data to WebFIRE?

configured to accept emissions values expressed in terms of mass of pollutant emitted per unit
of activity. The output file also accepts emissions test results that are expressed as a
concentration or an emissions rate (i.e., mass emitted per unit time), which may be converted to
units that are suitable for use in emissions factor development.
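
As an illustration of this kind of export, the sketch below assembles a WebFIRE-style data record as XML. All element and attribute names here are hypothetical; the actual export format is defined by the ERT itself.

```python
# Illustrative sketch only: element and attribute names are hypothetical
# placeholders, not the actual ERT/WebFIRE export schema.
import xml.etree.ElementTree as ET

record = ET.Element("EmissionsFactorRecord")
ET.SubElement(record, "SCC").text = "10100201"            # example source classification code
ET.SubElement(record, "Pollutant").text = "SO2"
ET.SubElement(record, "EmissionsValue", units="lb/ton").text = "0.42"
ET.SubElement(record, "ITR").text = "85"                  # individual test rating

print(ET.tostring(record, encoding="unicode"))
```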

The EPA developed the WebFIRE template spreadsheet for companies, associations, and
agencies to provide emissions data collected using test methods not supported by the ERT. The
WebFIRE template spreadsheet is also applicable to emissions data collected prior to January 1,
2012 (test methods used to collect emissions data after this date include the requisite data
to enter EPA's emissions factor development process). The structured format of the WebFIRE
template spreadsheet organizes the basic source information used for emissions factor
development, including supplementary information for more detailed characterization of the
source and the emissions measurements, and the spreadsheet contains the same rating system
used in the ERT for assessing the quality of the emissions test and assigning the ITR.
Additionally, submitters should include a PDF copy of the entire report documenting the source
test in the completed WebFIRE template spreadsheet file. Submission of the WebFIRE template
spreadsheet to the EPA is not a requirement but a voluntary option to provide stack test data
for EPA to use in potentially deriving an emissions factor. Use of the ERT and the WebFIRE
template spreadsheet will provide for consistent criteria to quantitatively assess the quality of
the data collected during the emissions test and to standardize the test report contents. The
use of the ERT and the WebFIRE template spreadsheet also improves the availability of the
supporting documentation necessary to conduct such an evaluation. Additionally, the ERT and
the WebFIRE template spreadsheet lay the groundwork for future capabilities to electronically
exchange information contained in the test reports with facility, state, local, or federal data
systems.
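
As a sketch only, a template-style submission could be assembled programmatically as shown below. The column names are hypothetical placeholders; the authoritative layout is the template posted at the URL cited above.

```python
# Hypothetical column layout for a template-style spreadsheet; the real
# WebFIRE template should be downloaded from the EPA website.
# Requires the pandas and openpyxl packages.
import pandas as pd

columns = ["SCC", "Pollutant", "Test Method", "Emissions Value",
           "Units", "Activity Description", "ITR"]
template = pd.DataFrame(columns=columns)
template.loc[0] = ["10100201", "SO2", "EPA Method 6", 0.42,
                   "lb/ton", "coal-fired boiler, 100 ton/hr", 85]
template.to_excel("webfire_template_example.xlsx", index=False)
```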

10.2 What are the CDX and CEDRI and What are their Roles in Submitting Data to WebFIRE?

Electronic environmental data submissions to EPA, including submission of emissions
data for use in WebFIRE, can be made through the CDX using the CEDRI data upload
application.


The CDX is part of the Environmental Information Exchange Network that was
developed by the EPA and the states to facilitate online sharing of electronic environmental
information among EPA, states, tribes, localities and other entities. The CDX is a broad-based
tool that offers industry, states, tribes and other stakeholders a fast, easy and secure reporting
service. As part of EPA's e-government initiative, the CDX helps to ensure that both the public
and regulatory agencies can access the information used to document environmental
performance, understand environmental conditions and make sound decisions to protect the
environment.

The benefits of the CDX to the EPA and related program offices include:

•	Elimination of redundant infrastructure and its associated costs,

•	Facilitation of faster, lower-cost implementation of new or modified data flows,

•	Integration of data to agency data repositories,

•	Establishment of consistent procedures for electronic signatures,

•	Reduction in the time needed to make information publicly accessible,

•	Reduction in the record management costs by elimination of redundant
recordkeeping, and

•	Compliance with the Cross-Media Electronic Reporting Regulation (CROMERR).

The benefits to industry, states, local agencies and tribes associated with the CDX
include:

•	Reduction of overall reporting burden,

•	Improvement in data accessibility,

•	Electronic confirmation that information was received and that the electronic form
was filled out correctly,

•	Reduction in the time and costs associated with environmental data submission
requirements,

•	Simplification of reporting to a single point in the EPA instead of many separate
programs,

•	Faster securing of submission through built-in edit and data quality checks,

•	Improvement of security and transmission of confidential business information (CBI)
through registration and authentication,

•	Reduction of burden of complying with new or changing requirements, and

•	Streamlining of reporting through the Exchange Network and Web Services.


The EPA expects facilities to produce and submit an increased amount of new emissions
test data in response to regulations that require the electronic submission of emissions tests to
demonstrate compliance with federal air regulations.

In the future, we anticipate that the EPA will use the capabilities of the CDX to provide
for electronic exchange of information in test reports with facility, state and federal data
systems. For example, the ERT and WebFIRE template spreadsheet allow sources to document
facility-specific information that may also be required under other regulatory data systems,
such as the Air Facility System (AFS). Such systems contain compliance, enforcement and
permit data for stationary sources of air pollution regulated by the EPA and state/local/tribal
agencies. Transfers to other data systems such as the NEI, TRI and Title V reporting may also be
desirable.

CEDRI job aids can be found at: https://www.epa.gov/electronic-reporting-air-
emissions/cedri. Files submitted through the CDX/CEDRI are stored in the CDX CROMERR
archive and a copy of the file is retained in the WebFIRE database.

To submit files through the CEDRI application, you should accept the certification
conditions that the documents and attachments were prepared under your direction or
supervision and that, to the best of your knowledge, the information is true, accurate and
complete. After accepting the certification conditions, you will be prompted to re-validate your
username and password, answer the validation question and officially sign the submission.
Shortly after submission, you will receive email notification stating whether the files were
successfully or unsuccessfully submitted. Submissions can fail for a variety of reasons, including
presence of an invalid file (e.g., improper file extension), an incomplete file, or system errors. If
any system errors occur after you upload and sign the submission file, you will be prompted to
re-submit the files or contact the CDX Help Desk.



-------
Section 11.0

What is the Data Review and Public Participation Process for Emissions Factor Development?

An overview of the public participation and data review process used by the EPA when
implementing Clean Air Act section 130 for source test and/or emissions factor data is shown in
Figure 11-1. The Clean Air Act states, "The Administrator shall permit any person to demonstrate
improved emissions estimating techniques, and following approval of such techniques, the
Administrator shall authorize the use of such techniques. Any such technique may be approved
only after appropriate public participation."

Periodically, the EPA will review, compile, and analyze the data contained in WebFIRE
for the purposes of revising existing and developing new emissions factors, as appropriate. We
generally consider the following criteria to determine if emissions factor development is
warranted:

•	The amount of new source test/emissions factor data that have been received,

•	The degree of variability between the new data and the existing emissions factors in WebFIRE, and

•	EPA's programmatic needs related to new rules, policies, and other EPA tools.

If we receive a substantial amount of new information for a given process type and that
process is a significant emitter of one or more pollutants, the agency may consider review and
development of new emissions factors. If we receive only a few new data values for a process
type, it is less likely that the new data alone would initiate the extensive factor review and
development process. Another point that we consider is the difference and variability between
the existing emissions factors in WebFIRE and the newer data. If the newer data do not
significantly change the existing factor(s), the need to revise the factor would be less urgent.
Lastly, decisions to initiate factor review and development may be tied to programmatic issues
and schedules occurring within the EPA. For example, new data or the need for improved
emissions factors may be driven by new regulations that are under development or that were
recently promulgated. Also, emissions inventory requirements may be in place that call for new
emissions factors.

Figure 11-1. Overview of the WebFIRE Public Participation and Emissions Factor Development Process


When one or more of these considerations call for it, the EPA may initiate
emissions factor review and development activities. As a result of this process, the
EPA will generally draft new and/or revised emissions factors for specific processes (i.e., SCCs).
The EPA will publicly announce the availability of these draft emissions factors and invite public
review and comment via the Air Emissions Factors and Quantification website
(https://www.epa.gov/air-emissions-factors-and-quantification/documentation-supporting-
draft-and-final-emissions-factors). The public announcement would be in the form of an EPA
Listserv email notification via the InfoCHIEF Listserv (www.epa.gov/chief/chief-listserv). The
public can join the CHIEF Listserv by sending an email to join-chief@lists.epa.gov. These
notifications will describe the nature of the new emissions factors that EPA developed and their
associated source categories. Typically, the public would have a 60-day review and comment
period for the draft factors. Examples of some topics to consider when preparing comments
include, but are not limited to:

•	The validity and accuracy of the test methods applied to obtain sample
measurements,

•	The validity and accuracy of the analytical procedures used to quantify
measurements,

•	The completeness, thoroughness and transparency of the source test
documentation,

•	The correlations made between process parameters and test data conditions,

•	The accuracy of the assigned SCC and control device codes, and

•	The adequacy and accuracy of the process description for the source category and
the associated documentation.

The process for submitting comments (e.g., format and method of submittal, due dates,
submittal address) would be described in the data availability announcements. Commenters
should review all information pertinent to the correct calculation of emissions factors from the
underlying test data. The review should address how well the mass or concentration
measurement data were combined with process operating data (e.g., fuel use, material
throughput, item production, power output) to yield an emissions factor. If controls are in
place, control device operating conditions should be correctly associated with process
conditions and factored into the emissions factor development. It is particularly important that
reviewers confirm the process and source category associations made for the data. New or
revised process flow diagrams and/or schematics should be submitted if an industry has
undergone significant changes since the last revision. These process associations should be
made using SCCs, recognizing that, in some cases, new SCCs may be required.

At the conclusion of the public comment period, the EPA evaluates the comments
received and makes any appropriate modifications to the data in WebFIRE. If commenters
provided new emissions test data for use in emissions factor development, we would consider
combining the newer data with the existing data for a given source type or category. When
determining valid combinations of existing and new data, we use statistical analyses that are
based upon the Student's t-test (see Appendix E). If the comments identify issues or raise
questions that the EPA cannot address, the original submitter may be contacted for
reconciliation. After all comments are addressed and the EPA is satisfied with the quality of the
emissions factor data, we will make the final emissions factor available to the public in WebFIRE
(https://cfpub.epa.gov/webfire/). The previous emissions factor, if any, would be flagged as
"revoked."



-------
Appendix A

Procedures for Determining Individual Test Report Quality Ratings


-------

1.0 Introduction

Historically, the EPA's quality ratings of emissions test data and test reports were largely
subjective because each test program presented different issues (i.e., no two facilities, their
operation or the tests conducted at those facilities are exactly alike). Typically, the EPA
developed letter-grade quality ratings (A through E) for test reports based upon the agency's
review of the following criteria areas:

•	Process operation,

•	Test method and sampling procedures,

•	Process information, and

•	Analysis and calculations.

To reduce the subjectivity of quality reviews, the individual test rating (ITR) assigned by
the Electronic Reporting Tool (ERT) is based upon process, control device, and emissions testing
documentation provided by the source and responses to questions that assess the quality of
the process, control device, and emissions data collected during a source test. The
methodology used by the ERT for assessing the quality of emissions test data follows the same
basic principles as the EPA's historic methodology. However, the ERT procedure provides a
consistent objective framework for test contractors to follow when compiling test reports, and
for regulatory agency reviewers to follow when assessing data quality.

The test report quality rating methodology consists of three components: (1) the
assignment of points by the ERT based upon the source's entry of information into specific data
areas and attachments, (2) an adjustment of the points assigned by the ERT based upon a
regulatory agency review, and (3) the normalization of the points for a maximum ITR of 100
such that the ERT-assigned score is 80 percent of the total and the remainder is based upon the
regulatory agency review.

Table A-1 shows the types of information and documentation used by the ERT to assign
points and the questions that are used to evaluate the quality of data submitted to the ERT. The
information requested in the table is indicative of a complete and well-documented test report.
The ERT assigns points based on the assumption that the information and documentation
provided by the source is true, accurate, and complete. The adjustment to the points assigned
by the ERT may result in a modest increase in the points when the regulatory agency review
verifies that the information contained in the documentation provides an acceptable level of
quality. The adjustment to the points assigned by the ERT may result in a decrease when the
regulatory agency review reveals incorrect measurement procedures, unrepresentative process
operation, or other inaccurate information.

Supplementary points are assigned by the ERT when documentation is provided
showing certification or accreditation of those individuals or organizations involved with the
testing program. It is important to note that well-performed and documented test reports will
receive a sufficiently high rating to justify their use in developing emissions factors without any
supplementary points. Neither a state review nor participation by accredited organizations or
certified individuals is required. However, these added components can improve the ITR of the
test report.

It is also important to note that while a significant level of subjectivity has been
removed from the quality assessment of source tests for emissions factors development, the
points awarded are not a direct indicator of the precision, accuracy, and usability of the data for
other purposes. For simplicity, the point assignment employs a "Yes/No" criterion rather than a
graded assessment.

Some of the components may not directly affect the precision, accuracy, or usefulness
of the final result, but would bolster the confidence in the result. For example, reagent blanks
and calibrations conducted prior to a test verify that the reagents and equipment comply with
method requirements for the first test and increase the probability that the blanks and
calibrations conducted after the test will comply with the method requirements. Also, failing to
satisfy some components does not render the results completely unusable. For example, a test
with results below the method detection level may be adequate for demonstrating compliance
when emissions calculated at the detection limit are significantly below the applicable limit. An
experienced and knowledgeable individual can estimate the range of potential
change that a minor variation in an established test methodology has on the final result. While
a specific emissions test may not be usable for emissions factor development, its data
may be usable for other purposes when the bounds for that use are defined and assessed.

2.0 ERT Assessment

The ITR of the source test report is based upon the information and attachments
provided by the source. The ERT calculates the score based upon the completeness of the
report in the areas of process data, control device information, test method performance and
quality assurance. The information listed under "Supporting Documentation Provided" in
Table A-1 identifies the information the source or source test contractor provides and the criteria
the ERT uses in assigning points to calculate the quality indicator. The EPA assigned different
relative weightings to the supporting documentation components based upon their potential to
affect the overall precision, accuracy, representativeness, and reliability of the final results.

Only those items related to the information collected during the test are used in
calculating the initial score. Using the completeness of the data and supplemental attachments,
we normalize the score so that the ITR score is limited to 80 points when only the ERT
assessment is performed.

Table A-1 also identifies criteria that, if satisfied, can provide supplementary points
above the maximum of 76 awarded by the ERT. The ERT awards supplementary points
whenever:


1.	The source test company meets the competency requirements as an Air Emission
Testing Body (AETB) as defined by the American Society for Testing and Materials
(ASTM) standard D7036-12 or the field test leader is a current Qualified Individual
(QI) as defined by ASTM standard D7036-12.

2.	The analysis laboratory is certified or accredited to perform the analysis.

The ERT assigns an extra two points for each of the above accreditations or certifications
that are demonstrated in the test report. As a result, the ERT could assign a maximum of 80
points if a QI was the crew leader or the test company was an AETB and the laboratory was
accredited by a national independent or state accreditation program.
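
A minimal sketch of this point structure, using hypothetical point totals, is shown below. Only the 76-point normalization, the 2-point supplementary credits, and the 80-point cap before agency review come from this appendix; the function name and example numbers are illustrative.

```python
# Sketch of the ITR point structure described in this appendix: the ERT
# assessment is normalized to a 76-point maximum, up to 4 supplementary
# points are added for AETB/QI competency and laboratory accreditation,
# and the regulatory agency review (Section 3.0) can raise the total to 100.
def itr_before_review(points_earned, points_possible,
                      aetb_or_qi=False, lab_accredited=False):
    # points_possible counts only the questions applicable to the test
    # method used, so non-applicable items do not lower the score.
    base = 76 * points_earned / points_possible   # normalized ERT score
    supplementary = 2 * aetb_or_qi + 2 * lab_accredited
    return base + supplementary                   # at most 80 before agency review

# Example: a report earning 90 percent of the applicable documentation
# points, tested by an AETB with an accredited laboratory.
print(round(itr_before_review(540, 600, aetb_or_qi=True, lab_accredited=True), 1))  # 72.4
```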

Some of the information requested in Table A-1 is specific to certain test methods. For
example, the isokinetic sampling requirements (listed under "Raw sampling data and test
sheets") are only applicable when the test method collects pollutants that are in a particulate
form. In cases like this, the ERT would not include the points associated with these items in
either the points assigned or the maximum potential points used to normalize the ITR score. As
a result, the ERT will not give the test report a lower rating if the test method used does not
require isokinetic sampling. Instead, quality ratings depend upon the testing requirements. For
example, if an instrumental test method is used, the ERT will use only those questions that
pertain to the method to evaluate the quality of the test. Because the ERT normalizes the
overall score based upon the maximum score that can be assigned for any given method, the
fact that some questions that do not apply to the particular test method are not scored does
not reduce the overall maximum score possible for one test method relative to another
method.

3.0 Regulatory Agency Review

The quality of an emissions factor is only as good as the source data upon which it is
based. In the majority of cases, the test report, which is typically prepared by the testing
contractor, is the only documentation available for assessing the potential reliability (e.g.,
precision, accuracy, representativeness) of the emissions data for emissions factor
development as described in Appendix D. In all cases, the quality of the underlying source data
can be more thoroughly assessed when the test report is independently reviewed by a
regulatory agency.

The maximum quality rating for a test report that is not reviewed by a regulatory agency
is 80 points (76 points assigned for the base ERT review and 4 additional points assigned if
testing or analyses were conducted by certified or accredited individuals and organizations).
The regulatory agency review can raise the initial ITR score to a maximum of 100 points.
However, a negative evaluation by a regulatory reviewer can significantly reduce the initial
score.


Under the ERT quality rating procedure, the regulatory agency reviewer evaluates the
responses to certain questions (shown in Table A-1) contained in the Quality Assessment (QA)
Review section of the ERT. If the reviewer makes the assessment requested by the question and
concludes that the documentation is complete, correct, and supports the proper
performance of this item, additional points are added to the score given by the ERT. The points
added with an affirmative response are shown for each item in Table A-1. If the
reviewer determines that points were incorrectly assigned (i.e., the information contained in
the ERT file is incomplete, erroneous, or not consistent with the test method), points are
deducted from the value determined by the initial scoring. The points subtracted from the
initial score with a negative response are also shown in Table A-1. In addition, the
possibility exists that the ERT did not assign points for an item, or part of an item, because that
item was not documented in the correct location of the test report. If a positive validation of a
misplaced item is provided by the regulatory reviewer, the ERT adds the prorated documentation
points (also shown in Table A-1) that would have been assigned for the appropriate
placement of the item in the test report.
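
A minimal sketch of these review adjustments is shown below. The function and response labels are hypothetical, while the example point values are taken from Table A-1.

```python
# Sketch of the regulatory agency review adjustments: an affirmative
# response adds the item's "points added" value, a negative response
# deducts the "points subtracted" value (only if points were assigned
# in the initial scoring), and a positive validation of a misplaced
# item restores the prorated documentation points.
def apply_review(score, item, response, initially_assigned=True):
    if response == "affirmative":
        return score + item["added"]
    if response == "negative" and initially_assigned:
        return score - item["subtracted"]
    if response == "misplaced_but_valid":
        return score + item["prorated"]
    return score

# Example item from Table A-1: description and drawing of the test location.
test_location = {"prorated": 3, "added": 1, "subtracted": 3}
print(apply_review(72.4, test_location, "affirmative"))  # -> 73.4
```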

Regulatory agency reviewers may submit their review to EPA at any time, but we
anticipate the majority of the reviews will be associated with agency assessments of test
reports prepared by facilities to demonstrate compliance with applicable regulations. We
recognize that the public comment and review process that is associated with revising or
establishing an emissions factor (see Section 11) may result in additional reviews. These reviews
will be evaluated by EPA staff and any corrections may be incorporated into the existing quality
assessment of the test data, as appropriate. The results of the regulatory agency review and
accepted public reviews may be used in calculating a new or revised emissions factor.


Table A-1. Test Report Quality Rating Tool

For each supporting documentation item below, the bracketed value is the number of points the ERT assigns if the documentation is present. Each associated regulatory agency review question is followed by its prorated documentation points, the points added with an affirmative response, and the points subtracted with a negative response.(a)

•	As described in ASTM D7036-12 Standard Practice for Competence of Air Emission Testing Bodies, does the testing firm meet the criteria as an AETB or is the person in charge of the field team a QI for the type of testing conducted? A certificate from an independent organization (e.g., Stack Testing Accreditation Council (STAC), California Air Resources Board (CARB), National Environmental Laboratory Accreditation Program (NELAP)) or self-declaration provides documentation of competence as an AETB. [2]
	- Review (same question): prorated 2; +0; -2
	- Review: Was a representative of the regulatory agency on site during the test? prorated 0; +1; -0

•	Is a description and drawing of test location provided? [3]
	- Review (same question): prorated 3; +1; -3

•	Has a description of deviations from published test methods been provided, or is there a statement that deviations were not required to obtain data representative of typical facility operation? [6]
	- Review: Is there documentation that the source or the test company sought and obtained approval for deviations from the published test method prior to conducting the test, or that the tester asserted that deviations were not required to obtain data representative of operations that are typical for the facility? prorated 6; +2; -6
	- Review: Were all test method deviations acceptable? prorated 6; +0(b); -6(b)

•	Is a full description of the process and the unit being tested (including installed controls) provided? [3]
	- Review (same question): prorated 3; +1; -3

•	Has a detailed discussion of source operating conditions, air pollution control device operations and the representativeness of measurements made during the test been provided? [6]
	- Review (same question): prorated 6; +2; -6

•	Were the operating parameters for the tested process unit and associated controls described and reported? [60]
	- Review: Is there documentation that the process monitors have been calibrated and that the calibration is acceptable? prorated 12; +4; -12
	- Review: Was the process capacity documented? prorated 12; +4; -12
	- Review: Was the process operating within an appropriate range for the test program objectives? prorated 12; +4; -12
	- Review: Were process data collected concurrent with testing? prorated 12; +4; -12
	- Review: Were data included in the report for all parameters for which limits will be set? prorated 12; +4; -12

•	Is there an assessment of the validity, representativeness, achievement of data quality objectives (DQO) and usability of the data? [9]
	- Review: Did the report include descriptions of the representativeness of the facility operations, control device operation, and the measurements of the target pollutants, and were any changes from published test methods or process and control device monitoring protocols identified? prorated 9; +3; -9

•	Have field notes addressing issues that may influence data quality been provided? [0]
	- Review: Were all sampling issues handled such that data quality was not adversely affected? prorated 0; +0; -111

Manual Test Method Questions

•	Have the following been included in the report: dry gas meter (DGM) calibrations, pitot tube and nozzle inspections? [54]
	- Review: Was the DGM pre-test calibration within the criteria specified by the test method? prorated 9; +3; -9
	- Review: Was the DGM post-test calibration within the criteria specified by the test method? prorated 9; +3; -9
	- Review: Were thermocouple calibrations within method criteria? prorated 9; +3; -9
	- Review: Was the pitot tube inspection acceptable? prorated 9; +3; -9
	- Review: Were nozzle inspections acceptable? prorated 9; +3; -9
	- Review: Were flow meter calibrations acceptable? prorated 9; +3; -9

•	Was the Method 1 sample point evaluation included in the report? [12]
	- Review: Were the appropriate number and location of sampling points used? prorated 12; +4; -12

•	Were the cyclonic flow checks included in the report? [12]
	- Review: Did the cyclonic flow evaluation show the presence of an acceptable average gas flow angle? prorated 12; +4; -12

•	Were the raw sampling data and test sheets included in the report? [126]
	- Review: Were all data required by the method recorded? prorated 12; +4; -12
	- Review: Were the required leak checks performed and did they meet method requirements? prorated 30; +10; -180
	- Review: Was the required minimum sample volume collected? prorated 18; +6; -18
	- Review: Did probe, filter and impinger exit temperatures meet method criteria (as applicable)? prorated 24; +8; -24
	- Review: Did isokinetic sampling rates meet method criteria? prorated 24; +8(b); -120(b)
	- Review: Was the sampling time at each point greater than 2 minutes and the same for each point? prorated 18; +6; -18

•	Did the report include a description and flow diagram of the recovery procedures? [30]
	- Review: Was the recovery process consistent with the method? prorated 6; +2; -6
	- Review: Were all blanks collected in the field? prorated 6; +2(b); -6(b)
	- Review: Where performed, were blank corrections handled per method requirements? prorated 9; +3(b); -9(b)
	- Review: Were sample volumes clearly marked on the jar or measured and recorded? prorated 9; +3; -9

•	Was the laboratory certified/accredited to perform these analyses? [2]
	- Review (same question): prorated 2; +0; -2 (subtracted only if points were assigned in the initial ERT scoring)

•	Did the report include a complete laboratory report and flow diagram of sample analysis? [132]
	- Review: Did the laboratory note the sample volume upon receipt? prorated 9; +3; -9
	- Review: If sample loss occurred, was the compensation method used documented and approved for the method? prorated 9; +0; -120
	- Review: Were the physical characteristics of the samples (e.g., color, volume, integrity, pH, temperature) recorded and consistent with the method? prorated 9; +3; -9
	- Review: Were sample hold times within method requirements? prorated 9; +3(b); -9(b)
	- Review: Does the laboratory report document the analytical procedures and techniques? prorated 6; +2; -6
	- Review: Were all laboratory QA requirements documented? prorated 15; +5; -15
	- Review: Were analytical standards required by the method documented? prorated 12; +4; -12
	- Review: Were required laboratory duplicates within acceptable limits? prorated 12; +4; -12
	- Review: Were required spike recoveries within method requirements? prorated 12; +4; -12
	- Review: Were method-specified analytical blanks analyzed? prorated 12; +4; -12
	- Review: If problems occurred during analysis, is there sufficient documentation to conclude that the problems did not adversely affect the sample results? prorated 15; +0; -15
	- Review: Was the analytical detection limit specified in the test report? prorated 6; +2; -6
	- Review: Is the reported detection limit adequate for the purposes of the test program? prorated 6; +2(b); -6(b)

•	Were the chain-of-custody forms included in the report? [12]
	- Review: Do the chain-of-custody forms indicate acceptable management of collected samples between collection and analysis? prorated 12; +4; -12

Instrumental Methods Questions

•	Did the report include a complete description of the instrumental method sampling system? [3]
	- Review: Was a complete description of the sampling system provided? prorated 3; +1; -3

•	Did the report include calibration gas certifications? [27]
	- Review: Were calibration standards used prior to the end of the expiration date? prorated 12; +4; -12
	- Review: Did calibration standards meet method criteria? prorated 15; +5; -15

•	Did the report include interference tests? [9]
	- Review: Did interference checks meet method requirements? prorated 9; +3(b); -9(b)

•	Were the response time tests included in the report? [12]
	- Review: Was a response time test performed? prorated 12; +4; -12

•	Were the calibration error tests included in the report? [12]
	- Review: Did calibration error tests meet method requirements? prorated 12; +4; -12

•	Did the report include drift tests? [9]
	- Review: Were drift tests performed after each run and did they meet method requirements? prorated 9; +3; -9

•	Did the report include system bias tests? [24]
	- Review: Did system bias check results meet method requirements? prorated 24; +8; -120

•	Were the converter efficiency tests included in the report? [12]
	- Review: Was the NOx converter test acceptable? prorated 12; +4(b); -12(b)

•	Did the report include stratification checks? [15]
	- Review: Was a stratification assessment performed? prorated 15; +5; -15

•	Did the report include the raw data for the instrumental method? [54]
	- Review: Was the duration of each sample run within method criteria? prorated 9; +3; -9
	- Review: Was an appropriate traverse performed during sample collection, or was the probe placed at an appropriate center point (if allowed by the method)? prorated 12; +4; -12
	- Review: Were sample times at each point uniform and did they meet the method requirements? prorated 9; +3; -9
	- Review: Were sample lines heated sufficiently to prevent potential adverse data quality issues? prorated 12; +4; -12
	- Review: Were all data required by the method recorded? prorated 12; +4; -12

(a) Points are subtracted with a negative response only if points were assigned for the item by the initial scoring.
(b) These points are added for an affirmative response or subtracted for a negative response if the item is applicable to the test method used. If the item is not applicable, points are neither added nor subtracted.

4.0 Rationale for Evaluation Criteria

The rationale for including the specific information considered in calculating the ITR is
provided below.

1.	Completeness Review - The documentation specified under "Supporting
Documentation Provided" is used to assess certain aspects of the test program
impacting the quality (e.g., accuracy, precision, reliability, representativeness,
consistency with published methods, etc.) of the test data. A complete test report
should include: information on the location and contacts for the facility, information
on the contacts for the test team, information describing the tested process
including process and control device operations relevant for characterizing
emissions, information describing the characteristics of the test location(s), a
schematic or drawing of the test location(s), description of the published test
method(s) used, descriptions of the changes that were necessary to conduct the
test, identification of any relevant applicable requirements for which the test will be
used, and the identification of any audit and data quality indicators used for
verifying the reliability of the test method(s) performed. Documentation of the
conduct of the test methods, deviations from required test methods and laboratory
reports describing the analysis of the test samples are valuable as indicators of the
precision and accuracy of emissions data. The conditions during the time of sampling
and the operating parameters for the process and any air pollution controls are
indicative of the reliability and representativeness of the emissions measured during
the test period. If the various pieces of information listed here are not provided,
conformance to the test method cannot be determined and the precision and
accuracy of the data cannot be verified.

2.	Calibration Reports - Calibration reports provide documentation that equipment has
been inspected, properly maintained and is operating correctly during testing. If
calibration data are not present, or if the calibration data have expired, the results of
testing cannot be considered accurate. Calibration errors will lead to inaccurate
measurements and therefore inaccurate emissions rates.

•	Manual Test Methods - Equipment used to measure flow rate and temperature
should be properly inspected and calibrated to ensure accurate results. Flow rate
and temperature are important factors in source testing and have a direct
impact on the calculation of emissions rates. Faulty or mis-calibrated equipment
can lead to inaccurate readings and inaccurate results.

•	Instrumental Test Methods - Similar to the manual methods, this information is
used to determine if analyzers are operating correctly for each test. This data
includes pre-test calibration checks, bias determinations for each test run, and
equipment operational checks. If the information in this section is missing, the
data contained in the test report cannot be considered accurate.


3.	Raw Data Reports

•	Manual Test Methods - The documentation in this section of the raw data report
verifies the information reported in the test program and confirms that field QA
activities have been performed. This section provides documentation of stack
characteristics, exhaust gas conditions and sample point evaluation, all of which
are important for properly characterizing emissions. A complete laboratory
report, including recovery procedures and chain-of-custody forms, provides a
good indication of how well the samples were recovered, handled, and analyzed.

•	Instrumental Test Methods - With the exception of raw data, this information is
required by the reference methods and is used to verify that operating limits for
instrumentation are within acceptable ranges. Stratification checks are now
required by the EPA reference methods in some instances and this
documentation verifies that sampling procedures were appropriate for the
exhaust conditions at the time of the test.

•	Process and Facility Operation - Process and operating data are key components
in demonstrating that the facility is operating within normal conditions and that
the data collected are representative of normal operation. This information also
allows for the calculation of production-based emissions factors. Documentation
of control devices and their monitoring parameters verifies that devices are
working properly, provides information that can later be used as indicators of
continued performance and assures that testing was conducted under typical
control conditions.

4.	QA Review - The evaluation criteria listed below are based upon the QA
requirements of the EPA's reference methods, New Source Performance Standards
(NSPS) and National Emission Standards for Hazardous Air Pollutants (NESHAP).

•	Manual Test Method QA - Calibration criteria evaluated in this review are
specified in the reference methods and address field measurement equipment
calibrations and inspections. These criteria establish the minimum operating
limits for measurement equipment that provide confidence in the accuracy and
precision of the test results. This information addresses the critical elements of
the test equipment that have a direct impact on measurement and subsequent
calculation of sample volumes, effluent flow rates and pollutant concentrations.

•	Laboratory QA - Laboratory information evaluated in this review is directly
related to the accuracy of the laboratory analysis of pollutant samples collected
in the field. Listed items have a direct impact on the analysis of the samples and
the reliability of the test data. For example, sample integrity during transport is
assessed by comparing sample volumes to the values recorded prior to shipping,
which may indicate potential loss of sample media. Another example is analytical
detection limits, which should be sensitive enough to measure the pollutant of
interest at concentrations appropriate for the test plan.


•	Instrumental Test Method QA - The QA checks for instrumental test methods
are specified in the reference methods. These checks are designed to
demonstrate that the sampling system and analyzers are:

i.	Capable of meeting minimum acceptance criteria for acquiring a
representative effluent sample, and

ii.	Operating in a stable environment.

This information verifies that the analytical accuracy and precision of the measurement
results are acceptable for regulatory programs.

•	Process Data QA - The evaluation criteria listed in this review are based upon the
instrumental test method evaluations for data accuracy and representativeness.
Process disruptions may have a negative impact on the accuracy of the data.
Calibration information establishes the reliability and accuracy of the values used
to calculate emissions rates.

•	Other QA Indicators - Among other factors that should increase the assurance of
high-quality data from a source emissions test is the participation of qualified
individuals during the field testing. A QI (e.g., someone recognized by the Source
Evaluation Society (SES) or meeting the criteria outlined in ASTM standard
D7036-12) is someone who has demonstrated a high level of knowledge and
ability consistent with an experienced field test team leader responsible for
emissions test planning, preparation, conduct, and reporting. Another factor is
the presence of a qualified observer during the field emissions testing. Such an
observer may be an independent technical expert or a representative of the
state, local or federal agency familiar with source emissions testing and who was
on site to monitor progress during the test.



-------
Appendix B

Procedures for Handling Test Data That are Below the Method Detection Limits


-------

1.0 Introduction

In some cases, the result of a process emissions test is not an emissions rate, but a
determination that the target pollutant was not present at or above the method detection
limit (MDL) of the test method. The EPA defines the MDL as the minimum concentration of a
substance that can be measured and reported with a given level of confidence that the analyte
concentration is greater than zero. The MDL is determined from an analysis of a sample in a
given matrix containing the analyte. For purposes of emissions factor development, that level of
confidence is 99 percent. Stated another way, the MDL is the smallest amount of a substance
that an analytical method can reliably distinguish, at a specified confidence level, from the
signal produced by a blank sample.
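For readers who want to reproduce this calculation, the EPA's standard MDL formulation
(e.g., 40 CFR part 136, Appendix B) multiplies the standard deviation of replicate low-level
analyses by the one-sided Student's t value at the 99-percent confidence level. The following
Python sketch illustrates that arithmetic; the function name and replicate values are ours,
not part of any EPA software.

    from statistics import stdev
    from scipy.stats import t

    def mdl_from_replicates(replicates, confidence=0.99):
        """Estimate an MDL from n replicate analyses of a low-level
        (spiked blank) sample: MDL = t(n-1, 0.99) * s."""
        n = len(replicates)
        s = stdev(replicates)                 # sample standard deviation
        t_crit = t.ppf(confidence, df=n - 1)  # one-sided 99% Student's t
        return t_crit * s

    # Seven replicate analyses of a spiked blank (hypothetical values):
    print(mdl_from_replicates([0.51, 0.46, 0.55, 0.49, 0.52, 0.47, 0.50]))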

It is important to understand that the MDL is a statistical parameter and not a chemical
one. For EPA test methods (e.g., Method 5 - Particulate Matter) where a single analytical
technique is specified, the MDL will be the same for all source tests. However, the MDL can vary
from substance to substance and from measurement process to measurement process in cases
where the test method (e.g., Method 29 - Metals Emissions from Stationary Sources) allows for
alternative analytical techniques. In these cases, variability is introduced into MDLs by the
analysts conducting the measurements, the equipment and chemicals used in the
measurements and the Quality Assurance/Quality Control (QA/QC) procedures used. A
separate MDL should be generated for each test program. After MDLs have been developed,
the results of the testing can be compared. Results that are less than the MDL are referred to as
below the MDL (BDL). Test run values where some measurement data from the test method
are BDL are referred to as detection level limited (DLL).

2.0 Description of Procedures

We have developed specific procedures for handling above-detection-limit (ADL), BDL, and
DLL data at the test run level in ERT submissions when calculating test averages, and for
addressing cases where some or all of the data included in the candidate data set selected for
use in developing emissions factors are ADL, BDL, and/or DLL. Note that we apply the
procedures for determining the test average in this appendix prior to conducting the data
outlier tests described in Appendix C so that appropriate values are used in the outlier analyses.

It is not unusual for environmental data to contain some BDL values. Because such
values are expected, data users have developed calculation techniques to account for BDL
values that exist but are difficult to quantify with the accuracy typically associated with
ADL values. Generally, these calculation techniques recognize that small and large sample sizes
do not warrant rigorous mathematical approaches to provide a numerical value that replaces a
value found to be BDL. Medium sample sizes, on the other hand, warrant mathematical
approaches that provide numerical values associated with a maximum likelihood estimator
(MLE), a value found via calculation to be between ½ the MDL and the MDL.

These approaches generally work well for programs managed by other agency offices
tasked with establishing regulatory emissions limits and determining compliance for specific
individual facilities in narrowly-defined source categories. However, such rigor is overly
complicated for the WebFIRE emissions factor development program because emissions factors
are, by design, representative of generic facilities in broadly-defined source categories. As a
result, the procedures adopted for handling ADL, BDL, and DLL data in the derivation of
emissions factors are more straightforward and are based upon two general principles. First,
because emissions test values generally represent the average of three test runs, a data set
containing more than 10 test values is based upon more than 30 individual test runs. Such a
data set is important because, according to the central limit theorem, once 30 or more
individual samples (i.e., test runs) are obtained, the distribution of the sample mean
approaches a normal distribution whose statistical characteristics are readily obtained.
Second, the use of ADL and DLL data is preferred over the use of BDL data in cases where
adequate amounts of ADL data are available. This preference generally reduces the uncertainty
associated with emissions factors derived, in part, from data that are BDL.

In understanding the recommended procedures for handling ADL, BDL, and DLL data,
note that a test run refers to the net period of time during which an emissions sample is
collected, as well as to the amount of pollutant emitted during that time period. Likewise, a test
refers to the net period of time over which separate runs, typically three, are conducted, as
well as to the average amount of pollutant emitted over the test period.

In most cases, the emissions test data contained in the ERT are used by sources to
demonstrate compliance with regulatory limits. Although we acknowledge that analytical
laboratories and state regulatory agencies use varying approaches in addressing ADL, BDL, and
DLL data for compliance assessments, the EPA's preferred approach is to report the BDL data as
"real" values and to flag the data appropriately when calculating test averages.

Table B-1 summarizes our procedures for calculating an average emissions value and
assigning a detection limit flag from ERT test run-level data. In calculating an emissions factor
value from a candidate data set where some of the test averages are flagged as BDL, we do not
include BDL values that are greater than the highest ADL or DLL value in the candidate data set.
We also do not calculate emissions factors in cases where all of the candidate data are BDL.

Table B-1. Summary of WebFIRE Procedures for Handling ADL, BDL, and DLL Test Data in
Calculating a Test Average from Run-Level Data

  Types of Test Run Data                    Basis for Calculating Average Value
  All test runs are ADL                     WebFIRE calculates the average of the test
                                            run values and assigns a flag of ADL to the
                                            calculated average.
  All test runs are DLL or a mix of         WebFIRE calculates the average of the test
  ADL and DLL                               run values and assigns a flag of DLL to the
                                            calculated average.
  All test runs are BDL                     WebFIRE calculates the average using the
                                            BDL/2 values and assigns a flag of BDL to
                                            the calculated average.

B-2


-------
Appendix B

Procedures for Handling Test Data That are Below the Method Detection Limits

Table B-2. Summary of WebFIRE Procedures for Handling ADL, BDL and DLL Test Data in
Calculating a Test Average

  Types of Test Run Data                    Basis for Calculating Average Value
  All test runs are a mix of ADL, DLL,      WebFIRE calculates the average using the ADL
  and/or BDL values                         and DLL values and ½ the BDL values, provided
                                            that ½ the BDL is equal to or less than the
                                            highest ADL or DLL value, and assigns a flag
                                            of DLL to the calculated average. When ½ the
                                            BDL is greater than the highest ADL or DLL
                                            value, that BDL value is excluded from the
                                            average calculation.
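The rules in Tables B-1 and B-2 translate directly into a small amount of logic. The sketch
below is our own paraphrase of the published rules in Python, not WebFIRE source code; run
flags are assumed to be supplied as the strings "ADL", "DLL", and "BDL", with BDL runs
reported at their detection limits.

    def average_runs(runs):
        """Test-average and detection-limit flag from run-level data, per
        Tables B-1 and B-2 (our paraphrase, not WebFIRE source code).
        `runs` is a list of (value, flag) pairs; flag is "ADL", "DLL", or
        "BDL", with BDL runs reported at their detection limits."""
        flags = {flag for _, flag in runs}
        if flags == {"BDL"}:
            # All runs BDL: average one-half of each detection limit.
            vals, avg_flag = [v / 2 for v, _ in runs], "BDL"
        elif "BDL" not in flags:
            # All ADL, or DLL mixed with ADL: average the values as reported.
            vals = [v for v, _ in runs]
            avg_flag = "ADL" if flags == {"ADL"} else "DLL"
        else:
            # Mix that includes BDL runs (Table B-2): keep 1/2 of a BDL value
            # only when it does not exceed the highest detected value.
            detected = [v for v, f in runs if f != "BDL"]
            cap = max(detected)
            vals = detected + [v / 2 for v, f in runs if f == "BDL" and v / 2 <= cap]
            avg_flag = "DLL"
        return sum(vals) / len(vals), avg_flag

For example, averaging two detected runs (0.42 ADL, 0.39 DLL) with a BDL run reported at
0.50 keeps the BDL run at half its detection limit (0.25, which does not exceed the highest
detected value) and flags the resulting average of 0.353 as DLL.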

B-4


-------
Appendix C

Procedures For Determining
Statistical Outliers


-------
Appendix C

Procedures for Determining Statistical Outliers

1.0 Introduction

After a candidate data set containing more than three test values has been selected for
emissions factor development and the BDL analysis has been performed (see Appendix B),
WebFIRE conducts a set of tests (i.e., the Dixon Q test or the Rosner test) to identify values
in the candidate data set that are statistical outliers (i.e., values that do not conform to the
statistical pattern established by the other values under consideration). These tests are
incorporated into the EPA's WebFIRE (see Section 6.2) and are based on algorithms in ProUCL,
an EPA-developed statistical package available to the public free of charge.2 We do not
endorse ProUCL or any other statistical package, nor do we limit our ability to use ProUCL or
any other statistical package, as other statistical packages are capable of performing the
requisite outlier analysis. Emissions data are usually log-normally distributed; therefore, for
the purposes of evaluating outliers for emissions factor development, we assume that all
emissions test data values in the candidate data set follow log-normal distributions. Thus, we
log-transform every test value in the candidate data set prior to conducting the outlier tests.

2.0 Description of Procedures

In WebFIRE, the outlier test is applied to the log-transformed values in the candidate
data set in an iterative process. Each run of the outlier test identifies whether a low or high
value is an outlier, and the test is applied until all outliers have been identified and removed
from the candidate data set. However, the data values removed from the candidate data set
are not removed from the WebFIRE database because the outlier designation is relative to the
population of values selected for the candidate data set (i.e., an outlier in one data set may be
an acceptable value in a different data set, especially when differing data sets are being
compared using a t-test).

The general approach used for determining outliers is shown in Figure C-1. If the
candidate data set contains fewer than three test values, WebFIRE does not perform a
statistical outlier test because statistical analyses cannot determine outliers from such a small
sample size; moreover, with just two values it is impossible to tell which one might be the
outlier. If there are three to 24 test values in the candidate data set, WebFIRE applies the Dixon
test to determine outliers. If there are 25 or more test values, the Rosner test is used to
identify outliers. Consistent with ProUCL, all outlier tests in WebFIRE are performed at the
95-percent confidence level using a one-tailed statistical test, meaning that we are willing to
accept a 5-percent risk of rejecting a valid observation.

2 ProUCL is described and can be downloaded from the following Internet address:
www.epa.gov/land-research/proucl-software.
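As an illustration of the iterative flow shown in Figure C-1, the sketch below applies the
basic Dixon ratio to log-transformed values for small data sets. It is deliberately simplified:
the critical values shown are the classic one-sided 95-percent values for the basic r10 ratio
(n = 3 to 10) only, whereas WebFIRE and ProUCL use the extended Dixon ratios through n = 24
and the Rosner test for 25 or more values.

    import math

    # One-sided 95-percent critical values for Dixon's basic r10 ratio
    # (n = 3 to 10, classic Dean & Dixon table; illustrative only).
    Q_CRIT = {3: 0.941, 4: 0.765, 5: 0.642, 6: 0.560, 7: 0.507,
              8: 0.468, 9: 0.437, 10: 0.412}

    def dixon_outlier(values):
        """Return the suspect value if it fails Dixon's Q test, else None.
        Both ends of the sorted, log-transformed data are examined."""
        x = sorted(math.log(v) for v in values)
        if len(x) not in Q_CRIT or x[-1] == x[0]:
            return None
        spread = x[-1] - x[0]
        q_low, q_high = (x[1] - x[0]) / spread, (x[-1] - x[-2]) / spread
        q, idx = max((q_low, 0), (q_high, -1))
        return math.exp(x[idx]) if q > Q_CRIT[len(x)] else None

    def screen_outliers(values):
        """Iterate until no outlier remains, then average the retained
        (untransformed) test values, as described in this appendix."""
        kept = list(values)
        while (bad := dixon_outlier(kept)) is not None:
            kept.remove(min(kept, key=lambda v: abs(v - bad)))
        return sum(kept) / len(kept), kept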

C-1


-------
Appendix C

Procedures for Determining Statistical Outliers

Figure C-1. Procedures to Identify Data Outliers in a Candidate Data Set

C-2


-------
Appendix C

Procedures for Determining Statistical Outliers

If an outlier is detected, WebFIRE flags it in the data set and determines the number of
valid test data values remaining in the candidate data set. The appropriate test (Rosner or
Dixon, as determined by the number of remaining values) is then performed again, and outliers
are removed, until the candidate data set contains no outliers. At that point, WebFIRE
calculates the average of the remaining test values (not the log-transformed values) and uses
that average as the emissions factor value.

C-3


-------
Appendix D

Emissions Factor Development and Data Quality
Characterization Procedures


-------
Appendix D

Emissions Factor Development and Data Quality Characterization Procedures

1.0 Introduction

The procedures used in WebFIRE to determine which individual test data values (i.e.,
average values derived from multiple test runs) to use in deriving an emissions factor are based
upon two premises: (1) higher-quality data are preferred over lower-quality data, and (2) more
test data values are preferred over fewer test data values. These concepts are combined with
simple statistical procedures to derive the approach used by WebFIRE in assigning a quality
rating to the derived emissions factor. This quality rating indicates how well the derived factor
represents the average of the emissions from a particular source category. These procedures
are described in detail in the following sections.

2.0 Terms and Definitions

As a prelude to presenting these procedures, it is important to explain and define the
parameters used for the emissions factor calculations and data quality characterizations:

1.	Individual Test Rating (ITR) - The ITR value is the quality indicator assigned to individual
source test reports by the ERT. This value is based upon the level of documentation
available in the test report, the use of and conformance with established EPA
reference test methods (or other test methods with comparable precision and accuracy),
and the operation of the source and associated emissions controls at known and
representative conditions. The ITR ranges from a high of 100 to a low of 0. The ERT
procedures for calculating the ITR are presented in Appendix A.

2.	Composite Test Rating (CTR) - The CTR is a weighted-average quality indicator for
groups of test reports. An inverse square weighting of the ITR values for the test reports
is used in calculating the CTR. As with the ITR, the CTR ranges from a high of 100 to a
low of 0.

3.	Factor Quality Index (FQI) - The FQI is a numerical indicator representing the derived
emissions factor's ability to estimate emissions for the entire national population. The
FQI is dependent upon both the CTR and the number of test values used to develop the
emissions factor. The FQI is analogous to the standard error of the mean (σM) in
statistical calculations. In statistical calculations, σM provides an indication of the
confidence associated with an estimate of the mean of a population when a given
number of samples are obtained from the population. The σM is calculated from the
standard deviation of the samples (or another estimate of the population's variability)
divided by the square root of the number of samples. In the FQI, the parameter 100/CTR
simulates the function of the standard deviation in that measurements with great
variability (due to variations between sources in the population, variations within
individual sources, the precision and accuracy of the methods used for measurement, and
other factors affecting variations in the measured values) are larger in value than
measurements with less variability. The minimum FQI value is associated with
emissions tests that are judged to have the greatest precision and accuracy for sources
operating at representative conditions; this is the appropriate data set selection for use
in emissions factor derivation because, just as σM decreases as the number of
samples used to estimate the population mean increases, increases in test quality and in
the number of tests reduce the value of the FQI in proportion to the estimated reliability
of the estimate of the mean. In addition, like σM, equal values of FQI provide comparable
reliability in the estimate of the population mean irrespective of differences in the CTR
and the number of samples used (i.e., test values) for estimating the population mean.

4.	Emissions factor quality indicator - There are three quality indicators used to
characterize the calculated emissions factor:

•	Highly representative is assigned to emissions factors having the lowest FQI
rating.

•	Moderately representative is assigned to emissions factors having an
intermediate FQI rating.

•	Minimally representative is assigned to emissions factors having the highest FQI
rating.

5.	Boundary criteria - Boundary criteria refer to the specific conditions that determine
which quality rating (i.e., minimally representative, moderately representative, or highly
representative) is assigned to an emissions factor. Based upon our experience with
developing emissions factors, we determined that, for source categories containing
more than 15 sources, an emissions factor derived from three tests with a CTR of 100
(FQI = 0.5774) qualifies for a moderately-representative rating. Likewise, an emissions
factor derived from more than 11 tests with a CTR of 100 (FQI = 0.3015) qualifies for a
highly-representative rating. These criteria are designed to allow for the development of
highly-representative emissions factors without the burden of conducting an inordinate
number of emissions tests. For source categories containing 15 or fewer sources, it is
appropriate to allow fewer tests to attain a specific quality rating: an emissions factor
developed from more than one test with a CTR of 100 (FQI = 1.000) qualifies for a
moderately-representative rating, and more than three tests with a CTR of 100 (FQI =
0.5774) qualifies the emissions factor for a highly-representative rating. For both source
category population sizes, degradation of the CTR warrants an increase in the number of
tests to compensate for the decrease in the average test quality to achieve the same
FQI. Table D-1 provides the boundary line equations for the two population sizes, and
Figures D-1 and D-2 provide the graphical relationships between the CTRs and the
number of tests appropriate for the boundary conditions.

Table D-1. FQI and Boundary Line Equations

  If the source          Then use these boundary line equations ...
  category               Minimally to moderately       Moderately to highly
  contains...            representative                representative
  More than 15           FQI = 0.5774                  FQI = 0.3015
  sources                N = 30,000 × CTR^-2           N = 110,000 × CTR^-2
  15 or fewer            FQI = 1                       FQI = 0.5774
  sources                N = 10,000 × CTR^-2           N = 30,000 × CTR^-2
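These boundary equations and FQI values are mutually consistent, as a quick spot-check
shows (our own arithmetic, using the FQI equation presented in Section 3.0 below):

    import math

    # Spot-check of Table D-1 for CTR = 100 (source categories with more
    # than 15 sources): N = 30,000 / 100**2 = 3 tests lie on the minimally-
    # to-moderately boundary, where FQI = 100 / (CTR * sqrt(N)) = 0.5774.
    ctr = 100
    n = 30_000 / ctr ** 2
    print(n, round(100 / (ctr * math.sqrt(n)), 4))   # -> 3.0 0.5774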

D-3


-------
Appendix D

Emissions Factor Development and Data Quality Characterization Procedures

Figure D-1. Emissions Factor Representativeness Areas for Source Categories Containing More
Than 15 Sources

(Figure: the number of tests is plotted against the Composite Test Rating, %, with the plot area
divided into Minimally Representative, Moderately Representative, and Highly Representative
regions.)

D-4


-------
Appendix D

Emissions Factor Development and Data Quality Characterization Procedures

Figure D-2. Emissions Factor Representativeness Areas for Source Categories Containing 15 or
Fewer Sources

(Figure: the number of tests is plotted against the Composite Test Rating, %, divided into the
same three representativeness regions.)

3.0 Procedures

The following steps summarize the specific calculation and data quality characterization
procedures used in WebFIRE to calculate a new emissions factor, or revise an existing one,
from a candidate data set that has been subjected to the WebFIRE BDL and outlier analyses
(see Appendices B and C, respectively). The same steps are performed when deriving a
user-defined emissions factor.

•	Step 1 - WebFIRE arranges the individual test data values being considered in
descending order by: (1) the ITR and (2) the test data value.

•	Step 2 - Beginning with the second individual test data value and continuing
sequentially in order, WebFIRE calculates the CTR using the following equation:

D-5


-------
Appendix D

Emissions Factor Development and Data Quality Characterization Procedures

CTR = [ (1/N) × Σ(i = 1 to N) (1/ITRi)² ]^-0.5

Where:

CTR = Composite Test Rating,

ITRi = Individual Test Rating (assigned by ERT) for the ith test in the grouping, and

N = Number of tests in the grouping (i.e., the tests with ITRs equal to or greater in
value than those of the remaining tests in the candidate data set).

It should be noted that a CTR is calculated for each combination of individual test values
in the data set potentially used to derive an emissions factor. For example, using a data
set consisting of 10 test values, WebFIRE would calculate 9 CTRs, beginning with the first
two data points, then the first three data points, and so forth until a CTR is calculated
for all 10 data values.

•	Step 3 - For each calculated CTR, WebFIRE calculates the FQI using the following
equation:

FQI = 100 / (CTR × N^0.5)

Where:

FQI = Factor Quality Index,

CTR = Composite Test Rating associated with the data set selected for deriving
the emissions factor, and

N = Number of tests included in the grouping.

•	Step 4 - WebFIRE compares the calculated FQI with the FQI for the previous ITR
grouping. If the FQI associated with the larger grouping (i.e., more data values) is
less than the FQI with fewer data values, then WebFIRE proceeds back to Step 2 to
perform the next sequence in the calculations. If the FQI associated with the larger
grouping is greater than the preceding FQI, then WebFIRE does not include the test
data value responsible for the increase in the FQI in calculating the emissions factor
and excludes the remaining data (with lower ITRs) from consideration.

•	Step 5 - WebFIRE calculates the emissions factor using all test data values that were
included in calculating the lowest FQI. This includes all test data values with higher
ITRs than the ITR value that resulted in an increased FQI value.

D-6


-------
Appendix D

Emissions Factor Development and Data Quality Characterization Procedures

•	Step 6 - WebFIRE determines if the SCC corresponding to the candidate data set
selected by the user contains 15 or fewer sources. Appendix F lists the SCCs that we
expect to contain 15 or fewer sources.

•	Step 7 - WebFIRE compares the FQI for the test values used to calculate the
emissions factor with the corresponding boundary criteria for assigning one of the
three emissions factor quality ratings. Different boundary criteria are used for
source categories containing 15 or fewer sources and for source categories
containing greater than 15 sources.
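Steps 1 through 7 can be condensed into a short sketch. The Python fragment below is our
illustrative paraphrase of the published procedure, not the WebFIRE implementation; it grows
the sorted data set one test at a time, stops at the first FQI increase, and assigns a rating
from the Table D-1 boundary criteria.

    import math

    def derive_factor(tests, small_category=False):
        """Sketch of Steps 1-7 (illustrative, not WebFIRE source code).
        `tests` is a list of (test_value, ITR) pairs; set small_category
        True for SCCs with 15 or fewer sources (see Appendix F)."""
        # Step 1: arrange in descending order by ITR, then by test value.
        tests = sorted(tests, key=lambda t: (t[1], t[0]), reverse=True)

        best_fqi, n_used = None, 1
        # Steps 2-4: grow the grouping one test at a time until the FQI rises.
        for n in range(1, len(tests) + 1):
            ctr = (sum((1 / itr) ** 2 for _, itr in tests[:n]) / n) ** -0.5
            fqi = 100 / (ctr * math.sqrt(n))
            if best_fqi is not None and fqi > best_fqi:
                break            # Step 4: exclude this and all lower-ITR tests
            best_fqi, n_used = fqi, n

        # Step 5: the factor is the average of the retained test values.
        factor = sum(v for v, _ in tests[:n_used]) / n_used

        # Steps 6-7: assign a rating from the Table D-1 boundary criteria.
        moderate, high = (1.0, 0.5774) if small_category else (0.5774, 0.3015)
        if best_fqi <= high:
            rating = "highly representative"
        elif best_fqi <= moderate:
            rating = "moderately representative"
        else:
            rating = "minimally representative"
        return factor, rating

Applied to the 35 test values in Table D-2 below (with small_category=False), this sketch
reproduces the published result: the first 23 tests are retained, giving an emissions factor of
0.0413 rated highly representative.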

Example 1

Table D-2 below contains an example set of 35 individual test data values selected to
develop an emissions factor for SCC 303010. The table shows the test data values, their
corresponding ITR and N values, and the calculated CTR and FQI values. The table also
indicates whether each test data value is used to calculate the emissions factor and the
representativeness rating associated with each cumulative grouping (the final emissions factor
value itself is not shown in the table).

Table D-2. Individual Test Data and Various Characteristics

  Individual                                      Use for EF   EF
  Test Value   ITR    CTR      N    FQI           Average?     Representativeness
  0.0108        98   98.00     1    1.0204        Yes          Minimally
  0.1100        98   98.00     2    0.7215        Yes          Minimally
  0.0917        92   95.87     3    0.6022        Yes          Minimally
  0.0212        92   94.86     4    0.5271        Yes          Moderately
  0.0339        91   94.05     5    0.4755        Yes          Moderately
  0.0027        91   93.52     6    0.4365        Yes          Moderately
  0.0563        89   92.83     7    0.4072        Yes          Moderately
  0.0165        89   92.32     8    0.3829        Yes          Moderately
  0.0158        88   91.81     9    0.3631        Yes          Moderately
  0.0044        88   91.41    10    0.3460        Yes          Moderately
  0.0675        88   91.08    11    0.3310        Yes          Moderately
  0.0043        88   90.81    12    0.3179        Yes          Moderately
  0.0449        74   89.10    13    0.3113        Yes          Moderately
  0.0203        73   87.58    14    0.3052        Yes          Moderately
  0.0603        70   85.97    15    0.3003        Yes          Highly
  0.0425        70   84.64    16    0.2954        Yes          Highly
  0.0130        70   83.51    17    0.2904        Yes          Highly
  0.1440        69   82.45    18    0.2859        Yes          Highly
  0.0177        68   81.45    19    0.2817        Yes          Highly
  0.0317        68   80.58    20    0.2775        Yes          Highly
  0.0052        68   79.82    21    0.2734        Yes          Highly
  0.1350        68   79.14    22    0.2694        Yes          Highly
  0.0006        60   77.90    23    0.2677        Yes          Highly
  0.0023        45   74.85    24    0.2727        No           Not applicable
  0.0724        45   72.33    25    0.2765        No           Not applicable
  0.0960        44   70.08    26    0.2799        No           Not applicable
  0.0538        40   67.54    27    0.2850        No           Not applicable
  0.0170        38   65.07    28    0.2904        No           Not applicable
  0.0132        35   62.48    29    0.2972        No           Not applicable
  0.0124        34   60.14    30    0.3036        No           Not applicable
  0.0029        30   57.41    31    0.3128        No           Not applicable
  0.0018        30   55.16    32    0.3205        No           Not applicable
  0.0083        30   53.28    33    0.3268        No           Not applicable
  0.0009        30   51.66    34    0.3319        No           Not applicable
  0.0034        30   50.27    35    0.3362        No           Not applicable

Figure D-3 shows a plot of the CTR and N data in Table D-2 and the boundaries created
by the line equations. In developing the emissions factor for the example data set, the first
23 values in Table D-2 are included in the emissions factor calculation because the FQI increases
for the first time between the 23rd and 24th pair. Using the first 23 values yields an emissions
factor of 0.0413 with a quality rating of "highly representative."

D-8


-------
Appendix D

Emissions Factor Development and Data Quality Characterization Procedures

Figure D-3. Plot of CTR and N Data from Table D-2

(Figure: the number of tests is plotted against the Composite Test Rating, %, with the three
representativeness regions shown; the 23rd value marks the last test included in the emissions
factor calculation.)

Example 2

Table D-3 contains another example set of individual test data values selected for use in
developing an emissions factor for SCC 303011, which is expected to contain 15 or fewer
sources per Table D-1.

Table D-3. Individual Test Data Values Selected for Developing an Emissions Factor for a
Source Category Containing 15 or Fewer Sources

  Individual Test Data Value    ITR
  0.0015                         45
  0.0004                         60
  0.0055                         30
  0.0019                         30
  0.0012                         30
  0.0640                         30
  0.0113                         30
  0.0088                         30
  0.0029                         88
  0.0611                         92
  0.0402                         70
  0.0299                         74
  0.0375                         89
  0.0118                         68
  0.0072                         99

Table D-4 shows the same data after the data have been sorted and the N, CTR and FQI
values have been calculated. The table also indicates whether or not the test data value should
be used to calculate an emissions factor and the representativeness of the resulting emissions
factor.

Table D-4. Individual Test Data and Various Characteristics for a Source Category
with 15 or Fewer Sources

  Individual                                      Use for EF   EF
  Test Value   ITR    CTR      N    FQI           Average?     Representativeness
  0.0072        99   99.00     1    1.0101        Yes          Minimally
  0.0611        92   95.31     2    0.7419        Yes          Moderately
  0.0375        89   93.06     3    0.6204        Yes          Moderately
  0.0029        88   91.71     4    0.5452        Yes          Highly
  0.0299        74   87.16     5    0.5131        Yes          Highly
  0.0402        70   83.42     6    0.4894        Yes          Highly
  0.0118        68   80.56     7    0.4692        Yes          Highly
  0.0004        60   76.80     8    0.4603        Yes          Highly
  0.0015        45   69.75     9    0.4779        No           Not applicable
  0.0012        30   58.11    10    0.5442        No           Not applicable
  0.0019        30   51.97    11    0.5801        No           Not applicable
  0.0088        30   48.12    12    0.6000        No           Not applicable
  0.0113        30   45.45    13    0.6103        No           Not applicable
  0.0640        30   43.48    14    0.6147        No           Not applicable
  0.0055        30   41.97    15    0.6152        No           Not applicable

D-10


-------
Appendix D

Emissions Factor Development and Data Quality Characterization Procedures

Figure D-4 shows a plot of the CTR and N values shown in Table D-4 and the boundaries
created by the line equations. In developing the emissions factor for the example data set, the
first 8 values in Table D-4 are included in the emissions factor calculation because the FQI
increases for the first time between the 8th and 9th pair. Using the first 8 values yields an
emissions factor of 0.0239 with a quality rating of "highly representative."

Figure D-4. Plot of Selected Data from Table D-4

(Figure: the number of tests is plotted against the Composite Test Rating, %.)

For test data submitted to WebFIRE using ERT, a numerical ITR value will be assigned to
the data by ERT prior to incorporation in WebFIRE. For data that were incorporated into
WebFIRE prior to the development of ERT (e.g., the underlying data used to develop AP-42
emissions factors), the current subjective, letter-grade quality ratings have been converted to
numerical values as follows:

  Test Data Letter Grade    Equivalent ITR Score
  A                         80
  B                         60
  C                         45
  D                         30

For example, a previous test rated as a "B" that is part of the candidate data set for
emissions factor development would have an ITR value of 60 for use in calculating the CTR. We
used this approach because it would be time intensive and prohibitively costly to reevaluate
every previous test report and assign it an ITR based on the rating system contained in the ERT.
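Where legacy letter-graded data are handled programmatically, the conversion is a simple
lookup; the fragment below is illustrative only and is not part of the ERT or WebFIRE.

    # Equivalent ITR scores for legacy letter-graded test data:
    LETTER_GRADE_TO_ITR = {"A": 80, "B": 60, "C": 45, "D": 30}
    assert LETTER_GRADE_TO_ITR["B"] == 60   # the "B"-rated test example above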

D-12


-------
Appendix E

Statistical Procedures for Determining Valid Data

Combinations


-------
Appendix E

Statistical Procedures for Determining Valid Data Combinations

1.0 Introduction

As new emissions data are incorporated into WebFIRE, we expect that, periodically, we
will want to determine whether a new data set should be combined with an existing data set
for a given source type or category. When determining whether data sets should be combined,
we will follow the procedures specified in this appendix.

We anticipate applying these procedures on a case-by-case basis, most likely on data
that are expected to be from the same type of emissions units, with similar types of emissions
controls and under the same type of operational process. For example, a statistical analysis
would be performed on source test data for the following processes at a Portland cement plant:
a dry-process kiln, a wet-process kiln, a preheater kiln and a preheater/precalciner kiln. Each of
the processes employs either an electrostatic precipitator (ESP) or a fabric filter. Emissions from
the processes and control type combinations (e.g., a dry-process kiln controlled by an ESP and a
wet-process kiln controlled by a fabric filter) would be compared to determine if the data sets
should be combined. These procedures would not be applied to source test data from
processes or controls that are clearly separate and distinct (e.g., coke oven emissions and
electric arc furnace emissions) nor would they be applied to source test data that are clearly
representative of the same source type, same fuel or same controls. In cases where it is
acceptable to combine the new and existing data, the BDL and outlier calculation procedures
found in Appendix B and Appendix C, respectively, are used in the emissions factor
development process.

Simple statistical characteristics such as the number of values, the mean and the
variance can be used to represent a data set for computational purposes. Comparison of similar
characteristics between data sets can determine whether the data sets are from the same
population of values. If the data sets are determined to be from the same population of values,
the data sets can be combined into a single, combined data set, often referred to as a pool.
Pooled values are preferred over individual values because pooled values provide the best
estimate of a population's variance.

2.0 Description of Procedures

The data combination assessment procedures that we would use to determine whether
a new data set should be combined with an existing data set are based upon use of the
Student's t-test. For this analysis, we use a two-tailed test rather than a one-tailed test. The EPA
uses the following steps to determine if it is appropriate to combine new data with existing
data:

1. Obtain all emissions test data (i.e., the number of values and the numerical values of
the data set) for the new data set and the data used to calculate the existing
emissions factor. Include those data values that were previously identified in the
emissions factor development for the source type or category as potential outliers.
The new and existing data should represent average emissions test values, not test
run values. Calculate the natural log of each value to create the data sets for use in
the Student's t-test.

2.	Prepare a null hypothesis that the data sets are from the same distribution (the
means of the two sets are equal) and an alternative hypothesis that the data sets
are not from the same distribution (the means of the two sets are unequal).

3.	Conduct a Student's t-test on the data sets assuming unequal variances. By assuming
unequal variances, the equivalency of the data set variances does not need to be
determined. Calculate the absolute value of the Student's t-test statistic.

4.	Find the tcritical value at the 0.05 significance level for the appropriate number of
degrees of freedom. If the absolute value of the Student's t-test statistic is greater
than the tcritical value, the means are assumed to be unequal (i.e., the data sets
should not be combined). If the absolute value of the Student's t-test statistic is less
than or equal to the tcritical value, the means are assumed to be equal (i.e.,
the data sets can be combined).
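A compact way to run this screening is with a standard statistics library. The sketch below
is our illustration of steps 1 through 4 using SciPy's Welch (unequal-variance) t-test; the
log-transform follows step 1 as written, and the function and variable names are ours.

    import numpy as np
    from scipy import stats

    def can_combine(new_data, existing_data, alpha=0.05):
        """Steps 1-4 of the data combination screening: a two-tailed
        Welch t-test (unequal variances) on the log-transformed test
        averages. Returns True when the null hypothesis of equal means
        is retained, i.e., the data sets may be pooled."""
        a = np.log(new_data)          # step 1: natural log of each value
        b = np.log(existing_data)
        t_stat, p_value = stats.ttest_ind(a, b, equal_var=False)
        # |t| <= t_critical at the 0.05 level is equivalent to p >= alpha.
        return bool(p_value >= alpha)

    # Example 2's data sets (Groups C and D) may be combined:
    print(can_combine([0.0005, 0.0015, 0.0025], [0.0029, 0.0029, 0.0029]))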

The two examples below illustrate the use of the data combination assessment
procedures based on the Student's t-test. In both examples, the determination as to whether to
combine the two data sets is based on an assessment of the difference in the means of the data
sets. Accepting the null hypothesis means that we are 95 percent confident that the differences
between the means of the two source test data sets are not statistically significant; therefore,
we can combine the data sets. If we accept the alternative hypothesis (reject the null
hypothesis), we are 95 percent confident that the differences between the means of the two
source test data sets are statistically significant, and we cannot combine them to derive an
emissions factor.

Example 1

Table E-1 presents two data sets: Group A, which is used to calculate the current
emissions factor of 0.0118 pounds of pollutant per ton of fuel combusted, and Group B, which
is from a similar source category with similar controls and operated under a similar process.

Table E-1. Emissions Factor Characteristics for Group A and B

  Group A             Group B
  Source Test Data    Source Test Data
  0.0015              0.0029
  0.0004              0.0611
  0.0055              0.0402
  0.0019              0.0299
  0.0012              0.0375
  0.064               0.0118
  0.0113              0.0072
  0.0088

Using an alpha of 0.05, these values yield a t-test statistic whose absolute value is 1.401 and a
tcritical value of 2.160. Since the absolute value of the t-test statistic is less than the tcritical value,
the analysis shows that the means of Group A and Group B are equal. Therefore, the null
hypothesis is accepted, meaning that the data sets are from the same distribution; thus, their
means are the same. Given that the means of Groups A and B are equal, the individual test data
sets can be combined and a revised emissions factor could be calculated using the procedures
specified in Appendices B through D. If the means had been unequal, the Group A and B
individual test data sets would not be combined.

Example 2

Table E-2 presents two data sets: Group C, which is used to calculate the current
emissions factor of 0.0015 pounds of pollutant per ton of fuel combusted, and Group D, which
is from a similar source category with similar controls and operated under a similar process.

Table E-2. Emissions Factor Characteristics for Group C and D

  Group C             Group D
  Source Test Data    Source Test Data
  0.0005              0.0029
  0.0015              0.0029
  0.0025              0.0029

Using an alpha of 0.05, these values yield a t-test statistic whose absolute value is 2.425 and a
tcritical value of 4.303. Since the absolute value of the t-test statistic is less than the tcritical value,
the analysis shows that the means of Group C and Group D are equal. Therefore, the null
hypothesis is accepted, meaning that the data sets are from the same distribution; thus, their
means are the same. Given that the means of Groups C and D are equal, the individual test data
sets can be combined and a revised emissions factor could be calculated using the procedures
specified in Appendices B through D. If the means had been unequal, the Group C and D
individual test data sets would not be combined.

E-3


-------
Appendix F

Source Classification Codes for Source Categories

Containing 15 or Fewer Sources

Appendix F is now located at: https://www.epa.gov/system/files/documents/2021-
11/appendix-f-scc-codes-for-less-than-15-sources.pdf

F-1


-------

-------
United States Environmental Protection Agency
Office of Air Quality Planning and Standards
Sector Policies and Programs Division
Research Triangle Park, NC

Publication No. EPA-453/B-24-001
August 2024


-------