SCC/WebFIRE Product Design Team Report
November 22, 2017

Table of Contents
1. Introduction
2. PDT Members and Activities
3. Survey Questions and Results
3.1 Respondents
4. Conclusions and Next Steps
4.1 Conclusions
4.2 Next Steps

1. Introduction

This document is the final report on work done by the Source Classification Code (SCC) and WebFIRE (WF) [1] Product Design Team (PDT) as part of the CAER project. It is one of five PDT projects [2]. The broader goal of these teams is to gather information that will help move us towards a common reporting framework, possibly through a common emissions form (CEF) approach. The goal of this team was to identify problems and solutions related to SCCs and WF that will meet requirements under the CAER project for state/local/tribal agencies (SLTs), the National Emissions Inventory (NEI), the National Air Toxics Assessment (NATA), and the Compliance and Emissions Data Reporting Interface (CEDRI)/Emissions Reporting Tool (ERT).

Each SCC represents a unique source category-specific process or function that emits air pollutants. SCCs are used as a primary identifying data element in EPA's WF, the NEI, and other EPA databases. WF is EPA's online emissions factor (EF) repository, retrieval, and development tool. It contains recommended EFs for criteria and toxic pollutants for industrial and non-industrial processes identified by SCC. For each recommended emissions factor and individual data value, WF contains descriptive information such as industry and source category type, control device information, the pollutants emitted, and supporting documentation. EFs are a key input to emissions estimates and accurate inventory reporting. Thus, a CEF would have to be able to retrieve EF data by SCC and present it to the user.

The project scope was to develop, implement, and evaluate a survey on the use of SCCs and WF. The focus of the survey questions was on the interaction between SCCs and EFs from WF. While any changes to either SCCs or WF that might assist the broader goal of CEF development [3] were noted, the team avoided focusing on revamping the SCC or WF systems themselves and focused instead on how the CEF would have to work with both.

Section 2 of this report describes the team members and activities. Section 3 describes the survey questions and results. Section 4 presents conclusions and next steps.

[1] The use of the acronym "WF" is solely for practical purposes and is not the typical usage to refer to "WebFIRE," which is itself an acronym for "Web Factor and Information Retrieval Database."
[2] See the CAER website for information on CAER PDT projects.
[3] At the time the survey results were being compiled, changes to SCCs reported by survey participants were already being addressed.

2. PDT Members and Activities

The team members were:

SLTs:
Dennis McGeen, Michigan (DEQ)
Tom Shanley, Michigan (DEQ)
Mark Wert, Massachusetts (MassDEP) - Team Co-lead
David McClard, South Carolina (DHEC)

EPA:
Mike Ciolek (WebFIRE)
Julia Gamas (Emissions Inventory and Analysis Group & SCCs) - Team Co-lead
Ketan Patel (CEDRI project lead and SCCs)

The steps the team followed, and the timeframe for those steps, were as follows:

1. Information gathering (February and March 2017):
• The team exchanged information about their use of SCCs and WF.
• The team learned about WF as it exists now and future plans for it, as well as certain time and resource constraints that exist in terms of its updates and maintenance.
2. Survey Development by State Team Members (March to July 2017):
The state members of the team developed a survey form containing questions to identify the issues and challenges that SLT, NEI, and CEDRI/ERT/WF programs are facing in the current SCC and WF systems. The state team members set out to collect the following types of information, based on their own experience and knowledge at the time:
• Issues in compiling emission inventories (including problems using SCC and WF factors in electronic data collection systems and maintaining those tables over time)
• Issues in reporting emissions and stack testing data (including significant gaps in SCCs and emission factors, and inconsistencies between the list of SCCs and the SCCs contained in WF)
• Issues in data analysis
• Methods and approaches that individual programs have used to address the problems
• Suggestions for improvements

3. Survey Deployment (July to September 2017):
The final survey was refined and then deployed by the state team members. The timing of the survey had to be staggered with the deployment of surveys from other PDT projects [4] so as not to overwhelm SLTs with too many surveys and thus maximize responses; the survey was deployed in July 2017. Final responses were received in September 2017.

4. Data Analysis and Report (September to October 2017):
Data were analyzed and summarized as presented in this report, which documents the issues and challenges in the current SCC system and WF as discovered from the survey.

5. Development of Conclusions and Next Steps (November 2017):
Potential follow-up questions and specific next steps for future PDT work (for example, to be addressed in a second project phase) evolved from this report. Results from the survey provided a "to do" list of next steps, which have been prioritized for further work. These results and conclusions are intended to:
• Identify issues on the critical path for CAER pilots.
• Guide additional phases for addressing the issues and challenges through future work under the PDT.

The initial brainstorming phase included both EPA staff and participating SLTs. The survey development and deployment phases included SLTs only. SLTs were assisted in survey deployment by Kelly Poole of the Environmental Council of States (ECOS).

[4] See footnote 2.

3. Survey Questions and Results

In what follows, we describe the survey respondents and then list the results, presenting each question followed by its responses. Questions were grouped by theme, as indicated below, covering how EF data are retrieved from WF, how the data are used, and what issues arise in using SCCs and WF.

3.1 Respondents

Respondents to the survey were SLT authorities; therefore, more than one response could have been received from a state. Table 1 shows the states that responded and how many responses were received from each. The survey was sent to all states and U.S. territories, with 38 states (76% of states) and one U.S. territory (Puerto Rico) sending at least one response.

Table 1. States that Responded to the Survey (State: Respondents)
TN: 4 | KS: 1 | AR: 1
MD: 1 | AZ: 2 | NJ: 1
NM: 1 | CT: 1 | CO:
WY: 1 | OK: 1 | PA: 1
HI: 1 | DE: 1 | MO: 1
RI: 1 | IA: 1 | NY: 1
SC: 1 | MI: 1 | NC:
NH: 1 | MN: 3 | KY: 1
ID: 1 | MT: 1 | ND: 1
LA: 1 | UT: 1 | NE:
GA: 1 | MS: 1 | NV: 1
PR*: 1 | IL: 1 | IN: 1
FL: 1 | OH: 1 | MA: 1
TOTAL: 48
Note: * Puerto Rico (PR) is a U.S. territory. For reporting purposes, it is considered a state.

Fifty-one responses were received; however, one set was deleted and another was consolidated.
Pima Association of Governments (AZ) submitted duplicate entries but stated that it does not collect point source emissions data, and it answered "No" to many of the questions or left them blank. Its responses were therefore removed from the results, as the intent was to reach those who do report point source data. Louisville, KY, submitted duplicate responses from the same individual, so these were consolidated into one.

For the purposes of this report, each respondent (regardless of how many respondents came from the same state) is counted as one. This is in contrast to reporting totals by state, because the number of respondents is a true representation of reporting burden. There was a total of 48 responses. To put this number in context, there are currently 89 distinct SLTs actively reporting criteria emissions towards the 2014 National Emissions Inventory. Of those, about 50% are represented in the answers to the survey [5].

[5] The list of actively reporting SLTs was matched to the list of survey respondents. All but three SLTs matched directly; the exceptions were two SLTs from TN (Nashville and Shelby) and one tribal authority from MN (Mille Lacs Band of Ojibwe). These were counted towards the state they belong to or are associated with.

Survey Questions and Answers

This section documents the responses to the survey questions. For each question, the list of answer options is shown, with the percentage of respondents who selected each option in parentheses next to it. Note that respondents were given the option of selecting more than one answer; therefore, the reader should not expect the percentages reported next to each answer to add up to 100%. Because most respondents selected more than one answer, additional tables are provided with a more detailed breakdown of the combinations of answers selected by respondents. The percentages in those tables do add up to 100%. For example, if the answers offered were 1, 2, and 3, respondents might have selected answers 1 and 2, answers 1 and 3, all three answers, answers 2 and 3, and so on. To get from the percentages in the detailed table to the percentages reported in parentheses in the list of answers, the reader must add up all combinations that include answer 1, all that include answer 2, and all that include answer 3.

In addition, for each question, respondents were given the option to add an answer or comments of their own. These are reported either in a table or in a bulleted list. Except for Table 2, figures and tables in this section are numbered according to the number of the question from the survey. The numbering sequence is therefore not continuous, as some questions with a "yes" or "no" answer did not require detailed breakdown tables.

Retrieval of WebFIRE Emission Factor Data (Questions 2-4)

Question 2. How does your SLT collect emissions data for point sources? (select all that apply)
Response options and the percent who chose each option were (see Figure Q2):
1. We have our own web-based electronic reporting system (58%)
2. We collect data electronically (e.g., Excel sheets) but must then take several steps to process the data to compile it (33%)
3. Our data collection is on paper or PDF files (56%)
4. Other (please specify) (38%)

Table Q2-a shows a breakdown of responses to Question 2. Given that the question allowed the respondent to select all responses that apply, these data can be broken down further.
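To make the relationship between the detailed combination percentages and the per-option percentages concrete, the following minimal sketch shows the roll-up arithmetic described above. The combination counts are made up for illustration; they are not the survey's actual data.

```python
# Hypothetical counts of respondents by the exact combination of options
# they selected. The combinations are mutually exclusive, so the detailed
# percentages sum to 100%.
combo_counts = {
    ("1",): 12,
    ("1", "2", "3"): 7,
    ("1", "3"): 9,
    ("2",): 5,
    ("2", "3"): 5,
    ("3",): 9,
    (): 1,  # selected none of the options
}
total = sum(combo_counts.values())

# Detailed-table percentages (one mutually exclusive row per combination).
detailed = {combo: count / total for combo, count in combo_counts.items()}

# Headline per-option percentages: every combination containing the option
# is added up. Because respondents may pick several options, these can sum
# to more than 100%.
headline = {
    opt: sum(n for combo, n in combo_counts.items() if opt in combo) / total
    for opt in ("1", "2", "3")
}

print({"+".join(c) or "none": round(p * 100) for c, p in detailed.items()})
print({opt: round(p * 100) for opt, p in headline.items()})
```

The same arithmetic underlies every detailed breakdown table that follows: each combination row is exclusive, and the headline percentage for an option is the sum of all rows containing it.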
The 58% of respondents who said they have a web-based electronic reporting system can be further broken down into: those who do web-based electronic reporting only (31%); those who, in addition to web-based reporting, accept electronic collection in other formats such as Excel sheets (15%); and those who, in addition to both web-based and other electronic means, also accept paper or PDF files (13%). Recall that those who responded that they accept electronic formats were also indicating that some post-processing of the data received is necessary. About 10% collect data only electronically, and others accept paper and PDF submissions in addition to electronic collection (8%). Finally, 21% answered that they do paper or PDF collection only.

Three respondents said they are moving to some form of electronic reporting system. Additional comments recorded under the "Other" category are shown in Table Q2-b. Responses to this question indicate that data are being collected in many ways, with no one way currently dominating: while many SLTs have electronic systems, many still allow spreadsheets and paper submissions, and many SLTs are collecting in a mix of ways rather than only one way or another.

Question 3. How do you retrieve WebFIRE data? (select all that apply)
Response options and the percent who chose each option were (see Figure Q3):
1. Our system downloads the data from an EPA website and pulls it into our own electronic system automatically (2%)
2. We download the data from an EPA website and directly pull it into our system as a batch (10%)
3. We download the data from an EPA website and type changes into our system individually (33%)
4. We download and process the data from an EPA website, then upload it into our system (27%)
5. We download data from EPA and distribute it to industry via a website or other means (13%)
6. We must email someone at EPA to get some of the information that we need, then pull it into our systems or store it on our computers (2%)
7. Other (please specify) (54%)

Table Q3-a shows the breakdown of responses to this question. About 10% of respondents download data from EPA in batch and pull it directly into their systems, with only 6% doing so exclusively. This means the rest (94%) must do some sort of post-processing of EPA data before being able to use it. Of particular interest are responses from the "Other" category, to which 44% responded. Many report letting industry choose their EF, 5 respondents (10%) reported not using WF, and the rest reported some combination of using EFs from WF as well as from other sources. Table Q3-b shows a summary of these comments.

Question 4. How frequently do you retrieve WebFIRE information? (select all that apply)
Response options and the percent who chose each option were:
1. As soon as WebFIRE is updated with a new emission factor (4%)
2. As soon as we are aware of WebFIRE updates (10%)
3. Several times per year periodically (8%)
4. Several times per year depending on resource availability (0%)
5. Once per year (please specify the time in the space provided below under other) (17%)
6. Less frequently than once per year (23%)
7. Other (please specify) (58%)

Many respondents (40%) do not check WF frequently (once per year or less). Another 40% responded under "Other" that they check it on an "as needed" basis. Twenty percent of respondents check WF EFs as soon as there is an update or several times per year. Only one respondent chose more than one option, selecting both as soon as WebFIRE is updated (response 1) and as soon as they are aware of an update (response 2). Figure Q4 and Tables Q4-a and Q4-b show the results for this question.
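Several of the retrieval patterns described in Questions 3 and 4 amount to periodically downloading a bulk WebFIRE export and refreshing a local emission factor table from it. The sketch below is a rough illustration of such a refresh step; the file name, column names, and SCC values are placeholders chosen for the example and do not describe the actual WebFIRE export layout.

```python
import csv

# Placeholder file and column names; a real WebFIRE bulk export would need
# to be mapped to whatever fields it actually contains.
EXPORT_FILE = "webfire_export.csv"
SCCS_OF_INTEREST = {"10200602", "30500606"}  # example SCC codes for illustration

def load_factors(path):
    """Read an emission factor export and index the rows by (SCC, pollutant)."""
    factors = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            factors.setdefault((row["SCC"], row["POLLUTANT"]), []).append(row)
    return factors

def refresh_local_table(local, downloaded, sccs=SCCS_OF_INTEREST):
    """Overwrite local entries with freshly downloaded ones for the SCCs we
    care about, returning the keys that changed so they can be reviewed."""
    changed = []
    for key, rows in downloaded.items():
        if key[0] in sccs and local.get(key) != rows:
            local[key] = rows
            changed.append(key)
    return changed

local_table = {}
updates = refresh_local_table(local_table, load_factors(EXPORT_FILE))
print(f"{len(updates)} SCC/pollutant combinations updated")
```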
Use of Emissions Factors from WebFIRE (Questions 5-15)

For this survey, we are assuming that you use WebFIRE to look up emission factors by SCC for emissions calculations and reporting.

Question 5. Do you use emissions factors for (select all that apply)?
Response options and the percent who chose each option were:
1. Criteria pollutants (81%)
2. HAPs (71%)
3. Other toxics beyond HAPs (23%)
4. Other (please specify) (27%)

Figure Q5 and Table Q5 show the results. A majority use them for more than one type of pollutant: 48% for both criteria pollutants and HAPs, and 23% for criteria pollutants, HAPs, and other toxics. It is interesting to note that 6 respondents (13%) consult WF for greenhouse gases, while also using WF for one or more of the other pollutant types, as reported under "Other".

Question 6. Which type/s of WebFIRE emission factors do you use in your system? (select all that apply)
Response options and the percent who chose each option were:
1. Uncontrolled emission factors (plus control efficiency if supplied by the filer) (75%)
2. Controlled emission factors (applied directly) (52%)
3. Uncontrolled emission factors derived from controlled emission factors by the state (25%)
4. Other (please specify) (21%)

Figure Q6 and Table Q6 summarize the responses to this question. 27% of respondents use uncontrolled emission factors only (response 1), 23% use both uncontrolled and controlled factors (responses 1 and 2), 25% use all three options, and only 4% use controlled factors only. Five respondents answered the "Other (please specify)" question. Four responses were repetitive, and their comments have been captured under the previous questions. One comment is important because it points to a challenge involved in the use of WF: "Only in recent years has our emission estimator been updated to attempt to take advantage of controlled emission factors or toxics. Relating EFs with the appropriate Control devices is challenging, especially since the WebFIRE control devices are inconsistent and can use obsolete device codes."

Answers for Questions 7 through 13 were either "yes" or "no", with the option to provide more explanation or detail. See Table 2 for a summary of responses to these questions.

Question 7. Do you use WebFIRE data to check emissions or emission factors reported by filers for reasonability (e.g., compare an emission factor resulting from reported emissions and activity data to an emissions factor in WebFIRE for a specific SCC)? If so, please describe your checking process. If you need additional space, please email one or all of the individuals named above.

58% of respondents answered "yes" to this question. Comments under the "other" response report the use of EFs to validate a submission. While not all respondents specified doing these checks manually, none reported having a QA system in place that does these checks automatically. One comment highlighted a difficulty in this process: the respondent has found the WF format difficult to use when pulling all the emission factors for a process, and finds that AP-42 tables are more user-friendly.
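For context, an automated version of the reasonability check described in Question 7 could be as simple as back-calculating an implied factor from reported emissions and activity and comparing it with the WebFIRE factor for the same SCC and pollutant. The sketch below is illustrative only; the tolerance and the example values are assumptions, not anything prescribed by the survey or by WebFIRE.

```python
def implied_factor(reported_emissions, activity):
    """Back-calculate an emission factor from reported emissions and activity."""
    if activity <= 0:
        raise ValueError("activity must be positive")
    return reported_emissions / activity

def needs_review(reported_emissions, activity, webfire_factor, tolerance=5.0):
    """Flag a submission when the implied factor differs from the WebFIRE
    factor for the same SCC/pollutant by more than a factor of `tolerance`
    in either direction. The 5x tolerance is an arbitrary example."""
    implied = implied_factor(reported_emissions, activity)
    if webfire_factor <= 0:
        return True  # cannot compare; send to manual review
    ratio = implied / webfire_factor
    return ratio > tolerance or ratio < 1 / tolerance

# Made-up numbers: 2.5 tons reported from 1,000 units of activity, against a
# WebFIRE factor of 0.004 tons per unit for the same SCC/pollutant.
print("flag" if needs_review(2.5, 1000, 0.004) else "within expected range")
```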
Question 8. Do you add state-specific emission factors to those in WebFIRE that are available for use by your filers for any facility? If yes, please describe what you add, where it comes from, and why you find it valuable.

27% of respondents answered "yes" to this question. Five respondents mentioned needing EFs when the ones required are not available in WF. One respondent said filers can supply any EF, but it must be supported by documentation if it has not already been reviewed through the permitting process, and that most EF verification happens during permitting. The emission factors used can come from stack tests or other sources the state has vetted.

Question 9. Do you fill in missing criteria pollutant emissions factors in WebFIRE with other WebFIRE factors from very similar SCCs?

44% of respondents said "yes" to this question. Three respondents mentioned that this happens specifically for fuel-burning SCCs.

Question 10. Do you make any other modifications or additions to WebFIRE information to make it useful in your system? If so, please describe.

25% of respondents answered "yes" to this question. One respondent said: "The WebFire data as downloaded has many inconsistencies and duplicate rows. We clean up some of the units for consistency, assume the most conservative for inequalities, assume the midpoint for ranges, and switch MMBtu factors to be throughput based formulas where (heat content per throughput) is a variable. Based on how our system is built, we also need to add rows with a null factor and null formula for SCCs which have no Fire factor." One respondent said they supplement HAPs EFs as well. Five respondents reported having to convert to standard units of measure.

Question 11. If you modify or add to WebFIRE information (e.g., state-specific emission factors or gap filling), would you be willing to share the table/s of SCCs and emissions factors used in your reporting system? If so, please email it to one of the team members listed at the top of the survey. Please zip the file if it is large.

46% of respondents said "yes". While this subject is taken up again in the conclusions, one idea that emerged is to compile a database with both WebFIRE EFs and other EFs from states that would be shared with everyone who might need it.

Question 12. Do you maintain and allow filers to use expired factors?

35% of respondents answered "yes" to this question. Two respondents reported that they do not have a specific verification process to check for expired factors. Most respondents said that they allow expired factors to be used if no other factor is available.

Question 13. When there are two factors for the same SCC/pollutant combination, do you select which one to allow, or do you allow filers to select for themselves?

65% of respondents said "yes". Sixteen respondents said they allow the facility to select the factor they think best. One respondent said their agency selects the correct one, and another said their agency selects the higher one. Three respondents mentioned using the EF from the permitting stage, and one respondent said they allow the facility to pick the uncontrolled or pre-NSPS factor.
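Questions 12 and 13 describe a selection problem: when several factors, some possibly expired, exist for one SCC/pollutant combination, either the agency or the filer has to pick one. The sketch below shows one possible selection rule; the preference order (unexpired first, then the highest, i.e., most conservative, value) is only an example drawn loosely from the policies respondents described, not a recommended or standard rule.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Factor:
    scc: str
    pollutant: str
    value: float          # emission factor value
    unit: str             # e.g., "lb/ton"; candidates assumed to share a unit
    expired: bool = False

def select_factor(candidates: List[Factor]) -> Optional[Factor]:
    """Pick one factor for an SCC/pollutant combination.

    Illustrative policy: prefer unexpired factors and, among those, take the
    highest (most conservative) value. Fall back to expired factors if
    nothing else is available, as several respondents said they allow."""
    if not candidates:
        return None
    active = [f for f in candidates if not f.expired]
    pool = active or candidates
    return max(pool, key=lambda f: f.value)

# Example with made-up factors for one SCC/pollutant combination.
choices = [
    Factor("30500606", "PM10", 0.8, "lb/ton", expired=True),
    Factor("30500606", "PM10", 0.5, "lb/ton"),
    Factor("30500606", "PM10", 0.6, "lb/ton"),
]
print(select_factor(choices))  # the unexpired 0.6 lb/ton factor
```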
Question 14. Do you face any of these problems in using the emission factor data in WebFIRE in your system? (select all that apply)
Response options and the percent who chose each option were (see Figure Q14):
1. Emission factors assigned to general SCCs that, when used by filers, result in emission estimates that are not representative of the actual process being reported on (25%)
2. Inconsistent emissions factors for nearly identical activities, resulting in users selecting the lowest but not necessarily most representative emission factor (factor "shopping") (25%)
3. Multiple factors assigned to the same SCC, requiring you to select the best one for use in your system (40%)
4. Controlled factors being higher than uncontrolled factors, resulting in you having to find alternative factors to provide consistency (6%)
5. Missing emission factors for some criteria pollutants associated with an SCC (54%)
6. WebFIRE emission factor table factors outdated as compared to AP-42 (17%)
7. WebFIRE emission factor table using outdated SCCs (23%)
8. Have you identified other problems? Please provide specific examples and explain how you fixed the problem (25%)

Table Q14 shows the breakdown of responses. Adding up the individual responses and combinations of responses, about 71% of respondents reported at least one of these issues, and most of those (52% of all respondents) indicated having more than one issue. Among the comments under response 8 were:
• One reported dissatisfaction with the SCC system, with SCCs being too general and not covering all the different types of processes or devices.
• One reported a lack of EFs for new SCCs, requiring emission factors to be borrowed from similar SCCs; this takes a lot of time and resources since it is mostly manual work.
• One reported having to drill down to the detail of why factors differ, which is time-consuming. There should be one uncontrolled factor per pollutant/SCC combination, as in AP-42.
• One reported that the format is difficult to use when pulling all pollutant EFs for an SCC.
• One reported inconsistent units of measure (e.g., MMBtu or tons or MMCF).

Question 15. Do you face any of these issues in using SCCs in your system? (select all that apply)
Response options and the percent who chose each option were (see Figure Q15):
1. SCCs have inconsistent descriptions for like level codes, especially for top levels (33%)
2. Missing "map to" value for a retired SCC (44%)
3. Multiple SCCs (with different pollutant lists and factors) have been retired by EPA and map to the same new SCC, with factors different from either SCC that was mapped from (21%)
4. An SCC with valid emission factors is retired by EPA, but the map-to SCC has no factors in WebFIRE (21%)
5. SCCs in WebFIRE do not match those in EIS (21%)
6. Missing "short name" for new SCCs (15%)
7. SCC descriptions too long (13%)
8. Timing of SCC changes and retirements causes errors when trying to flow data to EIS (e.g., changes in mid-year come after industry has submitted using old, now invalid SCCs) (33%)
9. Have you identified other problems? Please provide specific examples and explain how you fixed the problem (23%)

As with Question 14, individual respondents reported different combinations of problems; in total, 73% of respondents reported at least one issue, with 19% reporting one issue only and 54% reporting more than one. See Table Q15 for a breakdown of the responses. While some responses simply elaborated on the issue selected, a few added further information (even if sometimes not strictly referring to the original question but to EFs and SCCs in general):
• One responded that there is no notification process when an EF is added, updated, or deleted.
• One responded that sometimes a description changes in a way that should result in a change to the numerical code, but that change does not always happen.
• One responded: "The new process of changing SCCs in the EIS for existing processes was very different and poorly handled last year. And on top of that, there is not consistent communication when SCC changes are made, so a lot of times the first time we are aware of a change of a SCC is when we start submitting our data to the EIS."
• One responded that they have similar issues with control codes.
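Several of these issues (responses 2 through 4 above) come down to following a retired SCC to its replacement. The sketch below illustrates that lookup; the mapping data and field names are hypothetical, and, as the responses note, a "map to" value or a factor for the replacement SCC may simply not exist.

```python
# Hypothetical retirement map: retired SCC -> replacement SCC, or None when
# no "map to" value was provided (one of the problems respondents flagged).
SCC_MAP_TO = {
    "30500699": "30500616",   # illustrative codes only
    "10200799": None,
}

def resolve_scc(scc, map_to=SCC_MAP_TO, max_hops=5):
    """Follow "map to" links from a retired SCC to a current one.

    Returns (current_scc, hops). Raises when a retired SCC has no map-to
    value or when the chain is suspiciously long (a possible cycle)."""
    hops = 0
    while scc in map_to:
        replacement = map_to[scc]
        if replacement is None:
            raise LookupError(f'Retired SCC {scc} has no "map to" value')
        scc = replacement
        hops += 1
        if hops > max_hops:
            raise LookupError("Map-to chain too long; possible cycle")
    return scc, hops

print(resolve_scc("30500699"))   # ('30500616', 1)
```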
At this stage in the report, it is important to note that many of the SCC issues that surfaced in this survey had already surfaced in previous CAER work, specifically the SCC web search and web services "short-term win" project. Issues covered in responses 1, 2, and 8 are already being addressed; issues covered by responses 6 and 7 will be addressed in the future. Much of the confusion about SCCs has been addressed in the new "Introduction to SCCs" document, which can be downloaded from the EPA website.

WebFIRE Data in Calculations (Question 16)

Question 16. Have you encountered any of these problems in the process of using WebFIRE data for emissions estimate calculations? (select all that apply)
Response options and the percent who chose each option were (see Figure Q16):
1. Formulas that must be manually adjusted/parsed so your software can interpret them (13%)
2. There are multiple emission factors associated with a single SCC that have different throughput units of measure (requiring you to manually select between them to make the WebFIRE data usable in your system) (19%)
3. Lack of appropriate conversion factor data associated with a mass or volume emission factor in such a way that a system "knows" what to use for calculations (e.g., when your system needs to use different units of measure for throughput than WebFIRE is giving) (19%)
4. Inconsistencies between WebFIRE control devices and active EPA EIS control device codes (8%)
5. Please provide specific examples and explain how you fixed the problem. If your SLT system has utilities for conversions and handling formulas for calculations, please elaborate on how it accomplishes these tasks (23%)

Table Q16 shows the breakdown of answers to this question. Seventy-five percent of respondents did not indicate having any of these issues. The remaining 25% had a mix of issues, with 8% experiencing all of them. Comments included:
• "Our system is currently very limited when it comes to parsing equations or converting units, so these are things we mostly handle manually."
• Two alluded to the fact that industry is responsible for finding the correct EF, which means that any burden may be passed on to them.
• Three reported having to apply manual corrections in some cases.

Question 17. Additional comments? If you need additional space, please email Mark Wert (mark.wert@state.ma.us) or Tom Shanley (SHANLEYT@michigan.gov) or Dennis McGeen (MCGEENDl@michigan.gov).

Responses included:
• "We do a lot of manual work with WebFIRE (looking up emission factors) as we do not have a lot of different facilities in Knox County, TN."
• "In general, SLEIS itself is not very amenable to using WebFIRE emission factors. For instance, SLEIS will only auto-populate emission factors that were previously reported...it won't allow for uploading or using WebFIRE emission factors."
• "In most cases, we encourage filers to use CEM, stack tests, and other more accurate methods of emission factors."
• "Overall, I imagine most of what was questioned in this survey is more applicable to systems that have a more automated approach to their emission factors, and since we hand enter all emission factors and allow any emission factor to be entered, we have a lot more flexibility than most. The flip side is, we spend a lot more time on QA of emission factors than we would have to if we had a more automated and restrictive process. We are definitely interested in having our system communicate with SCC and WebFIRE systems, we just do not currently have the capabilities."
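A recurring theme in the Question 10 and Question 16 responses is converting factors into the units a system actually needs, for example re-expressing a heat-input-based factor (lb per MMBtu) on a fuel-throughput basis using the fuel's heat content. The sketch below shows that arithmetic; the factor, heat content, and throughput values are made up for illustration and are not survey or WebFIRE data.

```python
def heat_input_to_throughput(ef_lb_per_mmbtu, heat_content_mmbtu_per_unit):
    """Convert a factor expressed per MMBtu of heat input into a factor
    expressed per unit of fuel throughput (e.g., per ton of fuel), given
    the fuel's heat content in MMBtu per throughput unit."""
    return ef_lb_per_mmbtu * heat_content_mmbtu_per_unit

# Illustrative numbers only: a 0.6 lb/MMBtu factor and a fuel heat content
# of 26 MMBtu/ton give 15.6 lb per ton of fuel burned.
ef_per_ton = heat_input_to_throughput(0.6, 26.0)

# Period emissions then follow from reported throughput.
tons_of_fuel = 1_200
emissions_lb = ef_per_ton * tons_of_fuel
print(f"{ef_per_ton:.1f} lb/ton -> {emissions_lb:,.0f} lb for the period")
```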
4. Conclusions and Next Steps

4.1 Conclusions

From the survey results, the project team has compiled a set of follow-up questions that would have to be answered for the CEF to work effectively for SCC and WebFIRE data retrieval.

WebFIRE/emission factor data availability in the CEF: There is a mix in the way SLT systems collect emissions data. If the CEF is to work with these systems, it would have to be able to pull emission factors in and work with each SLT system. How could the CEF alleviate and substitute for SLTs without a web-based system, while working with those who do have one? There appears to be a substantial number of SLTs using paper and PDF reporting. How might a CEF ease them into a transition to electronic reporting in such a way that emission factor data are available for calculating the emissions to be reported? While the survey did not ask for specifics in the case of paper or PDF reporting, there appear to be many instances with specific exceptions or situations where this kind of reporting is necessary, for example in the provision of technical or supplementary documentation. If any of these supplemental data are related to emission factors, how could the common form or underlying database support the inclusion of supplemental data via electronic submission instead of paper and PDF? In other words, how might the CEF alleviate the need for paper and PDF as it relates to emission factors and emission factor documentation?

WebFIRE data retrieval: How might the common form provide the EFs to the reporter, either individually or in bulk, from WebFIRE? In terms of timing, how could this be done seamlessly so that any EF updates appear automatically and do not have to be searched for and retrieved? How might the CEF provide state-specific or even facility-specific EFs where applicable?

WebFIRE data consistency: What would be required to address the issues with WebFIRE that the survey raised: continuous WebFIRE data updates and maintenance, new data incorporated, EFs QA'd for inconsistencies and duplicates, more information provided when more than one EF exists for an SCC, and units-of-measure and formula issues resolved?

SCC-WF link: How would the SCCs and WF have to be linked so that whenever there is an update to an SCC, EFs are also generated and incorporated into WF without delay? In some cases, it is not possible to provide an EF for a new SCC instantaneously because the process has never been measured explicitly. Furthermore, WF's emission factor inclusion process is not immediate (a formal notice-and-comment process is required). Therefore, could EPA provide information about the progress and timeline for a pending new EF, and if so, how might it do that?

Much information was gathered from the survey responses, but some information is still missing: some respondents indicated not using WF and/or that the onus is on industry to find the correct EF. Thus, a next question is how far this survey does or does not also reflect the burden on industry.

A longer-term item to explore is the potential for recording emission factors generated during the permitting process itself. Could the CEF be set up to include the EF from the permitting process, when applicable, and hold that EF for the facility so it can continue to use it? How might those data be shared back to WF and to other reporters?
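One way to picture how a CEF might hold a permit-derived or state-specific factor alongside the WebFIRE default is a simple lookup hierarchy: a facility-specific factor first, then a state-specific factor, then WebFIRE. The sketch below only illustrates that ordering; the data structures, values, and the ordering itself are assumptions for discussion, not a design the team has adopted.

```python
# Hypothetical factor stores. Facility factors are keyed by facility ID plus
# (SCC, pollutant); state and WebFIRE factors by (SCC, pollutant) alone.
# Values are (factor, unit, source) tuples with made-up numbers.
facility_factors = {("F-001", "30500606", "PM10"): (0.45, "lb/ton", "permit stack test")}
state_factors = {("30500606", "PM10"): (0.55, "lb/ton", "state gap-fill")}
webfire_factors = {("30500606", "PM10"): (0.60, "lb/ton", "WebFIRE")}

def lookup_factor(facility_id, scc, pollutant):
    """Return the most specific factor available for this facility, SCC, and
    pollutant: facility-specific, then state-specific, then WebFIRE."""
    for store, key in (
        (facility_factors, (facility_id, scc, pollutant)),
        (state_factors, (scc, pollutant)),
        (webfire_factors, (scc, pollutant)),
    ):
        if key in store:
            return store[key]
    return None  # no factor available; candidate for gap-filling

print(lookup_factor("F-001", "30500606", "PM10"))  # permit-derived factor
print(lookup_factor("F-002", "30500606", "PM10"))  # falls back to the state factor
```

The ordering itself is one of the open questions above: whether a facility- or state-supplied factor should override the WebFIRE default, and how such overrides would be shared back, would need to be decided as part of CEF design.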
Many states have their own emission factors. This is a result either of gap-filling or of the fact that some emission factors (e.g., for oil and gas SCCs) are very specific to the location where the activity takes place. How might a common emissions form accommodate state-specific and/or facility-specific emission factors?

4.2 Next Steps

Next steps depend on time, funding, and staffing availability.

1. Address how to keep the WF database updated, including, for example, consistency checks, periodic review of current data with updates where needed, and resolution of inconsistencies and duplicates in emission factors, per the survey findings. Efforts in this area are under way, but funding and staffing constraints at EPA mean these issues may not be addressed quickly.

2. Address a way for WF and the SCCs to be updated together and linked so that changes are provided automatically in both. Currently, WF is not using the SCC web service. Furthermore, because of the lag between the time an SCC is created and the time an emission factor for that SCC can be generated, it may not be possible to have an EF instantaneously; perhaps adding some sort of message to that effect for new SCCs would help allay confusion. Any effort to allow WF to relate to the newly created SCC web services, even if only at the design level, would be helpful.

3. Create an EF database that includes state-specific EFs that a community of users could access via the CEF. Through the survey, some SLT EF data were collected and could provide a starting point for researching how to create a compendium of SLT emission factors and the business rules around its maintenance.

4. Develop a pilot for a specific state to be able to access EF data via the CEF. The ultimate goal for any of the three next steps listed above would be for the CEF to pull EF data from WebFIRE, the SLT compendium, and/or any other EF databases contained in SLT electronic reporting systems, if applicable.

Figure Q2. How does your SLT collect emissions data for point sources? [Bar chart of the Question 2 response options: own web-based electronic reporting system, 58%; electronic collection (e.g., Excel sheets) requiring further processing, 33%; paper or PDF collection, 56%.]

Table Q2-a. Breakdown of Responses: How does your SLT collect emissions data for point sources?
Q2 Replies | Reply | Num | Pct
We have our own web-based electronic reporting system (all combinations) | 1 | 28 | 58%
We have our own web-based electronic reporting system | 1 only | 12 | 25%
Web-based electronic system, other electronic collection (e.g., Excel), and paper or PDF | 1, 2, 3 | 7 | 15%
Web-based electronic system, and paper or PDF | 1, 3 | 9 | 19%
We collect data electronically (e.g., Excel sheets) but must then take several steps to process the data to compile it (all combinations) | 2 | 10 | 21%
We collect data electronically (e.g., Excel sheets) but must then take several steps to process the data to compile it | 2 only | 5 | 10%
Electronic collection, paper and PDF | 2, 3 | 5 | 10%
Our data collection is on paper or PDF files | 3 only | 10 | 21%
None | None | 1 | 2%
TOTAL | | 48 | 100%

Table Q2-b. Other (please specify): How does your SLT collect emissions data for point sources?
Q2 Replies | Reply | Comments under "Other"
We have our own web-based electronic reporting system | 1 only | 2 reported using SLEIS; 1 reported that the online tool is used to produce the report and the final product is paper, but they have all data electronically.
Web-based electronic system, other electronic collection (e.g., Excel), and paper or PDF | 1, 2, 3 | 1 reported their electronic system is not required; 1 reported that the majority of their reports arrive via electronic submission; 1 reported 50% electronic submissions and 50% PDFs; 1 reported they are moving to electronic reporting in the future.
Web-based electronic system, and paper or PDF | 1, 3 | 1 reported accepting paper copies but discouraging this for required reporters; their electronic reporting system generates PDF documents upon emission statement submittal. 1 reported using SLEIS and collecting non-AERR point source emissions on paper forms. 3 reported receiving most of their data electronically.
We collect data electronically (e.g., Excel sheets) but must then take several steps to process the data to compile it | 2 only | NJ reported having free software for reporting emissions and building permit applications called RADIUS, http://www.state.nj.us/dep/aqpp/radius.html. "The data is then submitted electronically to NJDEP via our portal, http://www.nj.gov/dep/online/."
Electronic collection, paper and PDF | 2, 3 | 1 reported they will be moving to SLEIS in the future.
Our data collection is on paper or PDF files | 3 only | 1 reported that they are currently moving to online reporting.
None | None | 1 reported that some submit on paper and some submit spreadsheets.

Figure Q3. How do you retrieve WebFIRE data? [Bar chart of the Question 3 response options: automatic download into SLT electronic system; batch download into system; download and type changes directly into SLT system; download and pre-process before upload into SLT system; download and distribute to industry via website or other means; email someone at EPA for information, then pull into SLT system.]

Table Q3-a. Breakdown of Responses: How do you retrieve WebFIRE data?
Q3 Replies | Reply | Num | Pct
All replies | 1, 2, 3, 4, 5 | 1 | 2%
We download the data from an EPA website and directly pull it into our system as a batch | 2 only | 3 | 6%
Batch download, and distribute to industry via a website or other means | 2, 5 | 1 | 2%
We download the data from an EPA website and type changes into our system individually | 3 only | 7 | 15%
Type changes individually; and download, process, and upload into our system | 3, 4 | 6 | 13%
Type changes individually; and download and distribute to industry | 3, 5 | 1 | 2%
Type changes individually; download, process, and upload; and download and distribute to industry | 3, 4, 5 | 1 | 2%
We download and process the data from an EPA website, then upload it into our system | 4 only | 5 | 10%
We download data from EPA and distribute to industry via a website or other means | 5 only | 2 | 4%
We must email someone at EPA to get some of the information that we need, then pull it into our systems or store it on our computers | 6 only | 1 | 2%
None | None | 21 | 44%
TOTAL | | 48 | 100%

Table Q3-b. Other (please specify): How do you retrieve WebFIRE data?
Q3 Replies | Reply | Comments under "Other"
We download the data from an EPA website and type changes into our system individually; and download, process, and upload into our system | 3, 4 | 1 reported having many options for updating the lookup table in their system but did not specify which options these are. 1 reported difficulty in doing bulk uploads from WF and difficulty in maintaining EFs without information about their origin. 1 reported not using WF data. 1 reported having an Access program that processes WF data for import into their electronic reporting system.
We download and process the data from an EPA website, then upload it into our system | 4 only | 1 reported that their web-based system has its own factor table that it uses for calculations (no direct link to WebFIRE); they maintain that factor table based largely on WebFIRE factors.
We must email someone at EPA to get some of the information that we need, then pull it into our systems or store it on our computers | 6 only | If information within WebFIRE is desired, they access the search function of the website and instruct industry to do so also. Emission factors must be manually entered, and there are no quality checks on the entered factor at the web-based electronic reporting tool level. After the reporting period for industry has passed, they review whether facilities are using reasonable/correct emission factors, no matter whether the factors are from WebFIRE or any other source.
None | None | 3 reported not using WF (although 2 reported consulting it in later questions), and 1 reported doing so only for a very specific EF (mostly using AP-42 or test data instead). 1 reported that SLEIS takes care of this for AER sources, and that any other need for WF requires manual download from EPA's website. 1 reported their system links to WF, AP-42, and other relevant data (SCCs, unit descriptions). 5 reported that industry is responsible for identifying the right EF. 3 reported not pulling WF data directly into their system. 5 reported their systems are not directly tied to WF and thus any uploads must be done manually.

Figure Q4. How frequently do you retrieve WebFIRE information? [Bar chart of the Question 4 response options, from "as soon as WebFIRE is updated with a new emission factor" through "Other (please specify)".]

Table Q4-a. Breakdown of Responses: How frequently do you retrieve WebFIRE information?
Q4 Replies | Reply | Num | Pct
As soon as WebFIRE is updated with a new emission factor | 1 only | 1 | 2%
As soon as we are aware of WebFIRE updates | 2 only | 4 | 8%
As soon as WebFIRE is updated with a new emission factor; and as soon as we are aware of WebFIRE updates | 1, 2 | 1 | 2%
Several times per year periodically | 3 only | 4 | 8%
Several times per year depending on resource availability | 4 only | 0 | 0%
Once per year (please specify the time in the space provided below under other) | 5 only | 8 | 17%
Less frequently than once per year | 6 only | 11 | 23%
None | None | 19 | 40%
TOTAL | | 48 | 100%

Table Q4-b. Other (please specify): How frequently do you retrieve WebFIRE information?
Q4 Replies | Reply | Other (please specify)
Less frequently than once per year | 6 only | 1 reported reviewing EFs every 3 years for the emissions reporting cycle. 1 reported reviewing certain EFs for QA purposes and then updating them.
Once per year (please specify the time in the space provided below under other) | 5 only | Responses ranged in timeframe: some check at the end or beginning of the year, others check in mid-to-late summer, or as needed.
None | None | Respondents reported reviewing them on an "as needed" basis.

Figure Q5. Do you use emissions factors for...? [Bar chart of the Question 5 response options.]

Table Q5. Breakdown of Responses: Do you use emissions factors for...?
Q5 Replies | Reply | Num | Pct | Other (please specify)
Criteria pollutants | 1 only | 5 | 10% | 1 reported also looking for GHGs
Criteria pollutants & HAPs | 1, 2 | 23 | 48% | 4 reported looking for GHGs; 1 reported having all WF EFs in their default lookup table
Criteria pollutants, HAPs, and other toxics beyond HAPs | 1, 2, 3 | 11 | 23% | 1 reported also looking for GHGs
None | None | 9 | 19% | 1 reported that not all SCCs are associated with an EF in WF
TOTAL | | 48 | 100% |

Figure Q6. Which types of WebFIRE emission factors do you use in your system? [Bar chart of the Question 6 response options: uncontrolled emission factors (plus control efficiency if supplied by the filer), 75%; controlled emission factors (applied directly), 52%; uncontrolled emission factors derived from controlled emission factors by state, 25%.]

Table Q6. Breakdown of Responses: Which type/s of WebFIRE emission factors do you use in your system?
Q6 Replies | Reply | Num | Pct | Other (please specify)
Uncontrolled emission factors (plus control efficiency if supplied by the filer) | 1 only | 13 | 27% |
Uncontrolled emission factors (plus control efficiency if supplied by the filer); and controlled emission factors (applied directly) | 1, 2 | 11 | 23% | 1 responded that SLEIS allows for the use of either controlled (applied directly) or uncontrolled emission factors; when using uncontrolled, SLEIS will automatically reduce emissions if a control device has been specified.
Uncontrolled emission factors (plus control efficiency if supplied by the filer); controlled emission factors (applied directly); and uncontrolled emission factors derived from controlled emission factors by state | 1, 2, 3 | 12 | 25% | 1 response was that "only in recent years has our emission estimator been updated to attempt to take advantage of controlled emission factors or toxics. Relating EFs with the appropriate Control devices is challenging, especially since the WebFIRE control devices are inconsistent and can use obsolete device codes."
Controlled emission factors (applied directly) | 2 only | 2 | 4% |
None | None | 10 | 21% | Users of our system select the emission factor which is appropriate for their source, whether uncontrolled or controlled.
TOTAL | | 48 | 100% |

Table 2. Responses to Questions 7 through 13
Question | % Responded "Yes"
Q7. Do you use WebFIRE data to check emissions or emission factors reported by filers for reasonability (e.g., compare an emission factor resulting from reported emissions and activity data to an emissions factor in WebFIRE for a specific SCC)? | 58%
Q8. Do you add state-specific emission factors to those in WebFIRE that are available for use by your filers for any facility? | 27%
Q9. Do you fill in missing criteria pollutant emissions factors in WebFIRE with other WebFIRE factors from very similar SCCs? | 44%
Q10. Do you make any other modifications or additions to WebFIRE information to make it useful in your system? | 25%
Q11. If you modify or add to WebFIRE information (e.g., state-specific emission factors or gap filling), would you be willing to share the table/s of SCCs and emissions factors used in your reporting system? | 46%
Q12. Do you maintain and allow filers to use expired factors? | 35%
Q13. When there are two factors for the same SCC/pollutant combination, do you select which one to allow, or allow filers to select for themselves? | 65%

Figure Q14. Do you face any of these problems in using the emission factor data in WebFIRE in your system? [Bar chart of the Question 14 response options: too general SCCs; inconsistent emissions factors; multiple factors for same SCC; controlled EF > uncontrolled EF; missing criteria EF for an SCC; outdated compared to AP-42; outdated SCCs in WF; Other (please specify).]

Table Q14. Breakdown of Responses: Do you face any of these problems in using the emission factor data in WebFIRE in your system?
Possible responses: 1. Too general SCCs; 2. Inconsistent emissions factors; 3. Multiple factors for same SCC; 4. Controlled EF > uncontrolled EF; 5. Missing criteria EF for an SCC; 6. Outdated compared to AP-42; 7. Outdated SCCs in WF
Response # | Num | Pct
All | 1 | 2%
1, 2, 3, 5, 6, 7 | 1 | 2%
1, 2, 3, 5, 7 | 1 | 2%
1, 2, 5 | 3 | 6%
1, 3, 4, 5 | 1 | 2%
1, 3, 5 | 2 | 4%
1, 3, 7 | 1 | 2%
1, 5 | 1 | 2%
1, 5, 7 | 1 | 2%
2, 3, 5 | 3 | 6%
2, 3, 6, 7 | 1 | 2%
2, 5 | 1 | 2%
2, 5, 6, 7 | 1 | 2%
3 only | 3 | 6%
3, 4, 6, 7 | 1 | 2%
3, 5 | 3 | 6%
3, 5, 7 | 1 | 2%
5 only | 4 | 8%
5, 6 | 1 | 2%
5, 6, 7 | 1 | 2%
6 only | 1 | 2%
7 only | 1 | 2%
None | 14 | 29%
TOTAL (at least one issue) | 34 | 71%

Figure Q15. Do you face any of these issues in using SCCs in your system? [Bar chart of the Question 15 response options: inconsistent like-level code descriptions; missing "map to" value for a retired SCC; new map-to SCC for multiple codes with very different factors; new map-to SCC without factors in WebFIRE; SCCs in WebFIRE do not match those in EIS; missing "short name" for new SCCs; SCC descriptions too long; timing of SCC changes and retirements causes errors; Other (please specify).]

Table Q15. Breakdown of Responses: Do you face any of these issues in using SCCs in your system?
Possible responses: 1. Inconsistent like-level code descriptions; 2. Missing "map to" value for a retired SCC; 3. New map-to SCC for multiple codes with very different factors; 4. New map-to SCC without factors in WebFIRE; 5. SCCs in WebFIRE do not match those in EIS; 6. Missing "short name" for new SCCs; 7. SCC descriptions too long; 8. Timing of SCC changes and retirements causes errors
Response # | Num | Pct
All | 3 | 6%
1 only | 2 | 4%
1, 5 | 1 | 2%
1, 2 | 2 | 4%
1, 2, 3, 4, 5 | 1 | 2%
1, 2, 3, 8 | 1 | 2%
1, 2, 5 | 1 | 2%
1, 3 | 1 | 2%
1, 3, 4, 8 | 1 | 2%
1, 4 | 1 | 2%
1, 6, 7 | 1 | 2%
2 only | 2 | 4%
2, 3 | 1 | 2%
2, 3, 4 | 1 | 2%
2, 4 | 1 | 2%
2, 6, 8 | 1 | 2%
2, 7 | 1 | 2%
2, 8 | 5 | 10%
4, 5, 8 | 1 | 2%
5 only | 1 | 2%
5, 6 | 1 | 2%
7 only | 1 | 2%
8 only | 3 | 6%
All except 7 | 1 | 2%
None | 13 | 27%
TOTAL | 48 | 100%

Figure Q16. Have you encountered any of these problems in the process of using WebFIRE data for emissions estimate calculations? [Bar chart of the Question 16 response options: formulas that must be manually adjusted/parsed so your software can interpret them; multiple emission factors associated with a single SCC that have different throughput units of measure; lack of appropriate conversion factor data associated with a mass or volume emission factor; inconsistencies between WebFIRE control devices and active EPA EIS control device codes.]

Table Q16. Breakdown of Responses: Have you encountered any of these problems in the process of using WebFIRE data for emissions estimate calculations?
Q16 Replies | Reply | Num | Pct
All except inconsistencies between WF control devices and active EPA control codes | 1, 2, 3 | 1 | 2%
Formulas that must be manually adjusted/parsed so your software can interpret them | 1 only | 1 | 2%
Multiple emission factors associated with a single SCC that have different throughput units of measure | 2 only | 2 | 4%
Lack of appropriate conversion factor data associated with a mass or volume emission factor | 3 only | 2 | 4%
Multiple EFs for a single SCC, and lack of appropriate conversion factors | 2, 3 | 2 | 4%
All of the problems listed | All | 4 | 8%
None | None | 36 | 75%
TOTAL | | 48 | 100%