Clean Air Act Metrics Plain Language Guide
State Review Framework - Round 4

This Plain Language Guide describes the elements and metrics EPA uses during a State Review Framework (SRF) review and provides instructions on how to use the metrics to make appropriate findings and recommendations. Reviewers should also refer to the CAA file review checklist and spreadsheet when conducting file reviews.

Data used in SRF reviews fall into three primary categories: data verification counts, data metrics, and file review metrics. These metrics provide an initial overview of agency performance.

1. Data Verification Counts are used to assure the completeness and accuracy of universes and activities essential to establishing values for other data metrics. The annual data verification process requires states and EPA regions to review facility and activity counts in order to create accurate and complete frozen data. EPA expects agencies to correct any inaccuracies identified during the data verification process in the ICIS-Air data system (Integrated Compliance Information System for Air). ICIS-Air data counts, once verified, are frozen and used for public access purposes as well as for developing data metrics for the SRF. These counts are not evaluated directly in the SRF process.

2. Data Metrics are metrics where counts are combined or compared in some way that is informative. EPA derives data metrics from frozen verified data in ICIS-Air. Reviewers download data metrics from Enforcement and Compliance History Online (ECHO.gov) to get an initial overview of a state or local agency's performance. All data metrics fall into one of the following subcategories:

Goal metrics provide a specific numeric goal and a national average, expressed as percentages. EPA evaluates agencies against goals, not averages. These metrics include averages only to provide a sense of where an agency falls relative to others.

Supporting Data Indicators, though not metrics themselves, may provide valuable information for both informing the selection of files for review and adding perspective to the file review findings. Each Supporting Data Indicator is paired with a file review metric. When using the Supporting Data Indicators to support file reviews, examine the state average versus the national average to identify when an agency appears to diverge from national norms. A deviation from a national norm or average does not mean that a performance issue exists, just that the issue should be explored further. For a significant deviation, EPA should ensure that it pulls a sufficient sample of files to evaluate the matter during the file review (see the File Selection Protocol for additional guidance). EPA and the state or local agency should discuss the matter to determine if a problem exists.

Compliance Monitoring Strategy (CMS) metrics relate to agency commitments in CMS plans and provide for SRF findings based on agency-specific commitments rather than national goals. If a state does not have a CMS plan, it is expected to meet national goals.

3. File review metrics are employed during review of facility files (including information such as compliance monitoring reports (CMRs), evaluations, enforcement responses and actions, and penalty documentation). The results of file reviews, in combination with data metric results, provide a greater understanding of an agency's performance than data metric results alone.
All file review metrics have associated national goals; however, unlike data metrics with goals, file review metrics do not have national averages.

Guidance References and Acronyms

The SRF Documentation Page on ECHO.gov provides a full list of links to SRF guidance and policies.

Year reviewed, review year, and review fiscal year refer to the federal fiscal year being reviewed, not the year in which the review is conducted. Ideally, the year reviewed is the latest frozen dataset available in ECHO when the review is initiated.

Agency refers to the state, local, or federal agency that has the lead for compliance monitoring and enforcement within the state or other jurisdiction undergoing the SRF review.

A list of acronyms is provided as an attachment at the end of this Plain Language Guide.

CAA SRF Review Process

1. Annual data verification
2. Annual data metric analysis
3. File selection
4. Local agency or regional office inclusion (if applicable)
5. Discussion with HQ on review process (or discussion on a step-by-step basis, as chosen by the Region)
6. Entrance conference
7. File review
8. Exit conference
9. Draft report submitted for internal agency review
10. State comment period
11. Revised report sent to agency for review and internet posting
12. Final report and recommendations published on the SRF website
13. Track implementation status of Area for Improvement recommendations in the SRF Manager database on a periodic basis

Using Metrics to Determine Findings

Goal metrics always have numeric goals and stand alone as a sufficient basis for a finding. For example, the goal for CAA metric 3a2 is for 100 percent of HPVs to be reported to ICIS-Air in a timely manner. To analyze performance under this metric, reviewers compare the percentage of HPVs reported timely with the 100 percent goal. Based on this analysis, the reviewer would make a finding. All findings will fall under one of these categories:

Meets or Exceeds Expectations: The SRF was established to define and assess the base level, or floor, of enforcement program performance. This rating describes a situation where the base level is met and no performance deficiency is identified, or a state performs above base program expectations.

Area for State Attention: An activity, process, or policy that one or more SRF metrics show as a minor problem. Where appropriate, the state should correct the issue without additional EPA oversight. EPA may make recommendations to improve performance, but it will not monitor these recommendations for completion between SRF reviews. These areas are not highlighted as significant in an executive summary.

Area for State Improvement: An activity, process, or policy that one or more SRF metrics show as a significant problem that the agency is required to address. Areas for improvement will be highlighted in the Executive Summary as significant issues. Recommendations should address root causes, and the status of recommendations is publicly available. Recommended activities to correct the issue(s) are identified. These recommendations must have well-defined timelines and milestones for completion, and EPA will monitor them for completion between SRF reviews in the SRF Tracker.

Whenever a metric indicates a significant performance issue, EPA will write up a finding of Area for State Improvement, regardless of other metric values pertaining to a particular element.
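To illustrate the arithmetic behind a goal-metric finding, the short Python sketch below works through the metric 3a2 example from this section: compute the metric value as a percentage and compare it with the 100 percent goal rather than the national average. The function names and example counts are illustrative assumptions, not part of any EPA tool.

```python
# Minimal sketch of how a goal metric is tabulated and compared with its goal
# (here, CAA metric 3a2). Example values are hypothetical.

def goal_metric_percentage(numerator: int, denominator: int) -> float:
    """Return the metric value as a percentage, guarding against an empty universe."""
    if denominator == 0:
        return 0.0
    return 100.0 * numerator / denominator

def compare_to_goal(value: float, goal: float = 100.0) -> str:
    """A goal metric is evaluated against its goal, not against the national average."""
    return "meets goal" if value >= goal else "falls short of goal"

# Example: 18 of 20 HPV determinations were reported to ICIS-Air within 60 days.
value = goal_metric_percentage(numerator=18, denominator=20)
print(f"Metric 3a2: {value:.1f}% ({compare_to_goal(value)})")
```

Whether a shortfall becomes an Area for State Attention or an Area for State Improvement remains a reviewer judgment based on the significance of the issue, as described above.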
The National Strategy for Improving Oversight of State Enforcement Performance is a key reference in identifying recommendations for Areas for Improvement. Where a performance problem cannot be readily addressed, or where there are significant or recurring performance issues, there are steps EPA can and should take to actively promote improved state performance. For additional information: https://www.epa.gov/sites/production/files/2014-06/documents/state-oversight-strategy.pdf

Using Other Metrics

When metrics other than Goal metrics indicate problems, EPA should conduct additional research to determine whether there is truly a problem. These metrics provide additional information that is useful during file selection and for gauging program health when used with other metrics. For example, CAA metric 8a is a Supporting Data Indicator for file review metric 8c. Indicator 8a provides the state's HPV discovery rate at active major sources, which is related to metric 8c, the percentage of FRVs identified by the state for which an accurate HPV determination was made. If Indicator 8a deviates significantly from the national average, it is only with further review of stack test results, the accuracy of HPV determinations examined during the file review, and other contextual information that a reviewer is able to analyze whether the HPV discovery rate presents a performance issue.

Use of State Guidance and Regional-State Agreements as Basis for Findings in SRF Reviews

The State Review Framework evaluates enforcement program performance against established OECA national program guidance. State program guidance or regional-state agreements are applicable to the SRF review process under the following circumstances.

1. It is acceptable to use the state's own guidance to evaluate state program performance if: 1) the region can demonstrate that the state's standard(s) is (are) equivalent to or more stringent than OECA guidance; and 2) the state agrees to being evaluated against that standard(s). In these cases, regions should inform OECA/OC in advance of the review that they intend to use state guidance, and should include a statement in the SRF report indicating that the state guidance was determined to be equivalent to or more stringent than the applicable OECA policy and was used as the basis for the review.

2. For certain metrics, clearly specified in this Plain Language Guide, it will be necessary to refer to state policies or guidance, or to EPA-state agreements. For example:

a. If the state has an Alternative CMS, EPA will use these state-specific commitments as the basis to evaluate compliance monitoring coverage.

b. The national guidance may require only that a state establish a standard but not actually provide the standard. In such cases, the reviewer will need to ensure that the state has developed the required standard and, once it has been reviewed and approved by the region, use that standard to evaluate state performance.

3. Where national guidance has been modified or updated, it is important to review the corresponding state program implementation guidance to assess whether it has become out of date or inaccurate. In such cases, the reviewer should make appropriate recommendations for revision of the state guidance, review the revised version, and approve it, if appropriate.
Where state program guidance or regional-state agreements establish practices or standards that are not consistent with, or at least equivalent to, national program guidance, this may be an allowable flexibility under section A4 of the Revised Policy Framework for State/EPA Enforcement Agreements (Barnes, August 1986, as revised). If so, the region should inform OECA/OC prior to the review and note this flexibility in the explanation of the SRF report. If the differences between the state guidance or regional-state agreements and the national guidance are significant, or if it is unclear whether flexibility from OECA policy is appropriate, the region should elevate the issue to OECA for resolution (per Interim Guidance on Enhancing Regional-State Planning and Communication on Compliance Assurance Work in Authorized States (Bodine, 2018)) prior to developing findings or a draft report.

Element and Metric Definitions

Element 1 - Data

EPA uses Element 1 to evaluate data accuracy and completeness. At the beginning of the review, the presumption is that the frozen dataset has been verified by the state and EPA region and is accurate, or that caveats were provided that indicate where the state was unable to make changes in the national data system. EPA will evaluate data accuracy and completeness through metric 2b, which is a file metric that compares data in the ECHO.gov Detailed Facility Report or ICIS-Air to information in facility files. EPA will also use data metrics 3a2, 3b1, 3b2, and 3b3 to evaluate timely reporting of monitoring minimum data requirements (MDRs) and noncompliance to ICIS-Air.

A reviewer may determine that the value for a data metric is inaccurate to a significant degree when he or she conducts the entrance conference or the file review. Discrepancies in data counts should be noted under Element 1. EPA regions may also note under Element 1 any significant discrepancy in universe data found on the ECHO.gov Air Activity and Performance Dashboards. If the cause of the inaccurate data is a data quality issue or discrepancy, the reviewer should include this as an Area for State Attention or Area for State Improvement, depending on the magnitude of the discrepancy. The finding would cite the data inaccuracies and provide both the reported and actual values. Refer to ECHO Data Entry Requirements for CAA minimum data requirements.

Key metrics: 2b, 3a2, 3b1, 3b2, and 3b3

Metric 2b Accurate MDR data in ICIS-Air
Metric type: File, Goal
Goal: 100% of data are complete and accurate
What it measures: Percentage of files reviewed where substantive MDR data are accurately reflected in ICIS-Air.
Numerator: the number of files reviewed where file data and ICIS-Air data are the same for substantive MDRs;
Denominator: the number of files reviewed.
Guidance: Compare the information in the files for the year reviewed with data from the ECHO.gov Detailed Facility Report (DFR), ICIS-Air Compliance Source Data Report, listing of regulatory subparts, and HPV pathway reports. Review the MDRs listed on the File Review Checklist to confirm whether information is consistently reported to ICIS-Air in an accurate manner. The following MDRs are considered "substantive," and the file information should be consistent with the data reported to ICIS-Air and captured in the reports listed above:

1. Full compliance evaluation (FCE): Compare the FCE date in the file with information in the DFR under "Compliance Monitoring History."
2. Title V annual compliance certification: Compare the Title V certification received date in the file with the information in the DFR under "Compliance Monitoring History." Each Title V certification received is identified under Inspection Type, and whether or not the facility reported deviations is provided under Finding.

3. Stack Test: Compare the results of stack tests in the file with information in the DFR under "Compliance Monitoring History." Each stack test is identified under Inspection Type; the date would be the date the stack test was conducted, and the results are provided under Finding. Please note, a pollutant is not required to be reported for a stack test, but some agencies choose to optionally report the pollutant tested. Reviewers should verify (in the DFR or ICIS-Air) that the stack test result has been entered and that any pending results have been changed to Pass or Fail within 120 days.

4. Compliance Status: Check to ensure that any necessary violation determinations were accurately recorded in ICIS-Air. Under the Three Year Compliance Status by Quarter, each row identified by a Violation Type, Programs, and Pollutants represents a Case File. The violation type corresponds to the Enforcement Response Policy identified on the Case File. The Air Programs and Pollutants indicate the values reported on the Case File. The corresponding date in the row will be either the Earliest HPV Day Zero Date or the Earliest FRV Determination Date, depending on the Violation Type identified for the row. If an HPV is identified as the Enforcement Response Policy on the Case File, the date corresponds to the Earliest HPV Day Zero Date.

5. Formal Enforcement Action and Final Order: Check to ensure that all formal enforcement actions found in the file for the review year are in the DFR, and compare date(s) in the file with information in the DFR under "Formal Enforcement Actions (5-year history)." The final order is the vehicle that captures a settlement agreement, compliance schedule, penalty assessment, or conditions to return to compliance, which may include injunctive relief. Final order details can be found in ICIS-Air.

6. Notices of Violation (NOV): Check to ensure all NOVs or Warning Letters found in the file for the review year are in the DFR. Compare date(s) in the file with information in the DFR under "Informal Enforcement Actions (5-year history)." Note that formal notice may be provided via a variety of mechanisms, for example: Notice of Violation, Warning Letter, Notice to Correct, Notice of Opportunity to Correct, Notice to Comply, or Notice of Noncompliance. If the purpose is to formally notify a source of an FRV, it is to be reported to ICIS-Air as either an NOV or a Warning Letter. Methods of Advisement are reported on the Air Violations screen in ICIS.

7. Penalties: Compare any penalty amounts in the file with information in the DFR under "Formal Enforcement Actions." Penalties should be entered in the "Penalty Assessed to be Paid" portion of the Penalty screen in the Final Order module in ICIS-Air.

8. Federally reportable violations (FRVs) and high priority violations (HPVs): Compare the file to information in the DFR under "Compliance Summary Data." Check that all federally reportable violations that meet one of the FRV or HPV criteria are in the DFR.
Reviewers must consult the Case File Module in ICIS-Air to verify the following MDR information related to each violation (both FRVs and HPVs): violation type, air program, pollutant, and method and date of advisement. Additional MDRs for HPVs include the following: HPV Day Zero Date, discovery action and date, addressing action and date, and resolving action and date.

9. Air Program and Subparts: Compare the Air Programs and operating status in the DFR with the applicable programs reflected in the file. A subpart is required for NESHAP Part 63 and NSPS if the facility is a Title V major. Part 63 and NSPS subparts are optional but encouraged for any non-major facility. A subpart is also required for NESHAP Part 61 regardless of facility classification. Subpart information can only be found in ICIS-Air. The applicable pollutants and pollutant classification for each air program should also be verified in ICIS-Air.

10. CMS: The CMS Source Category and Frequency cannot be found on the DFR, so they must be verified in ICIS-Air.

Other MDRs are considered "administrative" and should be evaluated for accuracy. However, problems with these MDRs would only warrant a recommendation if a significant number of errors exist or a pattern of data entry problems is evident. Administrative MDRs include the following: facility ID, name, street, city, state, county, zip, NAICS code, government ownership, and activity identifiers.

Applicable EPA policy/guidance: Air Stationary Source Compliance and Enforcement Information Reporting (ICR) Supporting Statement (EPA-HQ-OECA-2014-0523); CAA CMS; Guidance on Federally-Reportable Violations for Clean Air Act Stationary Sources (2014); Revision of U.S. Environmental Protection Agency's Enforcement Response Policy for High Priority Violations of the Clean Air Act: Timely and Appropriate Enforcement Response to High Priority Violations (2014).

Metric 3a2 Timely reporting of HPV determinations into ICIS-Air
Metric type: Data, Goal
Goal: 100% of HPV determinations reported to ICIS-Air within 60 days
What it measures: Percentage of HPV determinations entered within 60 days, based on the Case File "Date Created" in ICIS-Air.
Numerator: number of HPVs reported within 60 days of the HPV determination within the review year;
Denominator: number of Case Files with HPVs that were reported during the review year.
Guidance: The metric examines the percentage of Case File records with an HPV with a Day Zero determination made during the review year by the state, local agency, or EPA that were reported to ICIS-Air within the required 60-day timeframe. To measure the number of days used to report the HPV Day Zero, the metric compares the Earliest HPV Day Zero date to the earliest HPV created date, which is the day the Case File is entered in ICIS-Air. The reported date is the earliest created date of a violation that has a Day Zero reported. The source universe is limited to the federally-reportable universe. There might be instances where the Case File Date Created is before the Earliest HPV Day Zero date. This is acceptable because a Case File may be created prior to the agency determining that a violation is an HPV. This metric is based on Case Files containing HPVs; some Case Files may have more than one HPV but would be counted as a single HPV in this metric.
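The following Python sketch illustrates one way the metric 3a2 calculation described above could be expressed: for each Case File with an Earliest HPV Day Zero date in the review fiscal year, compare the Case File's Date Created with Day Zero and count it as timely when the difference is 60 days or less. The record fields and example dates are assumptions for illustration; the authoritative logic is the ECHO/ICIS-Air metric itself.

```python
# Illustrative sketch of the metric 3a2 calculation (hypothetical field names).
from datetime import date

def metric_3a2(case_files: list[dict], fy_start: date, fy_end: date) -> float:
    """Percentage of Case Files with HPVs reported within 60 days of the HPV determination."""
    universe = [cf for cf in case_files
                if cf.get("earliest_hpv_day_zero") is not None
                and fy_start <= cf["earliest_hpv_day_zero"] <= fy_end]
    if not universe:
        return 0.0
    # A Case File created before Day Zero yields a negative day count, which still counts
    # as timely -- consistent with the guidance that early creation is acceptable.
    timely = [cf for cf in universe
              if (cf["date_created"] - cf["earliest_hpv_day_zero"]).days <= 60]
    return 100.0 * len(timely) / len(universe)

# Example: review fiscal year 2023 (October 1, 2022 through September 30, 2023).
example = [
    {"earliest_hpv_day_zero": date(2023, 1, 10), "date_created": date(2023, 2, 1)},
    {"earliest_hpv_day_zero": date(2023, 3, 1), "date_created": date(2023, 6, 15)},
]
print(f"Metric 3a2: {metric_3a2(example, date(2022, 10, 1), date(2023, 9, 30)):.1f}%")
```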
Metric 3b1 Timely reporting of compliance monitoring MDRs
Metric type: Data, Goal
Goal: 100% of actions reported within specified timeframes
What it measures: Percentage of compliance monitoring-related MDR actions achieved during the review year that were reported within 60 days of the date achieved. Because stack test results can be reported within 120 days, stack tests are not included in this metric.
Numerator: number of compliance monitoring-related MDR actions achieved during the review year and reported within 60 days of the date achieved;
Denominator: number of compliance monitoring-related MDR actions achieved during the review year at federally reportable facilities.
Guidance: Compliance monitoring actions include full compliance evaluations and receipt of Title V annual compliance certifications. The source universe is limited to the federally-reportable universe. The metric compares the number of compliance monitoring activities (FCEs and reviews of Title V annual compliance certifications) completed by the agency during the review year and reported to ICIS-Air within 60 days of the date they were completed to the total number of compliance monitoring activities completed by the agency during the review year. To measure the number of days between the date completed and the date the activity was reported to ICIS-Air, the metric counts the number of days between the activity's Actual End Date and the activity's Date Created, which is the date automatically recorded on the action record by ICIS-Air when the activity is reported. The metric excludes activities where the only air program(s) reported on the compliance monitoring activity action record is "Not Defined as Federally-Reportable" or "State or Local rule or regulation that is not federally-enforceable."

Specifically, the numerator is the number of FCEs and TVACC reviews that occurred during the review fiscal year and were reported to ICIS-Air within 60 days. The reported date is the day the compliance monitoring event was created. The occurrence date for FCEs is the Actual End Date, and the occurrence date for TVACCs is the earliest Reviewed Date for the TVACC. The denominator is the number of FCEs and TVACC reviews that were reported during the review fiscal year.
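As a rough illustration of the metric 3b1 logic just described, the sketch below counts FCEs and TVACC reviews that occurred in the review fiscal year and were entered into ICIS-Air within 60 days, using the occurrence-date rules noted above (Actual End Date for FCEs, earliest Reviewed Date for TVACC reviews) and leaving stack tests to metric 3b2. Field names are hypothetical assumptions, and the exclusion of non-federally-reportable air programs is omitted for brevity.

```python
# Rough sketch of the metric 3b1 calculation (illustrative field names, not an ICIS-Air schema).
from datetime import date

def occurrence_date(activity: dict) -> date:
    # Per the guidance above: FCEs use the Actual End Date; TVACC reviews use the
    # earliest Reviewed Date.
    if activity["type"] == "FCE":
        return activity["actual_end_date"]
    return activity["earliest_reviewed_date"]

def metric_3b1(activities: list[dict], fy_start: date, fy_end: date) -> float:
    """Percentage of FCEs and TVACC reviews reported to ICIS-Air within 60 days."""
    # Stack tests are excluded; they have a 120-day window and fall under metric 3b2.
    universe = [a for a in activities
                if a["type"] in ("FCE", "TVACC_REVIEW")
                and fy_start <= occurrence_date(a) <= fy_end]
    if not universe:
        return 0.0
    timely = [a for a in universe
              if (a["date_created"] - occurrence_date(a)).days <= 60]
    return 100.0 * len(timely) / len(universe)

example = [
    {"type": "FCE", "actual_end_date": date(2023, 2, 10), "date_created": date(2023, 3, 1)},
    {"type": "TVACC_REVIEW", "earliest_reviewed_date": date(2023, 4, 5), "date_created": date(2023, 8, 1)},
]
print(f"Metric 3b1: {metric_3b1(example, date(2022, 10, 1), date(2023, 9, 30)):.1f}%")
```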
Metric 3b2 Timely reporting of stack tests and results
Metric type: Data, Goal
Goal: 100% of actions reported within specified timeframes
What it measures: Percentage of stack tests achieved and results reported during the review year that were reported to ICIS-Air within 120 days of the stack test.
Numerator: number of stack tests that occurred at CAA majors, synthetic minors, and Part 61 NESHAP minors during the review year and were reported and reviewed within 120 days;
Denominator: majors, synthetic minors, and Part 61 NESHAP minors with a stack test achieved during the review year.
Guidance: The source universe is limited to majors, synthetic minors, and minor sources subject to NESHAP Part 61, which are a subset of the federally-reportable universe. Therefore, this metric may not account for all stack tests completed or reviewed in the review year. The metric compares the number of stack tests reported by the state or local agency that were conducted during the review fiscal year and reported to ICIS-Air within 120 days of the date they were conducted to the total number of stack tests conducted and reported by the state or local agency during the review fiscal year. The reported date is the Date Created associated with the stack test compliance monitoring record in ICIS-Air, and the occurrence date is the Actual End Date. To measure the number of days between the date completed and the date the activity was reported to ICIS-Air, the metric counts the number of days between the activity's Actual End Date and the activity's Date Created, which is the date automatically recorded on the action record by ICIS-Air when the activity is entered into the system.
Applicable EPA policy/guidance: Air Stationary Source Compliance and Enforcement Information Reporting (ICR) Supporting Statement (EPA-HQ-OECA-2014-0523); CAA CMS; Clean Air Act National Stack Testing Guidance.

Metric 3b3 Timely reporting of enforcement MDRs
Metric type: Data, Goal
Goal: 100% of actions reported within specified timeframes
What it measures: Percentage of enforcement actions achieved during the review year that were reported to ICIS-Air within 60 days.
Numerator: number of enforcement actions achieved during the review year that were reported within 60 days;
Denominator: number of enforcement actions achieved during the review year.
Guidance: The source universe is limited to the federally-reportable universe. The metric compares the number of informal and formal enforcement activities (Notices of Violation, Administrative Orders, and Consent Decrees) completed by the state or local agency during the review fiscal year and reported to ICIS-Air within 60 days of the date they were completed to the total number of enforcement-related activities completed by the state or local agency during the review fiscal year. To measure the number of days between the date completed and the date the activity was reported to ICIS-Air, the metric counts the number of days between the activity's Achieved Date for informal enforcement actions, or Final Order Issued/Entered date for formal enforcement actions, and the activity's Date Created, which is the date automatically recorded on the action record by ICIS-Air when the activity is entered into the system. The metric excludes all enforcement-sensitive activities and activities where the only air program(s) reported on the enforcement activity action record is "Not Defined as Federally-Reportable" or "State or Local rule or regulation that is not federally-enforceable."

Element 2 - Inspections

Element 2 evaluates the following:
1. Inspection coverage rates compared to CMS commitments.
2. Title V annual compliance certification review rate.
3. Documentation of FCE elements to assure a complete evaluation occurred.
4. Compliance monitoring report completeness and sufficiency to determine compliance.

Key metrics: 5a, 5b, 5c, 5e, 6a, and 6b

Metric 5a FCE coverage: majors and mega-sites
Metric type: Data, Goal
Goal: 100% of commitment
What it measures: Percentage of CMS majors and mega-sites that received an FCE within a negotiated frequency or recommended minimum frequency.
Numerator: the number of CMS major sources and mega-sites where an FCE was completed by the end of the review year;
Denominator: the number of CMS major sources and mega-sites where an FCE was completed, plus those planned but not completed by the end of the review year.
Guidance: For this metric, the universe of sources is based solely on the CMS Source Category. Neither the source classification nor the operating status is considered.
This metric is based on source-specific historic CMS data (CMS Source Category Indicator, CMS Minimum Frequency Indicator, and FCE). This historic data is captured by ICIS-Air on December 1 each year for the previous fiscal year. This metric captures alternative evaluation frequencies. It does not reflect those instances where a PCE has been negotiated in lieu of an FCE; this should be confirmed in ICIS-Air and the percentage of coverage adjusted accordingly in the SRF report.

Specifically, the numerator is the number of ICIS-Air facilities that had an FCE during the review fiscal year and a CMS Source Category of Title V Major or Mega-site when the historic CMS data was captured on December 1. The FCE occurrence date is the evaluation Actual End Date. If the facility was removed from a CMS plan during the review fiscal year, the facility is included in the numerator if the FCE occurred prior to the CMS Plan Removal Date, but not if the FCE occurred after the CMS Plan Removal Date.

The denominator is the number of ICIS-Air facilities with a CMS Source Category of Title V Major or Mega-site when the historic CMS data was captured on December 1 that either had an FCE that occurred during the review fiscal year or were due for an FCE during the review fiscal year. As with the numerator, the FCE occurrence date is the Actual End Date and, if the facility was removed from a CMS plan during the review fiscal year, the facility is included in the denominator if the FCE occurred prior to the CMS Plan Removal Date, but not if the FCE occurred after the CMS Plan Removal Date or if there was no FCE performed. CMS Plan Removal Dates that occur after September 30 of the review fiscal year do not factor into the metric logic.
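The CMS Plan Removal Date rules above are the trickiest part of metric 5a, so the sketch below shows one simplified reading of them: an FCE counts only if it occurred before any removal date, and a facility removed during the review fiscal year without a qualifying FCE drops out of the denominator. The facility record layout is an assumption for illustration, not the ICIS-Air schema.

```python
# Simplified sketch of metric 5a (FCE coverage at CMS majors and mega-sites).
# Field names are illustrative assumptions, not the ICIS-Air schema.
from datetime import date

CATEGORIES_5A = ("Title V Major", "Mega-site")

def qualifying_fce(fac: dict, fy_start: date, fy_end: date) -> bool:
    """FCE completed during the review FY and before any CMS Plan Removal Date."""
    fce = fac.get("fce_actual_end_date")
    if fce is None or not (fy_start <= fce <= fy_end):
        return False
    removal = fac.get("cms_plan_removal_date")
    # Removal dates after September 30 of the review FY do not factor into the logic.
    if removal is not None and fy_start <= removal <= fy_end and fce > removal:
        return False
    return True

def in_denominator(fac: dict, fy_start: date, fy_end: date) -> bool:
    if qualifying_fce(fac, fy_start, fy_end):
        return True
    removal = fac.get("cms_plan_removal_date")
    removed_during_fy = removal is not None and fy_start <= removal <= fy_end
    # A facility due for an FCE stays in the denominator unless it was removed from the
    # CMS plan during the review FY without a qualifying FCE (simplified reading).
    return fac.get("fce_due_in_review_fy", False) and not removed_during_fy

def metric_5a(facilities: list[dict], fy_start: date, fy_end: date) -> float:
    universe = [f for f in facilities
                if f["cms_source_category"] in CATEGORIES_5A
                and in_denominator(f, fy_start, fy_end)]
    if not universe:
        return 0.0
    covered = [f for f in universe if qualifying_fce(f, fy_start, fy_end)]
    return 100.0 * len(covered) / len(universe)
```

The same skeleton would apply to metrics 5b and 5c by swapping in the 80% Synthetic Minor or Other/Alternate Facilities source category, with the additional one-year-frequency rule noted under metric 5c.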
Metric 5b FCE coverage: SM-80s
Metric type: Data, Goal
Goal: 100% of commitment
What it measures: Percentage of CMS SM-80s that received an FCE within a negotiated frequency or minimum recommended frequency.
Numerator: number of CMS SM-80 sources where an FCE was completed by the end of the review year;
Denominator: number of CMS SM-80 sources where an FCE was completed, plus those planned but not completed by the end of the review year.
Guidance: For this metric, the universe of sources is based solely on the compliance monitoring source category (CMSC). Neither the source classification nor the operating status is considered. This metric is based on source-specific historic CMS data (CMS Source Category Indicator, CMS Minimum Frequency Indicator, and FCE). This historic data is captured by ICIS-Air on December 1 each year for the previous fiscal year. This metric captures alternative evaluation frequencies. It does not reflect those instances where a PCE has been negotiated in lieu of an FCE; this should be confirmed in ICIS-Air and the percentage of coverage adjusted accordingly in the SRF report.

Specifically, the numerator is the number of ICIS-Air facilities that had an FCE during the review fiscal year and a CMS Source Category of 80% Synthetic Minor when the historic CMS data was captured on December 1. The FCE occurrence date is the evaluation Actual End Date. If the facility was removed from a CMS plan during the review fiscal year, the facility is included in the numerator if the FCE occurred prior to the CMS Plan Removal Date, but not if the FCE occurred after the CMS Plan Removal Date.

The denominator is the number of ICIS-Air facilities with a CMS Source Category of 80% Synthetic Minor when the historic CMS data was captured on December 1 that either had an FCE that occurred during the review fiscal year or were due for an FCE during the review fiscal year. As with the numerator, the FCE occurrence date is the Actual End Date and, if the facility was removed from a CMS plan during the review fiscal year, the facility is included in the denominator if the FCE occurred prior to the CMS Plan Removal Date, but not if the FCE occurred after the CMS Plan Removal Date or if there was no FCE performed. CMS Plan Removal Dates that occur after September 30 of the review fiscal year do not factor into the metric logic.

Metric 5c FCE coverage: minors and synthetic minors (non-SM-80s) that are part of an alternative CMS plan
Metric type: Data, Goal
Goal: 100% of commitment
What it measures: Percentage of minors and synthetic minors (SMs), not including SM-80s, included on an alternative CMS plan that received an FCE within a negotiated frequency.
Numerator: number of CMS minor and synthetic minor (non-SM-80) sources where an FCE was completed by the end of the review year;
Denominator: number of CMS minor and synthetic minor (non-SM-80) sources where an FCE was completed, plus those planned but not completed by the end of the review year.
Guidance: Reviewers should typically only apply this metric when the state/local agency has an Alternative CMS plan approved by EPA. It is usually not necessary to evaluate this metric during the SRF review if the agency is utilizing a traditional CMS plan, although some state/local agencies have included a few minor sources in their traditional CMS plan for specific reasons. The universe of minors and synthetic minors reflects the current classification as a minor or synthetic minor source and the historic CMS Source Category of "Other/Alternate Facilities." This metric is based on source-specific historic CMS data (CMS Source Category, CMS Minimum Frequency, and FCE Actual End Date). If a PCE has been negotiated in lieu of an FCE, this should be confirmed in ICIS-Air and the percentage of coverage adjusted accordingly in the SRF report.

Specifically, the numerator is the number of ICIS-Air facilities that had an FCE during the review fiscal year and a CMS Source Category of Other/Alternate Facilities when the historic CMS data was captured on December 1. The FCE occurrence date is the evaluation Actual End Date. If the facility was removed from a CMS plan during the review fiscal year, the facility is included in the numerator if the FCE occurred prior to the CMS Plan Removal Date, but not if the FCE occurred after the CMS Plan Removal Date.

The denominator is the number of ICIS-Air facilities with a CMS Source Category of Other/Alternate Facilities when the historic CMS data was captured on December 1 that either had an FCE that occurred during the review fiscal year or were due for an FCE during the review fiscal year. As with the numerator, the FCE occurrence date is the evaluation Actual End Date. If the facility was removed from a CMS plan during the review fiscal year, the facility is included in the denominator if the FCE occurred prior to the CMS Plan Removal Date, but not if the FCE occurred after the CMS Plan Removal Date or if there was no FCE performed.
In addition, metric 5c includes all facilities with a CMS Source Category of Other/Alternate Facilities that had a CMS start date that occurred anytime during the review fiscal year and a CMS Frequency of one year, regardless of what ICIS-Air shows as the Next FCE Due Date. These one-year frequency facilities will not be included in the denominator if no FCE was performed and the facility was removed from the CMS plan during the review fiscal year. CMS Plan Removal Dates that occur after September 30 of the review fiscal year do not factor into the metric logic.

Note: Metrics 5c and 5d have been consolidated into the same metric because in ICIS-Air it is not possible to distinguish between non-SM-80 sources and other minor sources.

Metric 5e Reviews of Title V annual compliance certifications completed
Metric type: Data, Goal
Goal: 100% of annual certifications reviewed
What it measures: Percentage of the active Title V universe (regardless of classification) for which the agency has reviewed a Title V annual compliance certification (ACC) during the review year. Active refers to an operating status of either operating (O), temporarily closed (T), or seasonal (I).
Numerator: number of active Title V sources with a Title V ACC reviewed by the agency for the review year;
Denominator: active Title V universe with an ACC due in the review year.
Guidance: Actions where the "Not Defined as Federally-Reportable" or "State or Local rule or regulation that is not federally-enforceable" air programs are the only air programs reported are not included in the metric. Because the metric is limited to the currently active universe of Title V sources, some sources that have permanently closed since the review year will not be captured even if an ACC was reviewed during the review year. For SRF Round 4, the metric logic has been revised to exclude sources that have only recently become a Title V source and for which an ACC is not yet due.

Specifically, the numerator is the number of sources with a current or historic active Title V air program that had at least one TVACC review occur during the SRF review fiscal year. The occurrence date is the earliest Actual End Date within the review fiscal year for the TVACC Receipt/Review compliance monitoring event. The denominator includes sources with a Planned End Date in the review year for the TVACC Due/Received compliance monitoring record, since this is a firm indication from the delegated agency that an ACC is due that fiscal year. For those sources with the Planned End Date blank, if the historical CMS data has a "Title V Major" or "Mega-Site" CMS designation for the fiscal year prior to the review year, the source will be included in the denominator.

Review of annual certifications is integral to the Title V source compliance monitoring program because it provides EPA with the necessary information to validate a facility's compliance. The metric is predicated on all Title V sources submitting an annual compliance certification. The percentage for this metric will be lowered if all Title V sources do not submit an annual certification. Conversely, this metric may reflect an artificially high percentage of annual certifications reviewed (i.e., >100%) if the Title V universe is inaccurate or if the agency is reviewing certifications from the previous year.
Applicable EPA policy/guidance: CAA CMS; Air Stationary Source Compliance and Enforcement Information Reporting (ICR) Supporting Statement (EPA-HQ-OECA-2014-0523).
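A simplified sketch of the metric 5e numerator and denominator rules described above follows. Because the numerator and denominator are built from different tests, the result can exceed 100 percent, matching the caveat in the guidance. The record fields are illustrative assumptions, not the ICIS-Air or ECHO schema.

```python
# Simplified sketch of the metric 5e logic (Title V annual compliance certification reviews).
from datetime import date

def acc_due(src: dict, fy_start: date, fy_end: date) -> bool:
    # Denominator test: a TVACC Due/Received record with a Planned End Date in the review
    # year, or, when that date is blank, a prior-year historic CMS designation of
    # Title V Major or Mega-Site.
    planned = src.get("tvacc_planned_end_date")
    if planned is not None:
        return fy_start <= planned <= fy_end
    return src.get("prior_year_cms_category") in ("Title V Major", "Mega-Site")

def acc_reviewed(src: dict, fy_start: date, fy_end: date) -> bool:
    # Numerator test: at least one TVACC Receipt/Review event with an occurrence date
    # (earliest Actual End Date) in the review fiscal year.
    reviewed = src.get("earliest_tvacc_review_date")
    return reviewed is not None and fy_start <= reviewed <= fy_end

def metric_5e(sources: list[dict], fy_start: date, fy_end: date) -> float:
    title_v = [s for s in sources if s.get("active_title_v", False)]
    denominator = [s for s in title_v if acc_due(s, fy_start, fy_end)]
    numerator = [s for s in title_v if acc_reviewed(s, fy_start, fy_end)]
    if not denominator:
        return 0.0
    # Because the two tests differ, the result can exceed 100% when the Title V universe
    # is inaccurate or prior-year certifications are being reviewed.
    return 100.0 * len(numerator) / len(denominator)
```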
Metric 6a Documentation of FCE elements
Metric type: File, Goal
Goal: 100%
What it measures: Percentage of FCEs in files reviewed that meet the FCE definition in the CMS policy.
Numerator: the number of files with FCE documentation that ensures that a source's compliance status has been evaluated per Section V of the CMS;
Denominator: the number of files reviewed with FCEs.
Guidance: Review each file with an FCE against the FCE definition provided in Section V of the CAA CMS Guidance document and document the findings. The CMS establishes three categories of compliance monitoring: Full Compliance Evaluations, Partial Compliance Evaluations, and Investigations. This metric ensures that the monitoring activity being reported as an FCE meets the definition as provided in Section V of the October 2016 CMS Guidance, that the reported evaluations are thoroughly documented in a timely manner, and that an FCE of a source's compliance status has been conducted. This metric also evaluates the tools and procedures used by the agency to determine that an FCE has been completed.

Metric 6b Compliance monitoring reports (CMRs) or facility files reviewed that provide sufficient documentation to determine compliance of the facility
Metric type: File, Goal
Goal: 100% of CMRs or source files reviewed
What it measures: Percentage of CMRs or source files reviewed that provide sufficient documentation to determine source compliance.
Numerator: the number of CMRs or facility files containing all elements listed in the CMS, Section IX;
Denominator: the number of files reviewed for CMR elements.
Guidance: The CAA CMS, Section IX, lists the elements of a CMR. Agencies are not required to follow a particular format. This metric ensures agencies provide sufficient documentation in the CMR to allow for a compliance determination or include in the facility files the basic elements of the CMR. Review the same files as in metric 6a against the CMR elements as provided in Section IX of the CMS. Use the CAA File Review Checklist to document findings and express results as a percentage. All elements should be present and properly documented for the CMR to be complete. Agencies will have their own methods for completing CMRs. EPA should discuss this with the agency at the beginning of the review to determine which parts of the agency's CMR documentation are consistent with EPA requirements for a complete CMR.
Applicable EPA policy/guidance: CAA CMS Guidance (2016); sample compliance monitoring reports posted on the internet.

Element 3 - Violations

Under this element, EPA evaluates the accuracy of the agency's violation and compliance determinations, and the accuracy of its HPV determinations. Reviewers will evaluate Supporting Data Indicator 8a during the Element 3 analysis. If the reviewer finds that HPV identification rates are significantly lower than the national average, he or she may want to include additional compliance evaluations or violations in the file review to determine whether violations and HPVs are being determined accurately. Metric 7a covers the accuracy of compliance determinations made from compliance evaluations, and metric 8c covers the accuracy of HPV determinations. These metrics will generally form the basis for findings under this element.
Key metrics: 7a, 8a, and 8c

Metric 7a Accurate compliance determinations
Metric type: File, Goal
Goal: 100% of CMRs or source files reviewed
What it measures: Percentage of CMRs or source files reviewed that led to accurate compliance determinations. (This differs from metric 6b, which focuses on whether there is sufficient documentation in the files. Metric 7a examines whether the compliance determination was accurate.)
Numerator: number of CMRs or source files with accurate compliance determinations;
Denominator: the number of CMRs or source files reviewed.
Guidance: Review the CMR or source file to determine if the information and documentation used by the agency to determine compliance was accurately analyzed and reported in ICIS-Air. For example, if a file indicates that an emission unit failed a stack test, the reviewer should check to see that a stack test compliance monitoring record was reported with the Stack Test Status of "Fail." If the file indicates that the failed test was subsequently determined to be a violation, the reviewer should check to see that a Case File was added with the applicable violation type. If the file indicates that an accurate compliance determination was made by the agency, but the violation (HPV or FRV) or other data element (such as a stack test failure) is not recorded accurately in ICIS-Air, this should be captured under metric 2b. However, if the reviewer believes the agency did not appropriately identify an FRV or HPV, this would be captured here under metric 7a. Reviewers should refer to the "Three Year Compliance Status by Quarter" section of the ECHO.gov Detailed Facility Report (DFR). One may also review the results recorded in the "Compliance Monitoring History (5 years)" section of the DFR.

Supporting Data Indicator 7a1 FRV "discovery rate" based on evaluations at active CMS sources
Metric type: Data, Supporting Indicator for metric 7a
What it measures: The percentage of FRVs reported into ICIS-Air at CMS sources active during the review year.
Numerator: number of facilities with an FRV determination date during the review year at active CMS sources;
Denominator: universe of active CMS sources during the review year.
Guidance: Review files that identify FRVs (FRVs and HPVs) and files with violations not designated as FRVs or HPVs. To determine if all violations were accurately identified, compare both the FRVs/HPVs and non-FRVs in the files with the FRV/HPV definitions from the FRV and HPV policies. This indicator is used by the SRF reviewer to provide context and to assist in focusing on files during the file selection process. It may also be used to point toward possible program implementation strengths or deficiencies. This indicator provides context to support metric 7a.
Applicable EPA policy/guidance: Guidance on Federally-Reportable Violations for Clean Air Act Stationary Sources (2014); HPV Policy (2014).

Supporting Data Indicator 8a HPV discovery rate at majors
Metric type: Data, Supporting Indicator for metric 8c
What it measures: HPV "discovery rate" based on active major sources.
Numerator: the universe of active major sources with an HPV reported during the review year;
Denominator: the universe of active major sources.
Guidance: This indicator should be used by the SRF reviewer to assist in focusing on files during the file selection process.
When a CAA program has a very high or low rate of discovering HPVs, the reviewer should ensure that a sufficient number of files are selected to understand whether the high or low rate is attributable to program deficiencies in inspections or violation identification. This indicator may also be used to support findings regarding strengths or deficiencies in inspections or violation identification. Major sources are defined as active if they have an operating status of operating (O), temporarily closed (T), or seasonal (I). The metric is a source count, and each source is counted only once. The universe of major sources is limited to those added to ICIS-Air before the end of the review year with a default classification that corresponds to a major source at the time the data was pulled from ICIS-Air, which generally occurs in January or February after the review year. The metric excludes activities where the only air program(s) reported on the Case File is "Not Defined as Federally-Reportable" or "State or Local rule or regulation that is not federally-enforceable." This indicator provides context to support metric 8c.

Specifically, the numerator is the number of ICIS-Air facilities with an HPV Day Zero during the review fiscal year that occurred at ICIS-Air facilities with a default pollutant classification of Major at any time during the review fiscal year. The denominator is the number of ICIS-Air facilities with a default pollutant classification of Major at any time during the review fiscal year.

Metric 8c Accuracy of HPV determinations
Metric type: File, Goal
Goal: 100% of violations accurately identified
What it measures: Percentage of federally reportable violations (FRVs) reviewed for which an accurate HPV determination (HPV or no HPV) was made.
Numerator: number of FRVs reviewed for which an accurate HPV/non-HPV determination was made;
Denominator: total number of FRVs reviewed.
Guidance: Review files with FRVs that identify HPVs and files with FRVs not designated as HPVs. To determine if all HPVs were accurately identified, compare both the HPVs and non-HPVs in the files with the HPV criteria set forth in the HPV Policy (pp. 3-4).
Note: The universe of files to select from using the file selection tool is all files with: a) FRVs reported to ICIS-Air that become 90 days old during the review year; and b) HPVs reported to ICIS-Air during the review year.
Applicable EPA policy/guidance: Revision of U.S. Environmental Protection Agency's Enforcement Response Policy for High Priority Violations of the Clean Air Act: Timely and Appropriate Enforcement Response to High Priority Violations (2014); CAA National Stack Testing Guidelines (2009).

Metric 13 Timeliness of HPV identification
Metric type: Data, Goal
Goal: 100% of violations identified timely
What it measures: Whether an HPV classification was made within 90 days after the compliance monitoring activity or discovery action that first provides reasonable information indicating a violation of federally-enforceable requirements.
Numerator: number of case files with HPVs that were reported in the review year with a Day Zero within 90 days of the discovery action;
Denominator: number of case files with an Earliest HPV Day Zero date in the review year.
Guidance: The 2014 HPV Policy states that "Day Zero will be deemed to have occurred on the earlier of (1) the date the agency has sufficient information to determine that a violation has occurred that appears to meet at least one HPV criterion or (2) 90 days after the compliance monitoring activity that first provides information reasonably indicating a violation of a federally enforceable requirement." This metric examines the rate of meeting this 90-day timeframe for determining Day Zero. All enforcement agencies must record the Day Zero in ICIS-Air (see HPV Policy, pages 4-5, Section III, paragraph 2). This data metric looks at the number of case files with HPVs, not the number of HPVs. Some case files may have multiple HPVs.
Applicable EPA policy/guidance: Revision of U.S. Environmental Protection Agency's Enforcement Response Policy for High Priority Violations of the Clean Air Act: Timely and Appropriate Enforcement Response to High Priority Violations (2014).

Element 4 - Enforcement

Reviewers use Element 4 to determine the agency's effectiveness in taking timely and appropriate enforcement and in using enforcement to return facilities to compliance. EPA's Information Collection Request (ICR) Supporting Statement for the 2014 Renewal defines formal and informal enforcement actions as follows:

"An informal enforcement action notifies or advises the recipient of apparent deficiencies, findings concerning noncompliance, or that the issuing agency believes one or more violations occurred at the referenced source and provides instructions for coming into compliance. An informal enforcement action offers an opportunity for the recipient to discuss with the issuing agency actions they have taken to correct the violations identified or provide reasons they believe the violations did not occur. An informal enforcement action may include reference to an issuing agency's authority to elevate the matter, and/or liability of the recipient to pay a penalty. This data is intended to ensure that the delegated agency informs the source as soon as possible of the agency's findings so that the source is on notice of the need to promptly correct conditions giving rise to the violation(s) or potential violation(s)."

"A formal enforcement action either requires that a person comply with regulations, requirements, or prohibitions established under the CAA; requires payment of a penalty or establishes an agreement to pay a penalty; initiates an administrative procedure (e.g., file a complaint) or civil action (e.g., referral); or constitutes a civil action. Generally, these actions are referred to as complaints, settlement agreements, compliance or penalty orders, referrals, consent agreements, or consent decrees. In other words, formal enforcement actions have legal consequences if the source does not comply."

The information provided by the Supporting Data Indicators for Element 3 and for this element should be examined when selecting facility files to review. If violation and HPV identification rates are high (data verification metrics 1d2 and 1f1) but enforcement is low (data verification metrics 1e1 and 1g1), reviewers should select a sufficient number of facility files with violations and HPVs to determine whether the low enforcement activity rate is a result of a lack of timely and appropriate enforcement.
If enforcement numbers are high, reviewers should select sufficient facility files with enforcement to determine if those actions were appropriate and returned facilities to compliance. Further, if the rate of addressing HPVs within 180 days is low, or if the percentage of HPVs addressed without formal enforcement is high, reviewers should select sufficient facility files with HPVs to determine why.

Reviewers use metrics 9a (enforcement that returns sources to compliance), 10a (timeliness of addressing or having a Case Development and Resolution Timeline in place), 10b (addressing or removal of HPVs consistent with the HPV Policy), and 14 (HPV Case Development and Resolution Timeline in place that meets requirements of the HPV Policy) to draft findings under this element.

Key metrics: 9a, 10a, 10b, 14

Metric 9a Formal enforcement responses that include required corrective action that will return the facility to compliance in a specified time frame, or the facility fixed the problem without a compliance schedule
Metric type: File, Goal
Goal: 100% of enforcement actions bring sources back into compliance
What it measures: Percentage of formal enforcement actions reviewed that include required corrective actions that will return the source to compliance in a specified time frame, or where the facility fixed the problem without a compliance schedule. This encompasses HPVs and non-HPVs.
Numerator: number of formal enforcement actions reviewed that either include a schedule to return to compliance or where the facility fixed the problem without a compliance schedule;
Denominator: total number of formal enforcement actions reviewed.
Guidance: EPA expects agencies to pursue enforcement actions that will return a source to compliance and deter future noncompliance. Where appropriate, a compliance schedule establishing actions that will return the source to compliance in a specified time frame should be included in enforcement actions. For penalty-only enforcement responses, reviewers should look for documentation that the source returned to compliance or was required to comply by a specified time. Documentation may include items such as the following: closeout memo, note to the file, internal tracking system notation or screenshot, etc. This metric ensures the formal enforcement actions reviewed meet EPA expectations and conform to established EPA policies and guidance. This metric also assesses a delegated agency's implementation of its own policies and guidance, which are required to be consistent with EPA policy and guidance.
Applicable EPA policy/guidance: Revision of U.S. Environmental Protection Agency's Enforcement Response Policy for High Priority Violations of the Clean Air Act: Timely and Appropriate Enforcement Response to High Priority Violations (2014); Clean Air Act Stationary Source Penalty Policy (1991); Information Collection Request Supporting Statement (2016).

Metric 10a Timeliness of addressing HPVs or alternatively having a case development and resolution timeline in place
Metric type: File, Goal
Goal: 100% of HPVs are addressed timely or have a CD&RT in place timely
What it measures: Percentage of HPVs reviewed that were either a) addressed within 180 days of Day Zero, or b) not addressed within 180 days of Day Zero but had a case development and resolution timeline in place within 225 days of Day Zero.
Numerator: number of HPVs reviewed that were either a) addressed within 180 days of Day Zero, or b) not addressed within 180 days of Day Zero but had a case development and resolution timeline in place within 225 days of Day Zero;
Denominator: number of HPVs reviewed.
Guidance: HPVs must be addressed within 180 days of Day Zero, or a case development and resolution timeline (CD&RT) should be in place within 225 days of Day Zero (an additional 45 days beyond the 180-day period). Review all files which include an HPV that reached 180 days old during the review year and determine if each HPV either was addressed within 180 days of Day Zero or had a CD&RT in place at or before 225 days from Day Zero.
Applicable EPA policy/guidance: Revision of U.S. Environmental Protection Agency's Enforcement Response Policy for High Priority Violations of the Clean Air Act: Timely and Appropriate Enforcement Response to High Priority Violations (2014).

Supporting Data Indicator 10a1 Rate of addressing HPVs within 180 days
Metric type: Data, Supporting Indicator for metric 10a
What it measures: Percentage of addressed HPVs that were addressed within 180 days of Day Zero.
Numerator: number of case files with HPVs addressed during the review year that were addressed within 180 days of the Case File's Earliest HPV Day Zero date;
Denominator: number of case files with HPVs addressed during the review year.
Guidance: This indicator would be used by the SRF reviewer to provide perspective for metric 10a, showing the portion of the 10a percentage that represents HPVs addressed within 180 days. This indicator can assist in focusing on files during the file selection process. It may also be used to point toward possible program implementation strengths or deficiencies. The purpose of this indicator is to assess what portion of the HPVs identified by the state that were not otherwise resolved were addressed within 180 days (and therefore did not move into the case development and resolution timeline/consultation scenario).
Applicable EPA policy/guidance: Revision of U.S. Environmental Protection Agency's Enforcement Response Policy for High Priority Violations of the Clean Air Act: Timely and Appropriate Enforcement Response to High Priority Violations (2014).

Metric 10b Percent of HPVs that have been addressed or removed consistent with the HPV Policy
Metric type: File, Goal
Goal: 100% of violations appropriately addressed or removed consistent with the HPV Policy
What it measures: Percent of HPVs that have been addressed or removed (via a no further action determination, lead change, or another removal mechanism) consistent with the HPV Policy.
Numerator: the number of HPVs reviewed that were addressed or removed (via a no further action determination, lead change, or another removal mechanism) consistent with the HPV Policy;
Denominator: the number of HPVs reviewed that were addressed or removed.
Guidance: Review all files which include an HPV and determine if the violation was addressed or removed according to the policy. Actions that are not appropriate under the HPV Policy include actions that are informal, that do not contain an appropriate penalty, or formal actions that do not return the source to compliance or do not contain compliance schedules. HPVs that are compliant with the requirements of the Case Development and Resolution Timeline are not considered here because they are still in the process of being addressed (see metric 14) but are not yet concluded.
Metric 10b: Percent of HPVs that have been addressed or removed consistent with the HPV Policy.
Metric type: File, Goal
Goal: 100% of violations appropriately addressed or removed consistent with the HPV Policy
What it measures: Percent of HPVs that have been addressed or removed (via a no further action determination, lead change, or another removal mechanism) consistent with the HPV Policy. Numerator: number of HPVs reviewed that were addressed or removed (via a no further action determination, lead change, or another removal mechanism) consistent with the HPV Policy. Denominator: number of HPVs reviewed that were addressed or removed.
Guidance: Review all files that include an HPV and determine whether the violation was addressed or removed according to the policy. Actions that are not appropriate under the HPV Policy include informal actions, actions that do not contain an appropriate penalty, and formal actions that do not return the source to compliance or do not contain compliance schedules. HPVs that are compliant with the requirements of the Case Development and Resolution Timeline are not considered here because they are still in the process of being addressed (see Metric 14) but are not yet concluded. This metric does not measure the timeliness of addressing HPVs; that is accomplished via Metric 10a and Indicator 10a1. Rather, this metric assures that the removal action or addressing action adheres to the terms of the HPV Policy in all ways other than timeliness. Through this metric we are not examining whether the HPV was properly resolved, just whether it was properly addressed or removed. The term "another removal mechanism" captures other ways that an HPV can be concluded, such as simply removing the HPV flag in ICIS-Air.
Note: The universe (denominator) of files to be considered for review using the ECHO file selection tool is those HPVs that were addressed or removed during the review year.
Applicable EPA policy/guidance: Revision of U.S. Environmental Protection Agency's Enforcement Response Policy for High Priority Violations of the Clean Air Act: Timely and Appropriate Enforcement Response to High Priority Violations (2014).

Supporting Data Indicator 10b1: Rate of Managing HPVs to Completion without a Formal Enforcement Action
Metric type: Data, Supporting Indicator for Metric 10b
What it measures: Percentage of HPVs managed to completion without a formal enforcement action. Numerator: number of case files with HPVs managed to completion during the review year via "removal," a determination of no further action, lead change, or another mechanism, but not via a formal enforcement action. Denominator: number of case files with HPVs managed to completion during the review year via any mechanism (removal, no further action, lead change, another mechanism, or a formal enforcement action).
Guidance: The SRF reviewer uses this indicator to assist in focusing on files during the file selection process. It provides perspective on the Metric 10b result and may also point toward possible program implementation strengths or deficiencies. It provides information on the resolution of HPVs by mechanisms other than addressing the HPV with a formal enforcement action.

Metric 14: HPV Case Development and Resolution Timeline in Place When Required that Contains Required Policy Elements
Metric type: File, Goal
Goal: 100% of case development and resolution timelines are timely in place and meet, at a minimum, the requirements of the HPV Policy
What it measures: HPVs not addressed or otherwise managed to completion within 180 days of Day Zero have a case development and resolution timeline (CD&RT) in place, and the CD&RT meets the requirements of the HPV Policy. Numerator: number of HPVs reviewed that require a CD&RT (are 225 days old and were not addressed or otherwise managed to completion) and that have a CD&RT that meets the requirements of the HPV Policy. Denominator: number of HPVs reviewed that required a CD&RT (are 225 days old and were not addressed or otherwise managed to completion).
Guidance: Review HPVs with CD&RTs in place to ensure that they were established within 225 days of Day Zero and contain, at a minimum, the following required elements of the policy. The CD&RT needs to include:
1. Pollutant(s) at issue
2. Estimate of the type and amount of any ongoing emissions in excess of the applicable standard
3. Specific milestones for case resolution
   a. Proposed date for the start of settlement negotiations and timeline
   b. Proposed date for commencing an enforcement action
Note: Files reviewed will be selected from the universe of files with HPVs that reach 225 days old during the review year and have not been addressed or otherwise concluded, per the file selection tool logic.
Applicable EPA policy/guidance: Revision of U.S. Environmental Protection Agency's Enforcement Response Policy for High Priority Violations of the Clean Air Act: Timely and Appropriate Enforcement Response to High Priority Violations (2014).
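As a rough illustration of the Metric 14 review, the sketch below tests whether a CD&RT is timely (established within 225 days of Day Zero) and contains the minimum required elements listed above. It is a hypothetical sketch; the field names (pollutants, emissions_estimate, negotiation_start, enforcement_action_date) are invented for this example and are not ICIS-Air data elements.

    # Minimal sketch (illustrative only): does a CD&RT satisfy Metric 14?
    from datetime import date

    def cdrt_meets_metric_14(day_zero, cdrt_established, cdrt):
        """Check timeliness (within 225 days of Day Zero) and the minimum policy elements."""
        timely = (cdrt_established - day_zero).days <= 225
        has_elements = (
            bool(cdrt.get("pollutants"))                          # 1. pollutant(s) at issue
            and bool(cdrt.get("emissions_estimate"))              # 2. estimate of ongoing excess emissions
            and cdrt.get("negotiation_start") is not None         # 3a. settlement negotiation milestone
            and cdrt.get("enforcement_action_date") is not None   # 3b. enforcement action milestone
        )
        return timely and has_elements

    example_cdrt = {
        "pollutants": ["SO2"],
        "emissions_estimate": "estimated 5 tons/yr above the applicable standard",
        "negotiation_start": date(2023, 9, 1),
        "enforcement_action_date": date(2024, 1, 15),
    }
    print(cdrt_meets_metric_14(date(2023, 1, 1), date(2023, 7, 1), example_cdrt))  # True

In an actual file review, each element would be confirmed from the documents in the file rather than from structured data; the sketch only shows how the timeliness test and the element checklist combine into a single pass/fail result per HPV.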
Element 5: Penalties

Element 5 evaluates penalty calculation and collection documentation using three metrics: 11a for examining documentation of the calculation of the gravity and economic benefit components of a penalty, 12a for documentation of any difference between the initial and final penalty calculations, and 12b for documentation of penalty collection. Reviewers can gauge the level of penalty activity in a state for the review year using the CAA Dashboard, which provides information on the number of penalties imposed and their dollar values.

Key metrics: 11a, 12a, 12b

Metric 11a: Penalty calculations reviewed that document gravity and economic benefit
Metric type: File, Goal
Goal: 100% of penalty calculations include and document gravity and economic benefit components as appropriate
What it measures: Percentage of penalty calculations reviewed that document and include, where appropriate, calculations of gravity and economic benefit. The numerator is the number of penalties reviewed where the penalty was appropriately calculated and documented; the denominator is the total number of penalties reviewed.
Guidance: Agencies should document penalties sought, including the calculation of gravity and economic benefit where appropriate. With regard to this documentation, the Revisions to the Policy Framework for State/EPA Enforcement Agreements (1993) say the following: EPA asks that a State or local agency make case records available to EPA upon request and during an EPA audit of State performance. All recordkeeping and reporting should meet the requirements of the quality assurance management policy and follow procedures established by each national program consistent with the Agency's Monitoring Policy and Quality Assurance Management System.... State and local recordkeeping should include documentation of the penalty sought, including the calculation of economic benefit where appropriate. It is important that accurate and complete documentation of economic benefit calculations be maintained to support defensibility in court, enhance the Agency's negotiating posture, and lead to greater consistency. The CAA Stationary Source Civil Penalty Policy (1991) also specifies that, to achieve deterrence, a penalty should not only recover any economic benefit of noncompliance but also include an amount reflecting the seriousness of the violation, which is the gravity component. The gravity component includes the size of the business and the duration and seriousness of the violation. The CAA Penalty Policy goes on to say: In individual cases where the Agency decides to mitigate the economic benefit component, the litigation team must detail those reasons in the case file and in any memoranda accompanying the settlement. Delegated agencies may use their own penalty policies and either EPA's computerized model, known as BEN, or their own method to calculate economic benefit consistent with national policy. State and local programs should provide documentation in the file reflecting how the gravity and economic benefit values assessed in the penalty were derived. In addition, if no economic benefit is assessed as part of the penalty, a written rationale should be reflected in the file [e.g., "EB was below the de minimis threshold" (which should be identified), or "the violation was a paperwork violation with minimal cost"].
Applicable EPA policy/guidance: Clean Air Act Stationary Source Civil Penalty Policy (1991). Oversight of State and Local Penalty Assessments: Revisions to the Policy Framework for State/EPA Enforcement Agreements (1993). Revised Policy Framework for State/EPA Enforcement Agreements (1986).
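To make the Metric 11a ratio concrete, the sketch below checks whether each penalty file documents the gravity component and either documents economic benefit or records a written rationale for not assessing it. This is a purely illustrative sketch; the file fields (gravity_documented, eb_documented, eb_rationale) are hypothetical names and do not correspond to ICIS-Air or ECHO data elements.

    # Minimal sketch (illustrative only): Metric 11a from a set of penalty files reviewed.
    penalty_files = [
        {"gravity_documented": True,  "eb_documented": True,  "eb_rationale": None},
        {"gravity_documented": True,  "eb_documented": False, "eb_rationale": "EB below de minimis threshold"},
        {"gravity_documented": False, "eb_documented": False, "eb_rationale": None},
    ]

    def documents_penalty_components(f):
        # Gravity must be documented; economic benefit must either be documented
        # or accompanied by a written rationale for why none was assessed.
        return f["gravity_documented"] and (f["eb_documented"] or f["eb_rationale"] is not None)

    numerator = sum(1 for f in penalty_files if documents_penalty_components(f))
    metric_11a = 100.0 * numerator / len(penalty_files)
    print(f"Metric 11a: {metric_11a:.0f}% (goal: 100%)")  # here: 67%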
"EB was below de minimis threshold" (which should be identified), or "the violation was a paperwork violation with minimal cost"]. Applicable EPA policy/guidance: Clean Air Act Stationary Source Civil Penalty Policy (1991). Oversight of State and Local Penalty Assessments: Revisions to the Policy Framework for State/EPA Enforcement Agreements (1993). Revised Policy Framework for State/EPA Enforcement Agreements (1986) Metric 12a Documentation of rationale for difference between initial penalty calculation and final penalty Metric type: File, Goal Goal: 100% What it measures: Percentage of penalty calculations reviewed that document the rationale for the final value assessed when it is different than the initial calculated value. The numerator is the number of penalty calculations reviewed that document the rationale for the final value assessed compared to the initial value calculated. The numerator also includes those penalty calculations reviewed where there is no difference between the initial and final penalty. The denominator is the total number of penalty calculations reviewed. Guidance: According to the Revisions to the Policy Framework for State/EPA Enforcement Agreements (1993), states should document any adjustments to the initial penalty including a justification for any differences between the initial and final assessed penalty. Review penalty files to identify initial and final penalties. If only one of the two penalty calculations is found in the file, ask the agency why the initial and final assessed penalty calculations are not both documented, along with the rationale for any differences. Metric 12b Penalties collected Metric type: File, Goal Goal: 100% of files with documentation of penalty collection ------- CAA Plain Language Guide What it measures: Percentage of penalty files reviewed that document collection of penalty. The numerator is the number of assessed penalties with documentation of collection, or documentation of measures to collect a delinquent penalty; the denominator is the number of assessed penalties reviewed. Guidance: This metric assesses whether the assessed penalty was collected. Begin by looking in the file for a cancelled check or other correspondence documenting transmittal of the check. If this documentation is not in the file, ask the agency if they can provide proof of collection through the data system of record. If the penalty has not been collected, there should be documentation either in the file or in the data system of record that the agency has taken appropriate follow-up measures. Note: This metric evaluates whether the final penalty was collected, and whether this information is documented in the file. Reviewers should not make judgements concerning the penalty amount assessed or collected, or any downward or upward trends in penalty collection, as this is not the focus of this metric. Applicable EPA policy/guidance: Clean Air Act Stationary Source Civil Penalty Policy (1991). Oversight of State and Local Penalty Assessments: Revisions to the Policy Framework for State/EPA Enforcement Agreements (1993). 
Appendix: Acronyms

ICIS-Air  Integrated Compliance Information System for Air
BACT  Best Available Control Technology
CAA  Clean Air Act
CMR  Compliance Monitoring Report
CMS  Compliance Monitoring Strategy
CMSC  Compliance Monitoring Source Category
DFR  Detailed Facility Report
ECHO  Enforcement and Compliance History Online (http://www.epa-echo.gov/echo)
EPA  U.S. Environmental Protection Agency
FCE  Full Compliance Evaluation
FRV  Federally Reportable Violation
FY  Federal Fiscal Year (Oct. 1 - Sept. 30)
HPV  High Priority Violation
ICR  Information Collection Request
LAER  Lowest Achievable Emissions Rate
MOA  Memorandum of Agreement
MOU  Memorandum of Understanding
MDR  Minimum Data Requirement
NOV  Notice of Violation
NPM  National Program Manager Guidance
NSR  New Source Review
PCE  Partial Compliance Evaluation
PPA  Performance Partnership Agreement
PPG  Performance Partnership Grant
PSD  Prevention of Significant Deterioration
SM-80  Synthetic Minor sources that emit or have the potential to emit at or above 80 percent of the Title V major source threshold
SRF  State Review Framework