State Review Framework
Bay Area Air Quality Management District
California
Clean Air Act
Implementation in Federal Fiscal Year 2016
U.S. Environmental Protection Agency
Region 9, San Francisco
Final Report
March 11, 2019
Executive Summary
Introduction
The U.S. Environmental Protection Agency (EPA) Region IX Air & TRI Enforcement Section conducted a
State Review Framework (SRF) enforcement program oversight review of the Bay Area Air Quality
Management District (Air District) of California.
The State of California divides stationary source air quality regulatory work geographically among 35 air districts. The Bay Area Air District, created in 1955, regulates stationary sources within nine counties in the San Francisco Bay Area: Alameda County, Contra Costa County, Marin County, Napa County, the City and County of San Francisco, San Mateo County, Santa Clara County, southern Sonoma County, and southwestern Solano County. The Air District is governed by a 24-member Board of Directors
which oversees its policies and has the authority to develop and enforce regulations within its jurisdiction. The
Board of Directors also appoints the Air District's Executive Officer/Air Pollution Control Officer, who
implements policies and manages staff, as well as the Office of the District Counsel, who directs the Air
District's legal affairs. Air District staff consists of engineers, inspectors, planners, scientists, and other
professionals.
The Air District has the responsibility to issue permits, conduct inspections, issue administrative enforcement
actions, and issue administrative abatement or enforcement orders at its public hearing board. In some cases,
judicial enforcement actions are referred to the State Attorney General. The Air District implements aspects of
the Title V program that include requirements under facility permit conditions, rules adopted under its State
Implementation Plan, New Source Performance Standards, National Emission Standards for Hazardous Air
Pollutants, Maximum Achievable Control Technology, as well as targeting and inspections at industrial
facilities, reviewing annual compliance certifications, performing and overseeing source testing and monitoring
at emission sources, and pursuing enforcement cases.
EPA bases SRF findings on data and file review metrics, and conversations with program management and
staff. EPA will track recommended actions from the review in the SRF Tracker and publish reports and
recommendations on the EPA ECHO web site.
Areas of Strong Performance
The Air District evaluates air Compliance Monitoring Strategy (CMS) sources on a more frequent basis
than the minimum evaluation frequencies recommended in the CMS Policy.
The CMS source universe is accurate.
Cases with enforcement penalties: The Air District has a state-mandated penalty policy that is consistent with EPA's and takes economic benefit into consideration in its penalty calculations. The Air District continues to account for economic benefit as it implements state requirements, helping to ensure a level playing field.
Priority Issues to Address
The following are the top-priority issues affecting the state program's performance:
Data Reporting/Timeliness: While the Air District consistently provided dates of inspections at its
facilities, information on informal and formal actions taken to return facilities to compliance was
missing.
Lack of Federally-Reportable Violations (FRV) and High Priority Violations (HPV) reporting: As with data reporting and timeliness, the Air District consistently provided dates of inspections at its facilities, but information on FRVs and HPVs was missing.
I. Background on the State Review Framework
The State Review Framework (SRF) is designed to ensure that EPA conducts nationally consistent oversight. It
reviews the following local, state, and EPA compliance and enforcement programs:
Clean Water Act National Pollutant Discharge Elimination System
Clean Air Act Stationary Sources (Title V)
Resource Conservation and Recovery Act Subtitle C
Reviews cover:
Data: completeness, accuracy, and timeliness of data entry into national data systems
Inspections/Evaluations: meeting inspection/evaluation and coverage commitments, inspection (compliance monitoring) report quality, and report timeliness
Violations: identification of violations, determination of significant noncompliance (SNC) for the CWA and RCRA programs and high priority violators (HPV) for the CAA program, and accuracy of compliance determinations
Enforcement: timeliness and appropriateness, returning facilities to compliance
Penalties: calculation including gravity and economic benefit components, assessment, and collection
EPA conducts SRF reviews in three phases:
Analyzing information from the national data systems in the form of data metrics
Reviewing facility files and compiling file metrics
Developing findings and recommendations
EPA builds consultation into the SRF to ensure that EPA and the state/local understand the causes of issues and
agree, to the degree possible, on actions needed to address them. SRF reports capture the agreements developed
during the review process in order to facilitate program improvements. EPA also uses the information in the
reports to develop a better understanding of enforcement and compliance nationwide, and to identify issues that
require a national response.
Reports provide factual information. They do not include determinations of overall program adequacy, nor are
they used to compare or rank state/local programs.
Each state/local program is reviewed once every four years. The first round of SRF reviews began in FY 2004.
The third round of reviews began in FY 2013 and will continue through FY 2017.
II. SRF Review Process
Review period: FY 2016
Key dates:
Kickoff letter sent to the Air District: June 29, 2017
CAA data metric analysis and file selection list sent to the Air District: August 4, 2017
On-site CAA file review: September 25-27, 2017
Draft report sent to the Air District: October 2018
Report finalized: March 2019
State and EPA key contacts for review:
BAAQMD
Wayne Kino, Director of Enforcement, Compliance and Enforcement Division
Juan Ortellado, Air Quality Program Manager, Compliance and Enforcement Division
Jeffrey Gove, Supervising Air Quality Specialist, Compliance and Enforcement Division
EPA Region IX
Matt Salazar, Manager, Air & TRI Office, Enforcement Division, Region IX
Andrew Chew, Case Developer/Inspector, Air & TRI Office, Enforcement Division, Region IX
David Basinger, Case Developer/Inspector, Air & TRI Office, Enforcement Division, Region IX
Jennifer Sui, ICIS-Air Coordinator, Information Management Section, Enforcement Division, Region IX
Elizabeth Walsh, Office of Compliance, Office of Enforcement and Compliance Assurance
III. SRF Findings
Findings represent EPA's conclusions regarding state/local performance. They are based on the data and/or file reviews and may also be informed by:
Annual data metric reviews conducted since the previous state/local SRF review
Follow-up conversations with state/local agency personnel
Review of previous SRF reports, Memoranda of Agreement, or other data sources
Additional information collected to determine an issue's severity and root causes
There are three categories of findings:
Meets or Exceeds Expectations: The SRF was established to define a base level or floor for enforcement
program performance. This rating describes a situation where the base level is met and no performance
deficiency is identified, or a state/local performs above national program expectations.
Area for State/Local Attention: An activity, process, or policy that one or more SRF metrics show as a minor
problem. Where appropriate, the state/local should correct the issue without additional EPA oversight. EPA
may make recommendations to improve performance, but it will not monitor these recommendations for
completion between SRF reviews. These areas are not highlighted as significant in an executive summary.
Area for State/Local Improvement: An activity, process, or policy that one or more SRF metrics show as a
significant problem that the agency is required to address. Recommendations should address root causes. These
recommendations must have well-defined timelines and milestones for completion, and EPA will monitor them
for completion between SRF reviews in the SRF Tracker.
Whenever a metric indicates a major performance issue, EPA will write up a finding of Area for State/Local
Improvement, regardless of other metric values pertaining to a particular element.
The relevant SRF metrics are listed within each finding. The following information is provided for each metric:
Metric ID Number and Description: The metric's SRF identification number and a description of
what the metric measures.
Natl. Goal: The national goal, if applicable, of the metric, or the CMS commitment that the state/local
has made.
Natl. Avg: The national average across all states, territories, and the District of Columbia.
State N: For metrics expressed as percentages, the numerator.
State D: The denominator.
State % or #: The percentage, or if the metric is expressed as a whole number, the count.
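As an illustration of how these fields relate, the short sketch below computes the State % value from State N and State D for a percentage-based metric. The numbers used are hypothetical and are not drawn from any finding in this report.

    # Illustrative only: deriving a percentage-based SRF metric value.
    # The inputs below are hypothetical, not actual SRF data.
    state_n = 23   # State N: activities meeting the metric criterion (numerator)
    state_d = 27   # State D: activities reviewed (denominator)

    state_pct = 100.0 * state_n / state_d if state_d else 0.0
    print(f"State % = {state_pct:.1f}%")   # 85.2% in this hypothetical case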
Clean Air Act Findings
Element 1 Data
Finding 1-1 Area for State Improvement
Summary The SRF File Review indicated information reported into ICIS-Air was
not consistent with the information found in the files reviewed.
Explanation Review Metric 2b evaluates the completeness and accuracy of reported
Minimum Data Requirements (MDRs) in the ICIS-Air reporting system.
Timeliness is measured using the date the activity is achieved and the date
it is reported to ICIS-Air. While the national goal for accurately reported
data in ICIS-Air is 100%, we found that, with the exception of facility identifiers and Full Compliance Evaluation (FCE) dates, none of the other data reviewed in the files was accurately reported. To elaborate, facility
identifiers that were related to facility information (names, addresses,
contact phone numbers, Compliance Monitoring Strategy information,
pollutants, operating status, etc.) were correctly reported. Dates of FCE
performed, when applicable, were also correctly reported. However,
information and activity data related to steps taken after the performance
of FCEs were missing (e.g., stack test results were not reported to ICIS-
Air). EPA reiterates the importance of accurate, complete, and timely reporting; non-reporting deprives the public of information and transparency and can be misleading.
Our review of ICIS-Air indicated that there were no HPVs reported. Upon
review of case files and conversation with staff, we learned that this
circumstance was due to an overall failure to identify (and therefore
report) HPVs.
Metric 3b1 measures the timeliness of reporting compliance-related
MDRs (FCEs and Reviews of Title V Annual Compliance Certifications).
Across 30 facilities, 30 FCEs were performed and 16 Title V Annual Compliance Certification (ACC) reviews were completed; none of these activities was reported within 60 days (0.0%). The national goal is 100%.
Metric 3b2 evaluates whether stack test dates and results are reported within 120 days of the stack test; the national goal is to report 100% of all stack tests within 120 days. Of the nine stack tests we selected for review, none were reported (0%), which is below the national goal.
Metric 3b3 measures timeliness for reporting enforcement-related MDRs
within 60 days of the action. No actions were reported by the Air District,
despite numerous informal and formal enforcement actions documented
in their case files and databases. As a result, no enforcement MDRs were reported within 60 days (0%), which is below the national goal of 100%.
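To illustrate how the timeliness metrics described above are evaluated, the sketch below checks whether individual MDR entries fall within the applicable reporting window (60 days for compliance monitoring and enforcement MDRs, 120 days for stack tests). It is a simplified illustration using hypothetical dates, not the actual ICIS-Air business logic.

    from datetime import date

    # Reporting windows described under Metrics 3b1, 3b2, and 3b3 (in days).
    WINDOWS = {"FCE": 60, "ACC_REVIEW": 60, "STACK_TEST": 120, "ENFORCEMENT": 60}

    def reported_timely(activity_type, activity_date, reported_date):
        """Return True if the MDR was entered within the applicable window.
        An activity that was never reported (reported_date is None) is untimely."""
        if reported_date is None:
            return False
        return (reported_date - activity_date).days <= WINDOWS[activity_type]

    # Hypothetical records: (activity type, date achieved, date entered in ICIS-Air).
    records = [
        ("FCE", date(2016, 3, 1), date(2016, 4, 15)),   # reported within 60 days
        ("STACK_TEST", date(2016, 5, 10), None),        # never reported
    ]
    timely = sum(reported_timely(*r) for r in records)
    print(f"{timely} of {len(records)} MDRs reported timely "
          f"({100.0 * timely / len(records):.1f}%)")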
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
2b - Accurate MDR Data in ICIS-Air | 100% | -- | 0 | 31 | 0.0%
3a2 - Untimely Entry of HPVs | 100% | 18.4% | 0 | 0 | 0.0%
3b1 - Timely Reporting of Compliance Monitoring MDRs | 100% | 68.3% | 0 | 46 | 0.0%
3b2 - Timely Reporting of Stack Test Dates and Results | 100% | 63.8% | 0 | 9 | 0.0%
3b3 - Timely Reporting of Enforcement MDRs | 100% | 61.3% | 0 | -- | 0.0%
State Response As conveyed in the November 8, 2018, conference call with EPA Region
IX staff, the Air District is committed to addressing the data
reporting/timeliness findings identified in the SRF draft report. The Air
District will be implementing the following improvements and will
provide a draft plan of the items mentioned to EPA Region IX for review
and approval within 60 days of issuance of the final SRF report:
a) Two additional Enforcement staff will be added to enter
compliance monitoring and enforcement activity data into ICIS-
Air
b) Compliance monitoring and enforcement data entry into ICIS-Air will occur on a monthly frequency to ensure reporting timeliness (e.g., HPV, FRV, etc.)
c) Compliance monitoring activity data will include Stack Testing
and Annual compliance certifications (ACC)
d) Federally reportable violations (FRV) and high priority violations
(HPV) are appropriately entered into ICIS-Air. [Note: The Air
District will be looking to EPA for guidance on timeliness
reporting of FRV, HPV, informal enforcement and formal
enforcement actions as it relates to Air District's notice of
violation resolution process.]
e) Informal and formal enforcement actions are appropriately entered
into ICIS-Air (see above note).
The Air District is receptive to monthly or quarterly conference calls with
EPA Region IX to discuss minimum data requirements (MDRs) and
compliance monitoring reporting topics. Additionally, EPA training on
reporting MDRs into ICIS-Air would be helpful to four Enforcement staff
members responsible for ICIS-Air data entry.
Recommendation EPA recommends that within 60 days of issuance of the
final report, the Air District should provide to EPA Region
IX for review and approval a draft plan describing how it
will address data entry and reporting issues. The Air District
and EPA will commence monthly or quarterly conference
calls to discuss MDRs, including compliance monitoring-
related reporting. If requested, EPA will provide training on
reporting MDRs into ICIS-Air. Once the Air District begins
implementing the plan, Region IX will review the reported
data throughout FY 2019. If the data is timely, complete,
and accurate, the recommendation will be deemed
completed at the end of the Fiscal Year.
Element 2 Inspections/Evaluations
Finding 2-1
Meets or Exceeds Expectations
Summary
The Air District has a correct listing of CMS source universe [number of
Majors, Synthetic Minor-80s (SM80s), and Mega-Sites], and meets goals
for inspection coverage.
Explanation
This Element evaluates whether the negotiated frequency for compliance
evaluations is being met for each source. The Air District met the national
goal for the relevant metrics.
The Air District met the negotiated frequency for conducting Full
Compliance Evaluations of Title V Major Sources, Mega-Sites, and
SM80s. The Air District ensured each major source was evaluated with an
FCE once every two years, each Mega-Site once every three years, and
each SM80 once every five years.
EPA commends the Air District for completing full compliance evaluations at major facilities, an impressive accomplishment given the geographic spread and complexity of the sources it regulates. The Air District goes beyond the minimum frequencies and inspects sources more often than EPA's CMS policy requires. The Air District kept its CMS plan up to date, maintained its database files on its CMS source universe, and updated ICIS-Air correctly (adhering to the CMS evaluation frequency). Out of 31 facility files reviewed, only two CMS sources were not properly identified as a Title V Major Source or SM80 in the Air District's database. We believe the Air District will have corrected this by the time the final report is issued.
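A minimal sketch of the kind of frequency check described above, using the evaluation intervals cited in this finding (majors every two years, Mega-Sites every three, SM80s every five). The source names, classes, and dates are hypothetical; the Air District's actual scheduling logic may differ.

    from datetime import date, timedelta

    # Minimum FCE intervals, in years, as described in this finding.
    FCE_INTERVAL_YEARS = {"MAJOR": 2, "MEGA_SITE": 3, "SM80": 5}

    def fce_due(source_class, last_fce, as_of):
        """Return True if a source is due (or overdue) for a Full Compliance Evaluation."""
        interval = timedelta(days=365 * FCE_INTERVAL_YEARS[source_class])
        return as_of - last_fce >= interval

    # Hypothetical sources and their most recent FCE dates.
    sources = {"Plant A": ("MAJOR", date(2014, 6, 1)), "Plant B": ("SM80", date(2013, 9, 15))}
    for name, (cls, last) in sources.items():
        print(name, "due" if fce_due(cls, last, date(2016, 9, 30)) else "not due")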
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
5a - FCE Coverage Majors | 100% | 86.6% | 28 | 28 | 100%
5b - FCE Coverage SM80s | 100% | 91.6% | 3 | 3 | 100%
5c - FCE Coverage CMS non-SM80s | 100% | 79.8% | 0 | 0 | 0.0%
State Response
Recommendation None required.
Element 2 Inspections/Evaluations
Finding 2-2
Area for State Improvement
Summary
The Air District completed the required reviews for each Title V Annual
Compliance Certification (ACC); however, the Air District had not
reported its universe into ICIS-Air.
Explanation The Air District failed to report any of its Title V ACCs into ICIS-Air, as
required under Element 2. This Element evaluates whether the delegated
agency has completed the required review for Title V Annual Compliance
Certifications. The Air District completed reviews of 23 out of 28 Title V Annual Compliance Certifications for the sources selected. There were no records to
indicate that the remaining 5 reviews had been completed. Furthermore, of
the 23 completed reviews, only 14 were shown to have been completed
within 60 days of receipt.
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
5e - Review of TV ACCs | 100% | 69.4% | 23 | 28 | 82.1%
State Response The Air District has completed the review of all Title V ACCs to ensure the data have been entered into ICIS-Air. As indicated in the response for
Element 1, Finding 1-1, Title V ACCs data will be included in the
compliance monitoring activity data entered into ICIS-Air. [Note: in a call
on January 30, 2019, Air District staff indicated all ACCs were reviewed
and the data entered.]
Recommendation In EPA's draft report, we recommended that the Air District determine if
all required reviews of Title V Annual Compliance Certifications were
performed during the review period and thereafter, input the data that was
described under the recommendations for Element 1, as well as report all
ACCs. The Air District has indicated the above is now complete as of the
date of this report. Actions to ensure continuing compliance with this
reporting element should be included within a plan as recommended under
Element 1 within 60 days of issuance of the final report.
Element 2 Inspections/Evaluations
Finding 2-3 Area for State Attention
Summary Overall, the Air District compliance monitoring reports (CMRs) provided were adequate, but adding relevant information would make them more useful to inspectors.
Explanation Some reports, such as those for stack tests or tank inspections, lacked
sufficient information to allow an understanding of what steps or
recommendations were needed after an inspector had completed his or her
review. For example, tank inspection reports did not include a
determination of compliance.
Inspection reports did not include descriptions of enforcement history
which is considered a "basic element" that should be included (as
discussed in the CMS Policy). The District report format/template should
be updated to include an enforcement history section.
The statement of a facility being "in compliance" should be removed from
inspection reports (CMRs) and instead language stating "no violations seen
at this time" should be used. Inspectors should continue citing observations
and recommendations in their reports.
Twenty-seven Air District compliance monitoring reports were reviewed
under this Element. Reviewers found 23 inspections were fully
documented, and 4 were missing FCE Elements.
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
6a - Documentation of FCE Elements | 100% | -- | 23 | 27 | 85.0%
6b - CMRs/Sufficient Documentation to Determine Compliance | 100% | -- | 23 | 27 | 85.0%
State Response
Recommendation None required.
Element 3 Violations
Finding 3-1 Area for State Improvement
Summary In general, compliance determinations were accurately made; however,
they were not reported into ICIS-Air based on the CMRs reviewed and
other reviewed compliance monitoring information (i.e., Title V ACCs,
stack test reports, NOVs, and RCAs).
Explanation Metric 7a is designed to evaluate the overall accuracy of compliance
determinations and Metric 8c focuses on the accurate identification of
violations that are determined to be High Priority Violations (HPVs).
For 7a, in 18 out of 18 reviewed compliance determinations, there was
enough information to show that the Air District made appropriate
compliance determinations.
During the period addressed by this review, the Air District had not been
reporting any violations as HPVs or Federally Reportable Violations
(FRVs). In our review of six case files, all appeared to contain more than
adequate information to make a determination of HPV, and should have
been timely reported as such, in accordance with EPA policy. This concern
was discussed with the Air District staff, who said they were aware of both
policies. In those discussions, we learned that data non-reporting and
missing HPV determinations were a result of resource constraints as the
District had been working on building out and transitioning to a new
permit database system. As this important project has been completed, the
Air District will have resources available to report all compliance
determinations and FRVs/HPVs into ICIS-Air.
The Air District did not differentiate between FRVs and HPVs. HPVs are a subset of FRVs and, as more significant
violations that meet the HPV criteria, are treated differently, and must be
reported accordingly into ICIS-Air. Failure to do so runs counter to the
MDRs/reporting requirements. Identifying HPV violations according to
EPA policy could help identify appropriate corrective actions to be taken
and improve their timeliness.
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
7a - Accurate Compliance Determinations | 100% | -- | 18 | 18 | 100%
8c - Accuracy of HPV Determinations | 100% | -- | -- | -- | --
State Response As indicated in the response for Element 1 recommendations (Finding 1-1),
the Air District plans to submit a draft plan describing how the data
reporting/timeliness findings identified in the SRF draft report will be
addressed. The draft plan will be provided to EPA Region IX for review
and approval within 60 days of issuance of the final report.
Air District Notices of Violations (NOVs) resolved for the review period
(FY16) have been entered into ICIS-Air. Based on guidance from the
previous EPA Region IX ICIS-Air coordinator, the NOVs were entered
into ICIS-Air as an informal enforcement action and followed by a formal
enforcement action entry with an administrative order, penalty information
and final order date. [Note: on September 20, 2016, Air District staff met
with the EPA R9 ICIS-Air Coordinator for a "Year End Check-in
Meeting." One of the discussion items was around HPVs and the best way
to enter them into ICIS-Air. The majority of the NOVs issued by the Air District are categorized as HPVs; Air District staff asked for guidance on the least burdensome method to enter these HPVs into ICIS-Air. Guidance was
given to enter the NOVs into ICIS-Air as an informal and formal
enforcement action.]
Recommendation The Air District must ensure that all enforcement responses
(Formal Notices of Violations; field citations; warnings; and
informal NOVs) are reported into ICIS-Air as required in the ICR
within 90 days of the final SRF report being issued. All staff and
managers should be provided copies of the FRV and HPV policies.
All FRVs and HPVs need to be reported consistent with EPA
policy.
Reiterating our recommendation under Finding 1-1, the Air District
should develop a plan that details a process to address FRV/HPV
determinations along with other reporting issues. The Air District
should provide the plan to Region IX within 60 days of issuance of
the final report. The plan must adequately resolve the weaknesses
on FRV/HPV determinations, as well as timeliness and
completeness in reporting.
As stated in the HPV Policy, Region IX will have conference calls
with the Air District to discuss potential HPVs (as well as any
issues concerning FRVs and CMS implementation). These will
occur on a regular basis (monthly calls) to discuss any relevant
reporting issues.
Region IX will be reviewing FRV/HPV determinations/reporting
throughout FY2019. If the reporting is accurate, the
recommendation will be deemed completed at the end of 2019.
Element 4 Enforcement
Finding 4-1 Area for State Improvement
Summary The six enforcement actions available for review in this period did not require corrective action timelines that would demonstrate the facilities' return to compliance. Based on a review of the case files, EPA believes the Air District took timely and appropriate steps in formal enforcement to address these violations. However, the Air District did not perform HPV determinations consistent with the policy.
Explanation EPA reviewed several case files that recorded formal enforcement actions for various source categories. The Air District has a varied source universe, and EPA commends the Air District for its enforcement responses. However, the Air District failed to document how the facilities returned to compliance. The Air District should fully document that all enforcement responses (formal Notices of Violation; field citations; warnings; informal NOVs; settlements and corrective actions) return facilities to compliance and are sufficient to be an appropriate response.
Metric 10a is designed to evaluate the extent to which the agency takes
timely action to address HPVs. The Air District did not code violations as
HPVs, though file reviews indicated instances where an HPV designation
would have been appropriate. The Air District did not adhere to the 2014 HPV Policy, and inspectors did not recognize when violations met the HPV criteria and should have been identified and reported as HPVs.
Metric 10b is designed to evaluate the extent to which the agency takes
appropriate enforcement responses for HPVs. Although the enforcement
response was appropriate, the Air District did not identify the HPVs
consistent with policy.
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
9a - Formal Enforcement Returns Facilities to Compliance | 100% | -- | 0 | 0 | 0.0%
10a - Timely Action Taken to Address HPVs | -- | -- | 0 | 0 | 0.0%
10b - Appropriate Enforcement Responses for HPVs | -- | -- | 0 | 0 | 0.0%
State Response
Recommendation Reiterating our recommendation under Finding 3-1, the Air District should
develop a plan that details a process to address FRV/HPV determinations
along with other reporting issues. The Air District should provide the plan
to Region IX within 60 days of issuance of the final report. A plan must
adequately resolve the weaknesses on FRV/HPV determinations, as well as
timeliness and completeness in reporting.
As stated in the HPV Policy, Region IX will have conference calls with the
Air District to discuss potential HPVs (as well as any issues concerning
FRVs and CMS implementation). These will occur on a regular basis
(monthly calls) to discuss any relevant reporting issues.
Region IX will be reviewing FRV/HPV determinations/reporting
throughout FY2019. If the reporting is accurate, the recommendation will
be deemed completed at the end of 2019.
Element 5 Penalties
Finding 5-1 Meets or Exceeds Expectations
Summary The California Health and Safety Code governs the Air District's penalty
policy, which includes accounting for economic benefit. We believe that
the penalty amounts serve as an effective deterrent to future violations and
that enforcement is handled consistently with similar penalties for similar
violations.
Explanation Our File Review and interview with the District Counsel representative
indicated that the penalties the Air District assessed accounted for
economic benefit. Economic benefit is important to include in the penalty because it accounts for the monetary benefit a facility gains by not implementing the measures required to meet regulations.
Metric 12a is designed to evaluate the extent to which the agency
documents the rationale for the difference between initial and final penalty.
In the three cases reviewed with the District Counsel's representative, we
found that the initial penalty amounts reflected economic benefit and
gravity, with reasonable adjustments made before final penalty amounts
were settled.
Metric 12b is designed to evaluate whether there is documentation that the
final penalty was collected. Upon request for several case files and
discussion with the District Counsel representative, we reviewed copies of
District documentation that showed its receipt of penalty payments.
Furthermore, the District Counsel representative affirmed that the penalty
calculations developed under California state statute and implemented
through the Air District's penalty policy incorporated an economic benefit
component and gravity.
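As an illustration of the penalty structure discussed above (an economic benefit component plus a gravity component, with documented adjustments between the initial and final penalty), the sketch below shows a generic calculation with hypothetical figures. It is not the Air District's or California's actual penalty formula.

    def assess_penalty(economic_benefit, gravity, adjustment_factor=1.0):
        """Generic illustration: the initial penalty recovers economic benefit plus gravity;
        the final penalty reflects documented adjustments (e.g., cooperation, ability to pay)
        while retaining the full economic benefit component."""
        initial = economic_benefit + gravity
        final = economic_benefit + gravity * adjustment_factor
        return initial, final

    # Hypothetical case: $12,000 avoided costs, $20,000 gravity, 25% gravity reduction documented.
    initial, final = assess_penalty(12_000, 20_000, adjustment_factor=0.75)
    print(f"Initial penalty: ${initial:,.0f}; final penalty: ${final:,.0f}")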
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
11a - Penalty Calculations Reviewed that Document Gravity and Economic Benefit | 100% | -- | -- | -- | 100.0%
12a - Documentation of Rationale for Difference Between Initial and Final Penalty | 100% | -- | 3 | 3 | 100.0%
12b - Penalties Collected | 100% | -- | -- | -- | 100.0%
State Response
Recommendation None required.
State Review Framework
California
Clean Water Act
Implementation in Federal Fiscal Year 2016
U.S. Environmental Protection Agency
Region 9, San Francisco
Final Report
March 11, 2019
Executive Summary
Introduction
EPA Region 9 enforcement staff conducted a State Review Framework (SRF) enforcement
program oversight review of the State of California's NPDES compliance and enforcement
program. The review included an examination of facility files at the Santa Ana Regional Water
Quality Control Board (Santa Ana Regional Board, or RB8) in California. Data metrics were
evaluated both on a statewide basis and separately for RB8.
As in past California SRFs, EPA conducted the file review at one or two regional boards while reviewing available statewide performance metrics. The State of California divides water quality regulatory work by watershed among nine semi-autonomous Regional Water Quality Control Boards (RWQCBs), which function in partnership with the State Water Resources Control Board.
Each RWQCB consists of Governor-appointed board members and a regulatory office headed by
an Executive Officer. Each individual RWQCB has the responsibility to issue permits, conduct
inspections, manage compliance data, issue administrative enforcement actions, and refer
judicial enforcement actions to the State Attorney General. Permits and administrative
enforcement orders are issued by the Boards at public hearings, typically held monthly. The
RWQCBs regulate all aspects of the NPDES program including pretreatment, stormwater,
SSO/CSOs, animal feeding operations, non-point source, watershed management, water quality
certification, basin planning, TMDL development, as well as State-mandated non-NPDES
programs for irrigated lands, discharges to ground waters, site clean-ups, and septic systems.
Standard operating procedures are consistent across regional boards and indicative of overall
state performance. Previous SRFs have conducted file reviews at the San Francisco and San
Diego RWQCBs (SRF Round 2), and at the Los Angeles and Central Valley RWQCBs (SRF
Round 1).
EPA bases SRF findings on data and file review metrics, and conversations with program
management and staff. EPA will track recommended actions from the review in the SRF Tracker
and publish reports and recommendations on EPA's ECHO web site.
Areas of Strong Performance
Inspection coverage at major, minor, and most stormwater facilities exceeds
commitments in the state-specific CMS plan.
Significant non-compliance at major facilities is below the national average.
Entry of major facility permit and effluent limits exceeds expectations.
Penalty calculations are well-documented and penalties were consistently collected.
Priority Issues to Address
The following are the top-priority issues affecting the state program's performance:
Timely and appropriate CWA enforcement taken to return facilities in Significant
Noncompliance to compliance.
Most Significant CWA-NPDES Program Issues
Data on facility information, inspections, violations, and enforcement actions are not reported completely and accurately as required.
Inspection report timeliness is unclear, with many stormwater inspection reports lacking documentation of the date the report was finalized or delivered to the facility.
Single event violations are not consistently reported.
I. Background on the State Review Framework
The State Review Framework (SRF) is designed to ensure that EPA conducts nationally
consistent oversight. It reviews the following local, state, and EPA compliance and enforcement
programs:
Clean Water Act National Pollutant Discharge Elimination System
Clean Air Act Stationary Sources (Title V)
Resource Conservation and Recovery Act Subtitle C
Reviews cover:
Data: completeness, accuracy, and timeliness of data entry into national data systems
Inspections: meeting inspection and coverage commitments, inspection report quality, and report timeliness
Violations: identification of violations, determination of significant noncompliance (SNC) for the CWA and RCRA programs and high priority violators (HPV) for the CAA program, and accuracy of compliance determinations
Enforcement: timeliness and appropriateness, returning facilities to compliance
Penalties: calculation including gravity and economic benefit components, assessment, and collection
EPA conducts SRF reviews in three phases:
Analyzing information from the national data systems in the form of data metrics
Reviewing facility files and compiling file metrics
Developing findings and recommendations
EPA builds consultation into the SRF to ensure that EPA and the state understand the causes of
issues and agree, to the degree possible, on actions needed to address them. SRF reports capture
the agreements developed during the review process in order to facilitate program improvements.
EPA also uses the information in the reports to develop a better understanding of enforcement
and compliance nationwide, and to identify issues that require a national response.
Reports provide factual information. They do not include determinations of overall program
adequacy, nor are they used to compare or rank state programs.
Each state's programs are reviewed once every five years. The first round of SRF reviews began
in FY 2004. The third round of reviews began in FY 2013 and will continue through FY 2017.
II. SRF Review Process
Review period: FY 2016. California's inspection coverage was evaluated on the State Fiscal
Year 2016 (July 1, 2015 to June 30, 2016). Data metrics were evaluated on the Federal Fiscal
Year 2016 (October 1, 2015 to September 30, 2016).
Key dates: Field Review - July 2017
Draft Report - September 2018
Final Report - March 2019
State and EPA key contacts for review:
CWA EPA Contacts: Michael Weiss (EPA Region 9), Greg Gholson (EPA Region 9)
CWA State Contact: Matthew Buffleben (State Water Resources Control Board)
III. SRF Findings
Findings represent EPA's conclusions regarding state performance and are based on findings
made during the data and/or file reviews and may also be informed by:
Annual data metric reviews conducted since the state's last SRF review
Follow-up conversations with state agency personnel
Review of previous SRF reports, Memoranda of Agreement, or other data sources
Additional information collected to determine an issue's severity and root causes
There are three categories of findings:
Meets or Exceeds Expectations: The SRF was established to define a base level or floor for
enforcement program performance. This rating describes a situation where the base level is met
and no performance deficiency is identified, or a state performs above national program
expectations.
Area for State Attention: An activity, process, or policy that one or more SRF metrics show as
a minor problem. Where appropriate, the state should correct the issue without additional EPA
oversight. EPA may make recommendations to improve performance, but it will not monitor
these recommendations for completion between SRF reviews. These areas are not highlighted as
significant in an executive summary.
Area for State Improvement: An activity, process, or policy that one or more SRF metrics
show as a significant problem that the agency is required to address. Recommendations should
address root causes. These recommendations must have well-defined timelines and milestones
for completion, and EPA will monitor them for completion between SRF reviews in the SRF
Tracker.
Whenever a metric indicates a major performance issue, EPA will write up a finding of Area for
State Improvement, regardless of other metric values pertaining to a particular element.
The relevant SRF metrics are listed within each finding. The following information is provided
for each metric:
Metric ID Number and Description: The metric's SRF identification number and a
description of what the metric measures.
Natl Goal: The national goal, if applicable, of the metric, or the CMS commitment that
the state has made.
Natl Avg: The national average across all states, territories, and the District of Columbia.
State N: For metrics expressed as percentages, the numerator.
State D: The denominator.
State % or #: The percentage, or if the metric is expressed as a whole number, the count.
Clean Water Act Findings
CWA Element 1 Data
Metric lb: Completeness of permit limit and discharge data in EPA's ICIS database.
Finding 1-1 Meets or Exceeds Expectations
Summary The state meets or exceeds EPA's expectations for coding major facility
permit limits and entering Discharge Monitoring Report (DMR) data in
EPA's Integrated Compliance Information System (ICIS), EPA's national
database.
Explanation Metrics 1b1 and 1b2 measure the state's rate of entering permit limits and
DMR data into ICIS.
According to EPA's data metric analysis, California entered 83.9% of
permit limits in ICIS for major facilities state-wide as indicated in the
values presented for metric 1b1 below. This analysis, however, understates California's true permit limit entry rate by including dozens of facilities in the calculation even though their permits lack effluent limits. The 1b1 metric analysis includes 20 municipal separate storm sewer system (MS4) permits and 20 expired permits, none of which require permit limit entry in ICIS. MS4 permits often have narrative rather than numeric effluent limits, which, unlike the limits in more traditional NPDES permits, cannot easily be entered into a database.
Similarly, the calculated permit limit entry rate for the Santa Ana Regional
Water Quality Control Board (89.5%) incorrectly included two MS4
permits. Were the MS4 and expired permits to be excluded, the permit
limit entry rate for the Santa Ana Regional Board and California statewide
would likely meet EPA's national goal of 95% permit limit entry rate.
California enters 99.0% of DMR data into ICIS, exceeding both EPA's
national goal and the national average DMR data entry rates. The Santa
Ana Regional Board has a 99.8% DMR entry rate.
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
1b1 Permit limit rate for major facilities in California (state-wide) | >95% | 91% | 208 | 248 | 83.9%
1b1 Permit limit rate for major facilities in Regional Water Board 8 (Santa Ana) | >95% | -- | 17 | 19 | 89.5%
1b2 DMR entry rate for major facilities in California (state-wide) | >95% | 97% | 13,815 | 13,955 | 99.0%
1b2 DMR entry rate for major facilities in Regional Water Board 8 (Santa Ana) | >95% | -- | 1,236 | 1,239 | 99.8%
State response
Recommendation
None.
CWA Element 1 Data
Metric 2b: Completeness and accuracy of inspections and enforcement action data in EPA's
ICIS database.
Finding 1-2
Area for State Improvement
Summary
Only fourteen percent of files reviewed had complete information reported
to EPA's ICIS database, well below the national goal of 100%.
Explanation
Under Metric 2b, EPA reviewers compared inspection reports and
enforcement actions found in selected files at the Santa Ana Regional
Board to determine if the inspections, inspection findings and enforcement
actions were accurately entered into ICIS. The analysis was limited to data
elements mandated in EPA's ICIS data management policies. States are not
required to enter inspections or enforcement actions for certain classes of
facilities.
EPA found only 5 of the 35 files reviewed (14.3%) in RB8 had all the
required information (facility location, inspection dates, violations, and
enforcement action information) accurately entered into ICIS when
compared with data in California's Integrated Water Quality System
(CIWQS). CIWQS is a computer system used by the State and Regional
Water Quality Control Boards to track inspections, manage permits, and
oversee enforcement activities. California also uses CIWQS as its
electronic file for storage of inspection reports and enforcement
documents. The data in CIWQS presented a more complete record of
actual State inspections and enforcement actions for comparison to ICIS.
Failure to record violations and enforcement actions in ICIS was among the most frequently cited data accuracy issues for the Santa Ana Regional Board. This issue was also identified in the California Round 2 SRF 2012 Report, which reviewed the San Francisco and San Diego Regional Boards, and it had still not been fully resolved at the time of the file review. None of
RB8's industrial or construction stormwater or CAFO inspections
reviewed in CIWQS were recorded in ICIS. Prior to July 2017, stormwater and CAFO inspections (i.e., general permit inspections) were entered manually in ICIS only when resources were available. This issue has been addressed, and since July 2017 all inspections are entered in CIWQS, which then routinely uploads to ICIS.
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
2b Files reviewed where data are accurately reflected in the national data system for Regional Water Board 8 (Santa Ana) | 100% | -- | 5 | 35 | 14.3%
State response In July 2017, the Water Boards implemented a compliance data flow from
its California Integrated Water Quality System (CIWQS) and Stormwater
Multiple Application and Report Tracking System (SMARTS) databases to
ICIS, which fixed many of the incomplete data issues identified in the
Draft Report. In general, new inspection, violation and enforcement action
records in the State databases are reflected in ICIS; however, business rule
differences and data entry errors may result in less than 100%
completeness in ICIS for certain permitting scenarios. For example, new
NPDES permit enrollees under an administratively extended permit are
entered into CIWQS and SMARTS, yet such entries are not permitted to be
entered into ICIS.
By March 31, 2019, the Water Boards will develop an audit framework to
ensure that the records in CIWQS and SMARTS are: (1) consistent with
records in ICIS, or (2) identified and purposefully excluded due to business
rule differences.
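A minimal sketch of the kind of record reconciliation such an audit framework might perform: comparing state database records against ICIS and separating true gaps from records purposefully excluded by business rules. The record identifiers and the exclusion rule shown here are hypothetical, not the Water Boards' actual audit design.

    # Hypothetical record sets keyed by a shared identifier (e.g., permit ID + activity date).
    ciwqs_smarts_records = {"CA0001-2016-03-01", "CA0002-2016-04-12", "CA0003-2016-05-20"}
    icis_records = {"CA0001-2016-03-01"}
    # Records excluded on purpose (e.g., enrollees under an administratively extended permit).
    excluded_by_rule = {"CA0003-2016-05-20"}

    missing_in_icis = ciwqs_smarts_records - icis_records - excluded_by_rule
    print("Records needing entry or correction in ICIS:", sorted(missing_in_icis))
    print("Records excluded by documented business rules:",
          sorted(excluded_by_rule & ciwqs_smarts_records))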
Recommendation While EPA acknowledges that data management requires resources from
state enforcement agencies also engaged in inspections and other
compliance activities, EPA also understands the importance of data
management for monitoring compliance activities and achieving progress
towards national goals.
By June 30, 2019, California should ensure all relevant information,
including facility location, inspection, violation, and enforcement action
information is entered into ICIS accurately and in accordance with EPA's
data entry requirements and eReporting Rule. This is especially significant
for facilities covered under general NPDES permits and for non-major
noncompliance categorization.
By June 30, 2019, the State Board will investigate, address, or create a plan
to address the data flow problems contributing to missing data in ICIS.
EPA will include this as a standing agenda topic during regular meetings
with the state to track progress and ensure California is meeting its CWA
section 106 grant workplan commitments for ICIS-NPDES data
management.
CWA Element 2 Inspections
4a Metrics: Inspection coverage compared to State Workplan commitments.
Finding 2-1
Area for State Attention
Summary
The State met most inspection commitments in its Clean Water Act section 106 grant Workplan but fell short of its commitments for SSO inspections and Phase II MS4 inspections.
Explanation
The 4a metrics measure the number of inspections completed by the State
overall in the State Fiscal Year 2016 compared to the commitments in
California's Clean Water Act section 106 grant Workplan. EPA Region 9
established 106 Workplan inspection commitments for California
consistent with the inspection frequency goals outlined in EPA's 2014 CWA NPDES Compliance Monitoring Strategy (CMS).
Metric 4al measures pretreatment compliance inspections and audits.
During State FY 2016, California's Regional Boards met their Workplan
commitment by completing 41 pretreatment compliance inspections or
audits at the 92 publicly owned treatment works (POTW) pretreatment
programs in California. The State has a goal of conducting, for each approved active POTW pretreatment program, one Pretreatment Compliance Audit (PCA) and at least two Pretreatment Compliance Inspections (PCIs) during each five-year permit term. Metric 4a2
measures inspections of Significant Industrial Users (SIUs). For Metric
4a2, California relies on an EPA-managed in-kind-services contract to
complete pretreatment inspections of Industrial Users, including SIUs
discharging to non-authorized POTWs. The data needed for Metric 4a2 is
segmented among the separate Regional Water Boards and non-authorized
POTWs and is not readily accessible. The Regional Boards typically
delegate this responsibility to the non-authorized receiving POTW as a
requirement of their NPDES Permit/Waste Discharge Requirement (WDR)
(as a "POTW Mini-Program" as described in the EPA Memorandum
Oversight of SIUs Discharging to POTWs without Approved Pretreatment
Programs).
Because there are only two Combined Sewer Systems in California, Metric
4a4 has a high percent of completion. Under metric 4a5, California is
expected to annually inspect at least five percent of sanitary sewage
collection systems subject to its general Waste Discharge Requirement
(WDR) for sewage collection systems (Order No. 2006-0003-DWQ).
During State FY 2016, California inspected 18 (1.6%) of its 1,093 sanitary
sewer systems.
California met its 106 Workplan CMS inspection commitments for most stormwater inspection categories. Per the Workplan, the Regional Water Boards must perform an on-site audit of all Phase I and II MS4 permittees at least once every ten years, or 10% per year. The State reported completing 39 audits of 316 Phase I MS4 permittees (12%) and 15 audits of 400 Phase II MS4 permittees (4%), falling short in the Phase II category.
According to the Workplan, the Regional Water Boards are expected to
inspect at least 10% of industrial stormwater permittees, 10% of permitted
Phase I construction sites, and at least 5% of permitted Phase II
construction sites each year. The State reported completing 1,941 industrial
stormwater inspections out of 11,583 permittees (17%), and 1,838
construction stormwater inspections out of 8,629 permittees (21%).
Construction site category (i.e., Phase I vs. Phase II) was not tracked during this reporting period.
There are 1,736 medium and large CAFOs throughout California (some
covered by general NPDES permits and most covered by general WDRs).
Regional Boards inspected 26% of the CAFOs, which met the CMS goal of
inspecting large and medium CAFOs at least once every five years (20%
per year).
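A short sketch of the coverage arithmetic used in the 4a metrics: inspections completed divided by the permitted universe, compared against the category's annual goal. The counts, universes, and goals below are taken from this finding; the comparison logic itself is only illustrative.

    # (inspections completed, permitted universe, annual coverage goal) from this finding.
    coverage = {
        "SSO collection systems": (18, 1093, 0.05),
        "Phase I MS4 audits": (39, 316, 0.10),
        "Phase II MS4 audits": (15, 400, 0.10),
        "Industrial stormwater": (1941, 11583, 0.10),
    }
    for category, (done, universe, goal) in coverage.items():
        rate = done / universe
        status = "meets" if rate >= goal else "falls short of"
        print(f"{category}: {rate:.1%} {status} the {goal:.0%} goal")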
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
4a1 Pretreatment compliance inspections and audits | 100% state specific CMS Plan commitment | -- | 41 | 92 | 44.5%
4a2 Significant Industrial User inspections for SIUs discharging to non-authorized POTWs | 100% state specific CMS Plan commitment | -- | -- | -- | Unknown
4a4 Major CSO inspections | 20% of Combined Sewer Systems | -- | 1 | 2 | 50%
4a5 SSO inspections | 5% of Sanitary Sewer Systems | -- | 18 | 1,093 | 1.6%
4a7 Phase I MS4 audits or inspections | 10% of Phase I permittees | -- | 39 | 316 | 12%
4a7 Phase II MS4 audits or inspections | 10% of Phase II permittees | -- | 15 | 400 | 4%
4a8 Industrial stormwater inspections | 10% of industrial SW permittees | -- | 1,941 | 11,583 | 17%
4a9 Phase I and II stormwater construction inspections | 100% state specific CMS Plan commitment | -- | 1,838 | 8,629 | 21%
4a10 Medium and large NPDES CAFO inspections | 100% state specific CMS Plan commitment | -- | 460 | 1,736 | 26%
State response
Recommendation None required.
Element 2 Inspections
Metrics 5a and 5b: Inspection coverage compared to State Workplan commitments.
Finding 2-2
Meets or Exceeds Expectations
Summary
The State met or exceeded inspection commitments in its Clean Water Act
section 106 grant Workplan for major and minor facilities.
Explanation Metrics 5a and 5b measure the number of inspections at major and minor
(non-major) facilities in the State Fiscal Year 2016 compared to the
commitments in California's Clean Water Act section 106 grant Workplan.
EPA Region 9 established Workplan inspection commitments for
California consistent with the inspection frequency goals outlined in EPA's
2014 CWA NPDES Compliance Monitoring Strategy.
Metric 5a1 measures the inspection coverage of NPDES majors, metric 5b1 measures inspection coverage of NPDES non-majors with individual permits (also called minors), and metric 5b2 measures inspection coverage
of NPDES non-majors with general permits. California inspected 124
(46%) major facilities and 60 (26%) minor facilities during the fiscal year,
meeting the CMS-based Workplan commitment to inspect major permittees
at least once every two years and each minor facility at least once during its
five-year permit term.
The State's non-major general permit inspections (metric 5b2) are
described individually in Finding 2-1, under the CMS and State Workplan
commitments for general stormwater and CAFO inspections. The industrial
stormwater, construction stormwater, and CAFO inspections and universes
were summarized into metric 5b2 below, which is well above the national
average.
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
5a1 Inspection coverage of NPDES majors | 100% state specific CMS Plan commitment | 51.9% | 124 | 271 | 46%
5b1 Inspection coverage of NPDES non-majors with individual permits | 100% state specific CMS Plan commitment | 23.9% | 60 | 227 | 26%
5b2 Inspection coverage of NPDES non-majors with general permits | 100% state specific CMS Plan commitment | 5.6% | 4,239 | 21,948 | 19%
State response
Recommendation None required.
Element 2 Inspections
Metric 6a: Quality of inspection reports.
Finding 2-3 Area for State Improvement
Summary Seventy-eight percent of inspection reports reviewed were sufficient to determine compliance. The seven inadequate inspection reports either lacked a narrative description of the inspection findings or were missing from CIWQS or other databases.
Explanation Metric 6a assesses the quality of inspection reports, in particular, whether
the inspection reports provide sufficient documentation to determine the
compliance status of inspected facilities. Twenty-six out of 33 inspection
reports reviewed at the Santa Ana Regional Board were complete and
sufficient to determine compliance in accordance with EPA's 2017 NPDES
Compliance Inspection Manual guidelines. The EPA file reviewers
evaluated RB8 inspection reports in CIWQS. CIWQS is used as an
electronic filing system for inspection reports and enforcement actions.
The EPA reviewers found seven inspection reports were either missing
narrative information or the report was missing from CIWQS entirely.
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
6a Inspection reports complete and sufficient to determine compliance at the facility for Santa Ana Regional Board | 100% | -- | 26 | 33 | 78.8%
State response The Water Boards are committed to improving inspection reports,
including format. The Water Boards will update inspection report guidance
and procedures, and develop staff training by June 30, 2019. The guidance
and procedures will: (1) guide inspectors to develop clearly written
narratives, (2) include requirements to upload the reports into the proper
databases, and (3) track report completion. Subsequently, the Water Boards
will initiate training events and include the updated guidance into the
corresponding administrative procedures manual.
Recommendation By June 30, 2019, the State Board will work with the Regional Boards to:
1) require all inspection reports to include a narrative format that describes
the inspector's observations, across all NPDES platforms (CAFO,
stormwater, pretreatment, etc.), and 2) ensure that all inspection reports are
properly uploaded into CIWQS.
Element 2 Inspections
Metric 6b: Timeliness of inspection reports.
Finding 2-4 Area for State Improvement
Summary Only 18 of the 33 inspection reports reviewed by EPA were dated or
completed within EPA's recommended timeline for completing an
inspection report.
Explanation Metric 6b measures the state's timeliness on completing inspection reports
within the EPA recommended deadlines of 45 days for sampling inspection
reports and 30 days for non-sampling types of inspections. The State did
not have a policy of tracking inspection completion times or a policy
regarding inspection report deadlines. Inspection reports lacking
completion dates, inspection reports bearing dates beyond the
recommended timeliness deadlines, and facility files that have at least one
inspection entered into ICIS with no corresponding inspection report in the
file were all considered as not meeting EPA's guidelines for timely completion of inspection reports.
Based on review of 33 files at RB8, EPA found that many inspection
reports were not dated, which made it difficult to assess the timeliness of
these reports. In the absence of any documentation of report completion
date, such as a cover letter transmitting a report to the discharger, EPA
reviewers assumed that undated reports were not timely. Stormwater
inspection reports were found to be finalized without dates of report
completion, making it difficult to assess the timeliness of facility corrective
actions to address inspection findings and the need for escalated
enforcement response. The State enters its stormwater inspection reports
into its SMARTS database with inspection date but no data field for date of
report completion.
Nine of the 33 inspection reports reviewed were not dated and counted as
not meeting timeliness guidelines. An additional six reports were dated
later than EPA's recommended 30-day deadline.
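A minimal sketch of the timeliness test applied under Metric 6b: 45 days for sampling inspection reports, 30 days for non-sampling inspections, with undated or missing reports treated as untimely. The report types and dates below are hypothetical.

    from datetime import date

    DEADLINES = {"sampling": 45, "non-sampling": 30}  # days, per EPA's recommended timelines

    def report_timely(inspection_type, inspection_date, report_date):
        """Undated or missing reports are counted as not meeting the timeliness guideline."""
        if report_date is None:
            return False
        return (report_date - inspection_date).days <= DEADLINES[inspection_type]

    # Hypothetical reports: (type, inspection date, report completion date or None if undated/missing).
    reports = [
        ("non-sampling", date(2016, 2, 1), date(2016, 2, 20)),  # timely
        ("sampling", date(2016, 3, 5), date(2016, 6, 1)),       # late
        ("non-sampling", date(2016, 4, 10), None),              # undated, counted untimely
    ]
    timely = sum(report_timely(*r) for r in reports)
    print(f"{timely} of {len(reports)} reports timely")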
Relevant metrics
Metric ID Number and Description | Natl Goal | Natl Avg | State N | State D | State % or #
6b Inspection reports completed within prescribed timeframe for Santa Ana Regional Board | 100% | -- | 18 | 33 | 54.5%
State response The Water Boards are committed to improving the content and completion
rate of inspection reports. The Water Boards' update and implementation
of inspection report guidance and procedures, as described in response to
Metric 6A above, will address the timeliness of inspection reports and
dates recorded in the report and corresponding databases.
Recommendation By June 30, 2019, the State Board will work with the Regional Boards to improve the inspection report format to include a report completion date, especially for stormwater inspection reports.
CWA Element 3 Violations
Metrics 7a1, 8b, and 8c: Tracking of single event violations.
Finding 3-1 Area for State Attention
Summary California does not enter single event violations (SEVs) into EPA's ICIS
database as required for major facilities.
Explanation SEVs are violations discovered by means other than the ICIS automated
screening of DMRs for effluent limit and reporting violations. Violations
documented in inspection reports are typically classified as SEVs. Metric
7a1 measures whether SEVs are entered into ICIS. EPA's review of RB8
files revealed that SEVs documented in inspection and enforcement files at
major facilities were not reported consistently in ICIS as required under
EPA's data management policy. The Santa Ana Regional Board did not
report SEVs into ICIS; instead violations that arose from inspections were
noted in CIWQS. SEVs are required to be entered into ICIS for major
facilities and minor facilities that are pretreatment control authorities as
indicated in the December 28, 2007 EPA memorandum, ICIS Addendum to
the Appendix of the 1985 Permit Compliance System Statement (p. 9).
Although California is not entering SEVs in EPA's ICIS database,
California is currently entering SEVs into the main permitted discharger
portion and the SSO portion of their CIWQS state database. The California
State Water Board reviewed state-wide inspections and determined that at least 16 violations were the direct result of inspections and were entered as SEVs in ICIS.
Metric 8b measures the percentage of SEVs accurately identified as SNC
or non-SNC by the state. California generally does not record SEVs in
ICIS and does not flag SEVs as SNC. EPA has established automated and
discretionary criteria for flagging discharger violations as SNC. California
relies on the automated DMR-based criteria to flag effluent limits and
reporting violations as SNC, but does not normally make discretionary
labeling of SEV violations as SNC.
Metric 8c requires timely reporting of SEVs identified as SNC at major
facilities. Regional Board 8 did not record any SEVs identified at majors as
SNC, so the numerator and denominator of this metric were both zero, and
as such the timeliness of such reports could not be gauged. The state is not
meeting the requirements of this metric.
EPA will provide to the State Board guidance materials covering SEV
codes and the minimum data entry requirements for non-DMR violations
identified at major facilities. EPA suggests these materials be disseminated
to staff to encourage proper identification and entry of the codes and proper
application of SNC criteria. In the meantime, EPA encourages the State to
continue use of the SEV codes to track noncompliance at minors, where
helpful. The State Board should implement a quality assurance review for
all inspection reports to ensure SEV codes are identified and entered for
majors per the minimum national standards, and to ensure that basic
facility data is present in both inspection reports and their accompanying
entries into CIWQS and ICIS.
For non-major facilities, approximately 28 (12.3%) were in Category 1 noncompliance while three (1.3%) were in Category 2 noncompliance.
These numbers are likely incomplete and more indicative of further data
issues between CIWQS and ICIS, as discussed above.
Relevant metrics
7a1 Number of major facilities with single event violations (state-wide): State N 16; State D 271; State % 5.9%
7f1 Non-major facilities in Category 1 noncompliance (state-wide): State N 28; State D 227; State % 12.3%
7g1 Non-major facilities in Category 2 noncompliance (state-wide): State N 3; State D 227; State % 1.3%
8b Single-event violations accurately identified as SNC or non-SNC (Santa Ana): State % unknown
8c Percentage of SEVs identified as SNC reported timely at major facilities (Santa Ana): Natl Goal 100%; State N 0; State D 0; State % 0%
State response
Recommendation None Required.
CWA Element 3 Violations
Metric 7e: Accuracy of compliance determinations
Finding 3-2 Meets or Exceeds Expectations
Summary Inspection reports generally provide sufficient information to ascertain
compliance determinations on violations found during inspections.
Explanation Metric 7e measures the percentage of inspection reports reviewed that led to
an accurate compliance determination. The number of inspection reports that
led to accurate compliance determinations (87.9%) is within the acceptable
range of the national goal of 100%. Stormwater program inspection reports
included a detailed narrative component that succinctly described compliance
findings based on site observations.
The reports that did not provide sufficient information were either missing or
did not include a narrative format. Water Board staff should verify that their
inspection reports have been properly uploaded into CIWQS, in their
entirety.
Relevant metrics
7e Inspection reports reviewed that led to an accurate compliance determination (Santa Ana): Natl Goal 100%; State N 29; State D 33; State % 87.9%
State response
Recommendation None.
CWA Element 3 Violations
Metrics 7d1 and 8a2: Major facilities in significant non-compliance.
Finding 3-3 Meets or Exceeds Expectations
Summary The rate of SNC at major facilities is lower than the national average.
Explanation Metric 7d1 measures the percent of major facilities in noncompliance reported in ICIS. State-wide noncompliance at major facilities in California is 74.5% according to information available in data metric 7d1.
Noncompliance at major facilities in the Santa Ana Regional Board is
lower than the state-wide rate with 63.2% of major facilities in
noncompliance. Considering that major facilities in California have
stringent effluent limits, a high frequency of effluent monitoring, many
effluent limit parameters, and that only a single effluent violation places a
major facility in noncompliance, California's rates of noncompliance,
which appear high, are consistent with the national average noncompliance
rate of 73%.
Metric 8a2 measures the percentage of major facilities in significant
noncompliance. Thirty-four of the 271 major facilities in California were in
SNC for one or more quarters during FY2016. The rate of SNC in
California (12.5%) is better than the national average of 20%. Only two
facilities at RB8 were in SNC (10.5%).
Relevant metrics
7d1 Major facilities in noncompliance (state-wide): Natl Avg 73%; State N 202; State D 271; State % 74.5%
7d1 Major facilities in noncompliance (Santa Ana): State N 12; State D 19; State % 63.2%
8a2 Percentage of major facilities in SNC (state-wide): Natl Avg 20%; State N 34; State D 271; State % 12.5%
8a2 Percentage of major facilities in SNC (Santa Ana): State N 2; State D 19; State % 10.5%
State response
Recommendation None required.
CWA Element 4 Enforcement
Metric 9a: Enforcement actions promoting return to compliance
Finding 4-1 Meets or Exceeds Expectations
Summary Enforcement actions reviewed generally promote return to compliance.
Explanation Metric 9a measures the percent of enforcement responses that return or will
return the source to compliance. Fourteen of 17 enforcement actions
reviewed at RB8 resulted in a return to compliance specific to the relevant
NPDES requirement. The finding level is identified as Meets or Exceeds
Expectations because only three enforcement actions did not promote
return to compliance.
In 14 of the 17 enforcement actions reviewed, the EPA reviewers found
either that the enforcement action mandated a return to compliance or
found other documentation in the file indicating that the facility actually
returned to compliance as a result of the RB8 enforcement action. The
actions included a variety of informal (NOVs or notices of noncompliance)
and formal (administrative civil liability actions) enforcement actions, most
often with documented returns to compliance. In three of the 17 actions
evaluated, the EPA reviewers found that the action did not promote a
return to compliance. Each of these cases was either a penalty action or an informal action (i.e., a verbal warning) that did not include a requirement to return to compliance. Although some of these facilities
may have returned to compliance, the EPA reviewers did not find
documentation in the file of return to compliance.
Stormwater enforcement electronic files (i.e. SMARTS) contained additional information useful in verifying facilities' return to compliance.
Specifically, enforcement case files contained copies of required reports,
sampling results, and/or permit application documents developed or
submitted to address the deficiency/violation resulting in the enforcement
action. The Regional Board should include injunctive relief or follow-up
actions in most enforcement actions to ensure facilities have indeed
returned to compliance. Any follow-up actions should be included as records in case files.
Relevant metrics
9a Percentage of enforcement responses that return or will return source in violation to compliance (Santa Ana): Natl Goal 100%; State N 14; State D 17; State % 82.4%
State response
Recommendation None Required.
CWA Element 4 Enforcement
Metrics 10a1 and 10b: Timely and appropriate enforcement actions
Finding 4-2 Area for State Improvement
Summary Enforcement actions taken at major and non-major facilities are often not timely or appropriate. This is a recurring issue from SRF Round 2 of California's NPDES program.
Explanation For this finding, EPA used two metrics (metrics 10a1 and 10b) to evaluate whether California is addressing violations with appropriate enforcement actions and whether California's enforcement responses were taken in a timely manner.
Metric 10a1 was used to assess California's response to SNC-level violations at major facilities. To evaluate metric 10a1, the EPA reviewers
examined each of the 34 major facilities that were in SNC for one or more
quarters during FY16. The reviewers determined whether or not California
took enforcement action against each of the SNC facilities and whether the
action was timely and appropriate. According to EPA's policy, appropriate
actions for SNC violations are formal enforcement actions that require a
return to compliance. The following California enforcement mechanisms
are considered appropriate enforcement: Cease and Desist Orders, Time
Schedule Orders, and Cleanup and Abatement Orders. EPA policy further
dictates that an enforcement action is considered timely if it is issued within 5½ months of the end of the quarter in which the SNC-level violations initially occurred.
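To illustrate that timeliness rule, the sketch below computes the 5½-month deadline from the end of the quarter in which the SNC-level violation first occurred. It is a hedged illustration only: the function names are hypothetical, the half month is approximated as 15 days, and the third-party dateutil library is assumed to be available.

    from datetime import date
    from dateutil.relativedelta import relativedelta

    def snc_action_deadline(quarter_end):
        """Formal enforcement is considered timely if issued within 5 1/2 months
        of the end of the quarter in which the SNC violation first occurred."""
        return quarter_end + relativedelta(months=5, days=15)  # half month approximated as 15 days

    def action_is_timely(quarter_end, action_date):
        return action_date <= snc_action_deadline(quarter_end)

    # Example: SNC first occurred in the quarter ending March 31, 2016
    print(snc_action_deadline(date(2016, 3, 31)))                   # 2016-09-15
    print(action_is_timely(date(2016, 3, 31), date(2016, 10, 1)))   # False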
EPA's review found that only four of the 34 statewide SNC facilities were
addressed with enforcement actions that were both timely and appropriate.
Neither of the two SNC facilities in the Santa Ana Regional Board was addressed with enforcement that was both timely and appropriate.
Nearly all the 34 SNC facilities were addressed with some type of
enforcement, but the actions did not meet EPA's policy for appropriate
actions. Some of the SNC facilities were addressed with penalty actions
such as administrative civil liability actions (ACL) or mandatory minimum
penalties (MMP) and others were addressed with informal actions such as
staff enforcement letters. Penalty actions alone are not considered
appropriate as these actions typically do not mandate a return to
compliance.
Several of the 34 SNC facilities were judged as lacking appropriate action even though the state had elected, for good reason, to forgo enforcement. Four of the 34 facilities were in SNC for a one-time late submittal of DMRs or had no violations listed in CIWQS. EPA understands that the state would not take an enforcement
action in these cases. In addition, there were two facilities on the SNC list
which the State believes were listed as SNC because of DMR reporting
errors.
Finally, the state explained that its enforcement rules and policies make it
nearly impossible for the state to meet EPA's timeliness deadlines. The
State's 2010 Water Quality Enforcement Policy requires escalating
enforcement responses and Regional Water Board hearings for formal
enforcement actions such as a Cleanup and Abatement Order (CAO),
Cease and Desist Order (CDO), or Time Schedule Order (TSO). As a
result, it is difficult for California to issue a formal enforcement action within the 5½-month deadline established by EPA for timely response to SNC violations.
Metric 10b was used to assess California's enforcement response to any
type of violation (SNC or lower level violations) at any type of facility
(major, minor or general permit discharger). EPA's evaluation of metric
10b was based on review of 28 enforcement responses selected from the
Santa Ana Regional Board's files. Each of the 28 enforcement responses
was reviewed to determine whether it met EPA expectations for enforcement
response as provided in EPA's Enforcement Management System (EMS).
The EMS includes the strict expectations cited above for enforcement
response to major facility SNC violations as well as the somewhat more
subjective guidelines for responses to non-SNC violations.
EPA found that 18 of the 28 enforcement responses were appropriate for
the type of violation. These responses included NOVs for minor
deficiencies, with documented follow-up and a return to compliance, or
formal enforcement (ACLs, compliance orders, etc.) for more serious
violations. In ten of the files, however, the EPA reviewers concluded that
the RB8 action was not appropriate for the circumstances. For example,
some facilities had effluent violations from toxic pollutants and the
corresponding enforcement actions were informal, or the enforcement
action did not return the facility to compliance or prevent the facility from
returning to noncompliance (recidivism).
Relevant metrics
10a1 Major facilities with timely action as appropriate (state-wide): State N 4; State D 34; State % 11.8%
10a1 Major facilities with timely action as appropriate (Santa Ana): State N 0; State D 2; State % 0%
10b Enforcement responses reviewed that address violations in an appropriate manner (Santa Ana): Natl Goal 98%; State N 18; State D 28; State % 64.3%
State response Water Board NPDES program staff and Office of Enforcement Staff are
currently working closely with U.S. EPA to reduce the number of
permitted facilities in Significant Non-Compliance. This effort includes
reviewing data procedures and data transfer into ICIS. In addition, the
Water Boards will develop and implement a plan by June 30, 2019 to
improve its enforcement response to be consistent with the EPA's
Enforcement Management System.
Recommendation EPA R9 currently works with the State to ensure facilities in SNC are
brought back into compliance with appropriate and timely enforcement
actions. EPA is prepared to take enforcement if the State is not able to take
enforcement or requests assistance. EPA will continue discussion of major
facilities in SNC as a standing agenda topic during regular meetings with
the state to ensure they are prioritized for swift enforcement.
The State Board will identify cases in which violations have not been
adequately addressed with an enforcement action and will timely refer
them to EPA for enforcement as necessary.
By June 30, 2019, California will adopt and implement a plan to improve
its enforcement response procedures to provide for swift, appropriate
enforcement against facilities in SNC.
CWA Element 5 Penalties
Metrics 11a, 12a, and 12b: Penalty calculation and collection
Finding 5-1 Meets or Exceeds Expectations
Summary Consideration of economic benefit and gravity is well documented in
files reviewed.
Explanation Metric 11a assesses the state's method for calculating penalties and
whether it properly documents the economic benefit and gravity
components in its penalty calculations. For five of six penalties (83%), the Santa Ana Regional Board had adequate documentation in the files supporting the calculation methodology for both economic benefit and gravity.
Metric 12a assesses whether the state documents the rationale for
changing penalty amounts when the final value is less than the initial
calculated value. Documents reviewed in the RB8 files consistently
documented changes between the initial penalty calculations and final
assessed penalties. All but one of the penalty calculations reviewed had
documentation of the rationale for a change between the initial and the
final penalty. The only penalty action that did not meet metrics 11a and
12a was for an illegal discharge to water by an unpermitted facility
where the EPA reviewers could not find the penalty calculations in
CIWQS.
Metric 12b assesses whether the state documents collection of penalty
payments. RB8 files had documentation indicating collection of
assessed penalties in each of the 6 actions reviewed.
Supplemental Environmental Projects (SEPs) were included in several
penalty actions taken by RB8, especially for those cases issued as
Mandatory Minimum Penalties. The SEP value was typically half of the
total penalty settlement and went directly towards an environmental
project within the community impacted by the violations.
Relevant metrics
11a Penalty calculations reviewed that consider and include gravity and economic benefit for Santa Ana Regional Board: Natl Goal 100%; State N 5; State D 6; State % 83.3%
12a Documentation of the difference between initial and final penalty and rationale for Santa Ana Regional Board: Natl Goal 100%; State N 5; State D 6; State % 83.3%
12b Penalties collected for Santa Ana Regional Board: Natl Goal 100%; State N 6; State D 6; State % 100%
State response
Recommendation None.
State Review Framework
California
Department of Toxic Substances Control
Resource Conservation and Recovery Act
Implementation in Federal Fiscal Year 2016
U.S. Environmental Protection Agency
Region 9, San Francisco
Final Report
March 11, 2019
Executive Summary
Introduction
EPA Region 9 enforcement staff conducted a State Review Framework (SRF) enforcement
program oversight review of the California Department of Toxic Substances Control (DTSC).
Data metrics from the Department as a whole were used in preparation of this report, while in-
field file reviews were conducted at three DTSC regional offices (CalCenter, Chatsworth, and
Berkeley).
The California Department of Toxic Substances Control (DTSC) is authorized by EPA to
implement the federal RCRA program. DTSC is located in Sacramento, with field offices in
Berkeley, Clovis, Cypress, Chatsworth, El Centro and San Diego. DTSC employs over 1,000
staff and has an operating budget of approximately $217 million (Region 9's RCRA grant is
$7M).
DTSC's RCRA compliance and enforcement program has been focused on permitted TSDFs
since the 1990s, when California state law established a "unified hazardous waste and hazardous
materials management" program ("Unified Program"). By 1996, the state had authorized all
counties and numerous cities to implement six existing state regulatory programs, including the
Hazardous Waste Generators program.
There are 83 Certified Unified Program Agencies, or CUPAs, fielding over 700 inspectors conducting approximately 80,000 inspections per year. CUPA agencies are almost all county or city health or fire departments. Once certified, CUPAs support their activities through local fees. No EPA grant funds are provided to CUPAs to implement the RCRA program.
As required by state law, each CUPA is evaluated every 3 years by CalEPA and the respective
state program agencies (e.g., DTSC for hazardous waste). Region 9 believes the oversight
program is thorough in identifying CUPA deficiencies and areas of concern.
California's RCRA compliance and enforcement data is migrated to RCRAInfo from DTSC's
EnviroStor database monthly.
EPA bases SRF findings on data and file review metrics, and conversations with program
management and staff. EPA will track recommended actions from the review in the SRF Tracker
and publish reports and recommendations on EPA's ECHO web site.
Areas of Strong Performance:
The quality of DTSC's written inspection reports is above average.
Most Significant RCRA Subtitle C Program Issues:
Completion dates for inspection reports (with violations) are not being coded correctly in
RCRAInfo. Summaries of Violations are issued on site by the DTSC inspector and are
being coded as completed inspection reports. DTSC is in the process of upgrading its
EnviroStor database to address this issue.
I. Background on the State Review Framework
The State Review Framework (SRF) is designed to ensure that EPA conducts nationally
consistent oversight. It reviews the following local, state, and EPA compliance and enforcement
programs:
Clean Water Act National Pollutant Discharge Elimination System
Clean Air Act Stationary Sources (Title V)
Resource Conservation and Recovery Act Subtitle C
Reviews cover:
Data: completeness, accuracy, and timeliness of data entry into national data systems
Inspections: meeting inspection and coverage commitments, inspection report quality, and report timeliness
Violations: identification of violations, determination of significant noncompliance (SNC) for the CWA and RCRA programs and high priority violators (HPV) for the CAA program, and accuracy of compliance determinations
Enforcement: timeliness and appropriateness, returning facilities to compliance
Penalties: calculation including gravity and economic benefit components, assessment, and collection
EPA conducts SRF reviews in three phases:
Analyzing information from the national data systems in the form of data metrics
Reviewing facility files and compiling file metrics
Developing findings and recommendations
EPA builds consultation into the SRF to ensure that EPA and the state understand the causes of
issues and agree, to the degree possible, on actions needed to address them. SRF reports capture
the agreements developed during the review process in order to facilitate program improvements.
EPA also uses the information in the reports to develop a better understanding of enforcement
and compliance nationwide, and to identify issues that require a national response.
Reports provide factual information. They do not include determinations of overall program
adequacy, nor are they used to compare or rank state programs.
Each state's programs are reviewed once every five years. The first round of SRF reviews began
in FY 2004. The third round of reviews began in FY 2013 and will continue through FY 2017.
II. SRF Review Process
Review period: Federal Fiscal Year 2016
Key dates: RCRA File Review: 7/20/17 (CalCenter office), 7/28/17 (Chatsworth
office), and 9/19/17 (Berkeley office)
Draft Report: September 2018
Final Report: March 2019
State and EPA key contacts for review:
DTSC: Denise Tsuji, Kristine Green, Roberto Kou, and Maria Soria.
EPA: John Schofield
III. SRF Findings
Findings represent EPA's conclusions regarding state performance and are based on findings
made during the data and/or file reviews and may also be informed by:
Annual data metric reviews conducted since the state's last SRF review
Follow-up conversations with state agency personnel
Review of previous SRF reports, Memoranda of Agreement, or other data sources
Additional information collected to determine an issue's severity and root causes
There are three categories of findings:
Meets or Exceeds Expectations: The SRF was established to define a base level or floor for
enforcement program performance. This rating describes a situation where the base level is met
and no performance deficiency is identified, or a state performs above national program
expectations.
Area for State Attention: An activity, process, or policy that one or more SRF metrics show as
a minor problem. Where appropriate, the state should correct the issue without additional EPA
oversight. EPA may make recommendations to improve performance, but it will not monitor
these recommendations for completion between SRF reviews. These areas are not highlighted as
significant in an executive summary.
Area for State Improvement: An activity, process, or policy that one or more SRF metrics
show as a significant problem that the agency is required to address. Recommendations should
address root causes. These recommendations must have well-defined timelines and milestones
for completion, and EPA will monitor them for completion between SRF reviews in the SRF
Tracker.
Whenever a metric indicates a major performance issue, EPA will write up a finding of Area for
State Improvement, regardless of other metric values pertaining to a particular element.
The relevant SRF metrics are listed within each finding. The following information is provided
for each metric:
Metric ID Number and Description: The metric's SRF identification number and a
description of what the metric measures.
Natl Goal: The national goal, if applicable, of the metric, or the CMS commitment that
the state has made.
Natl Avg: The national average across all states, territories, and the District of Columbia.
State N: For metrics expressed as percentages, the numerator.
State D: The denominator.
State % or #: The percentage, or if the metric is expressed as a whole number, the count.
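As a purely illustrative aside on how the "State % or #" column is derived for percentage-based metrics (the function below is hypothetical and not part of any EPA data system):

    def metric_percentage(state_n, state_d):
        """Compute the 'State %' value for a percentage-based SRF metric."""
        if state_d == 0:
            return "not calculable (denominator is zero)"
        return f"{100 * state_n / state_d:.1f}%"

    print(metric_percentage(26, 33))  # "78.8%", e.g., 26 of 33 reports meeting a metric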
Resource Conservation and Recovery Act Findings
RCRA Element 1 Data
Finding 1-1 Area for State Improvement
Summary Dates inspection reports are completed are not entered into RCRAInfo
for inspections with violations.
Explanation At the end of an inspection, DTSC issues a Summary of Violations
(SOV) if a violation(s) was observed during the inspection. The date the
SOV is issued in the field is the date entered into RCRAInfo as an
informal written enforcement action (coded "120 Written Informal").
Subsequently, a written inspection report is prepared. The date the
written inspection report is completed and sent to the facility is not being
entered into RCRAInfo.
All DTSC data entry into RCRAInfo is through the agency's EnviroStor
database. The EnviroStor database translates the SOV date as a 120
Written Informal RCRAInfo data entry. Even though the enforcement
code is not correct in RCRAInfo, the RCRAInfo listed date for the SOV
is consistent with the file information.
At the end of the SRF file review, EPA provided DTSC with preliminary
results of the data entry issue observed. As a result of EPA's finding and
other internal issues with the EnviroStor database, the agency is in the
process of upgrading the EnviroStor database to address the above
finding.
Relevant metrics
2b Complete and accurate entry of mandatory data: Natl Goal 100%; Natl Avg N/A; State N 8; State D 33; State % 24.2%
State response DTSC has been using the "120 Written Informal" defined value for
issuance of a summary of violations. The "120 Written Informal" is used
by EPA to identify when a violator has been notified of a violation in
writing. EPA typically provides written notifications by delivery of the
written inspection report. DTSC is required by California statute to
provide a written summary of violations to a violator within 65 days of
the date of the inspection. This is typically a separate written
notification. The federal program has no corresponding requirement.
DTSC is working with Region 9 staff to identify a more appropriate
defined value to use when reporting this event. DTSC is finishing the
upgrade to DTSC's EnviroStor database to transfer all existing summary
of violation dates to a different defined value and all existing inspection
report dates to "120 Written Informal." The upgrade will be completed
after EPA Region 9 staff identify an acceptable alternative defined value
for the summary of violation. DTSC anticipates this transfer will be
completed within 120 days of the final State Review Framework report.
Recommendation DTSC, working with EPA Region 9 staff, should identify a more appropriate defined value for reporting issuance of the summary of violations and complete the EnviroStor upgrade that transfers existing summary of violation dates to that value and existing inspection report dates to "120 Written Informal." DTSC anticipates completing this transfer within 120 days of the final State Review Framework report.
RCRA Element 2 Inspections
Finding 2-1 Meets or Exceeds Expectations
Summary DTSC inspection coverage for Treatment, Storage and Disposal
Facilities (TSDFs) exceeded the national average.
Explanation In California, DTSC is responsible for inspection/enforcement of
TSDFs, used oil recyclers, hazardous waste transporters, and e-waste
management facilities. Hazardous waste generator inspection and
enforcement responsibilities have been delegated by the state legislature
to 81 Certified Unified Program Agencies (CUPAs), such as city or
county fire departments or environmental health departments. DTSC
performs a limited number of generator inspections as part of its CUPA
oversight program or in response to a tip/complaint received regarding the facility.
Relevant metrics
5a Two-year inspection coverage of operating TSDFs: Natl Goal 100%; Natl Avg 90.3%; State N 52; State D 56; State % 92.9%
State response DTSC appreciates EPA's acknowledgement that it exceeded the national
average for performing inspections at operating TSDFs at least every
two years. The national goal is 100%. As part of DTSC's efforts to
improve the inspection and enforcement program, in December 2017,
DTSC added an annual work plan report in EnviroStor. The report will
assist in identifying the facilities' inspection due date based on
inspection frequency for the facilities' workplan type (Treatment,
Storage, Disposal, Post Closure, Transporter, etc.). DTSC will use the
annual work plan report to monitor the progress of inspections during the
state fiscal year.
Recommendation No further action is recommended.
RCRA Element 2 Inspections
Finding 2-2 Meets or Exceeds Expectations
Summary RCRA inspection reports prepared by DTSC were generally well written and contained adequate supporting documentation.
Explanation All the inspection reports reviewed were completed in accordance with
DTSC Policy for Conducting Inspections, DTSC-OP-0005 (January 30,
2009). Each report contains facility information, inspection participants,
description of facility operations, description of permitted areas (if
applicable), files reviewed, observations/violations and appropriate
attachments and photographs to document the observation/violation.
Because DTSC issues an SOV at the conclusion of inspections where violations are observed, the facility is required to address the violation(s) prior to completion of the inspection report. If the facility
has satisfactorily addressed the violation(s), a return to compliance
(RTC) statement will be included in the inspection report and the RTC
date entered into RCRAInfo.
There were some exceptional inspection reports reviewed. For example,
inspection reports prepared for Aerojet, Quemetco and Phibro-Tech
exceeded minimal requirements specified in Policy, DTSC-OP-0005.
However, there were a few reports that contained typos or wrong dates,
which indicates these reports did not receive adequate quality
assurance/quality control review: Travis AFB, Vandenberg AFB
(incomplete EPA ID number), and GEM Rancho Cordova.
Relevant metrics
6a Inspection reports complete and sufficient to determine compliance: Natl Goal 100%; Natl Avg N/A; State N 33; State D 33; State % 100%
State response
DTSC appreciates EPA's acknowledgment that it met the national goal
of 100% of inspection reports prepared that are complete and sufficient
to determine compliance. The Inspection Policy for Conducting
Inspections (DTSC-OP-0005) requires the inspector to submit draft
inspection reports to their supervisor and backup inspector for review. In
addition, DTSC has developed a Supervisor's Report Review Guidance
for EERD supervisors to follow when reviewing staff's inspection
reports. Also, as of July 1, 2018, DTSC has implemented a new
Inspection Report template which streamlines the report writing process.
As a result, DTSC will assure that all inspection reports receive adequate quality assurance/quality control review.
Recommendation No further action is recommended.
RCRA Element 2 Inspections
Finding 2-3 Area for State Attention
Summary Timeliness of completed inspection reports could be improved.
Explanation In accordance with California Health and Safety Code Section 25185(c)(2)(A), DTSC is required to provide a copy of the written inspection report to the facility within 65 days of the inspection.
Metric 6b measures the timeliness of inspection reports. Of the 33
inspection reports reviewed, 24 (72.7%) inspection reports were
completed within the 65-day requirement.
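For illustration only, the sketch below applies the 65-day statutory clock in the way the file review did, comparing inspection and report-completion dates; the helper name and example dates are hypothetical and do not come from DTSC's or EPA's systems.

    from datetime import date

    STATUTORY_DEADLINE_DAYS = 65  # California Health and Safety Code section 25185(c)(2)(A)

    def meets_65_day_requirement(inspection_date, report_date):
        """True if the written inspection report was provided within 65 days of the inspection."""
        return (report_date - inspection_date).days <= STATUTORY_DEADLINE_DAYS

    reports = [(date(2016, 5, 2), date(2016, 6, 20)),   # 49 days -> timely
               (date(2016, 5, 2), date(2016, 8, 1))]    # 91 days -> late
    timely = sum(meets_65_day_requirement(i, r) for i, r in reports)
    print(f"{timely} of {len(reports)} reports timely")  # 1 of 2 reports timely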
At the conclusion of an inspection, DTSC issues an SOV if a violation(s)
is observed or a Summary of Observation (SOO) if no violations are
observed or more investigation of an observation is required. The SOV
provides a concise summary of the violation(s) identified during the
inspection and is issued at the conclusion of the inspection. There are
times when an SOV will be issued shortly after the inspection. This is
done only when an inspector may require additional time to determine if
an observation should be classified as a violation. The SOV is signed by
the inspector and by the facility representative receiving the SOV. The
SOV requires the facility to address the observed violation(s) within a
certain number of days (typically 30 days). If no violations are observed
a signed SOO is left instead.
Note: In 2017, DTSC conducted a LEAN process review of inspection
policies and procedures. One of the goals of the review process was to
improve timeliness of written inspection reports. EPA will discuss the
progress of this LEAN review during routine coordination meetings/calls
with DTSC in FY2018-FY2019.
Relevant metrics
6b Timeliness of inspection report completion: Natl Goal 100%; Natl Avg N/A; State N 24; State D 33; State % 72.7%
State response DTSC acknowledges that it has a state statutory mandate to provide
completed inspection reports in most cases within 65 days of the
inspection (note: CA Health and Safety Code section 25185 subdivision
(c) paragraph (2)(B) states: The time period required by subparagraph
(a) may be extended as a result of a natural disaster, inspector illness,
or other circumstances beyond the control of the department, or the
local officer or agency, if the department or the local officer or agency
so notifies the operator within 70 days from the date of the inspection
and provides the inspection report to the operator in a timely manner
after the reason for the delay is ended). However, the data set that EPA reviewed for FY2016 (24 of 33 inspections or 72.7%) provides a less accurate picture of DTSC's performance at meeting our statutory
mandate. For FY 2016, DTSC conducted a total number of 431
inspections. During this period, 402 inspection reports were completed
within 65 days. This reflects a percentage of 93.2%.
During 2017, DTSC performed a Lean Six Sigma (L6S) project to
streamline the inspection report process. The performance goal for this
project is to issue 95% of inspection reports within 30 days. Before
initiating the L6S project, 67% of the inspection reports were completed
within 30 days. DTSC started implementing the new process in
September 2017. By September 2018, DTSC has reached a rate of 87%
of inspection reports issued within 30 days and 88.7% within 65 days.
DTSC expects to achieve 100% compliance with the statutorily
mandated 65-day inspection report issuance time frame.
In December 2017, DTSC added an inspection report project
management tool to EnviroStor. This project management tool will assist
DTSC inspectors with completing inspection reports by tracking
inspection report milestones identified in DTSC's policy on inspections
(Conducting Inspections [DTSC-OP-0005], dated 6/29/17).
Recommendation No further action is recommended.
RCRA Element 3 Violations
Finding 3-1 Meets or Exceeds Expectations
Summary Files reviewed included accurate compliance determinations and SNC
(significant noncomplier) determinations, when applicable.
Explanation Of the 33 files reviewed with inspection reports, 24 (72.7%) of the
reports identified violations (Class 1 (SNC), secondary, and/or minor).
Of the 24 facilities with violations, 9 (37.5%) of the inspection reports
identified Class 1 (SNC) violations.
All Class 1 (SNC) determinations were made at the conclusion of the inspection, as listed in the SOV (i.e., within 150 days). Except for the
Quemetco Class 1 (SNC) determination, all SNY (significant
violation(s) found) and/or SNN (significant violation(s) has been
addressed) findings were entered correctly into RCRAInfo. There was
no SNY or SNN for Quemetco listed for the Class 1 (SNC) violation(s)
observed during the reporting period.
Relevant metrics
7b Violations found during inspections: Natl Goal N/A; Natl Avg 35.9%; State N 223; State D 607; State % 36.7%
8a SNC identification rate: Natl Goal N/A; Natl Avg 2.1%; State N 30; State D 607; State % 4.9%
8b Timeliness of SNC determinations: Natl Goal 100%; Natl Avg N/A; State N 9; State D 9; State % 100%
8c Appropriate SNC determinations: Natl Goal 100%; Natl Avg N/A; State N 24; State D 24; State % 100%
State response DTSC appreciates EPA's acknowledgement that it meets either the
national goal or exceeds the national average of metrics identified that
measure accuracy of compliance determinations and significant non-
complier determinations. In December 2017, DTSC added a new
violations report for data managers, inspectors, and supervisors in
EnviroStor. The violations report assists users in identifying facilities
with violations that have not been returned to compliance, including
those facilities that are significant noncompliers.
Regarding the designation as SNN (significant violation has been addressed) of a Class 1 violation at Quemetco that has not returned to compliance: Quemetco has disputed the violation. DTSC has filed a civil action
against Quemetco for this and other significant violations. DTSC and
Quemetco are in discussions to resolve the violation and settle the
enforcement action. It was DTSC's understanding that these violations
should retain the SNY (significant violation found) designation until
these violations have returned to compliance. If EPA would like DTSC
to apply the SNY/SNN designations differently, please let DTSC know.
Recommendation No further action is necessary.
RCRA Element 4 Enforcement
Finding 4-1 Meets or Exceeds Expectations
Summary DTSC effectively manages noncompliant facilities with appropriate enforcement responses.
Explanation For inspections where violations are identified, DTSC issues an SOV at
the conclusion of the inspection. The SOV includes a time period for the
facility to address the listed violation(s). DTSC has 65 days to complete
an inspection report. During this period, the facility will submit a
response to the SOV. DTSC includes a summary of the SOV response
in the inspection report and whether or not the identified violation(s) has
been satisfactorily addressed by the facility. If all the violations are
satisfactorily addressed by the facility, there is no required response by
the facility upon receipt of the inspection report.
For formal enforcement actions that have a calculated penalty of less
than $75,000, DTSC will pursue the enforcement action
administratively. For formal enforcement actions with a calculated
penalty greater than $75,000, the action will be referred to the Office of
Attorney General (AG).
Twenty-four of the files reviewed during the period identified Class 1 (SNC), secondary, and/or minor violations. Only one case (Acme Fill Corporation) had not yet been addressed via informal or formal enforcement.
In accordance with EPA's December 2003 Hazardous Waste Civil
Response Policy, enforcement actions with SNC determinations should
be concluded within 360 days of the first date of the inspection. Metric
10a measures timeliness of returning to compliance for violations where
SNC is identified. According to the FY2016 frozen data, there were 19
inspections with SNY determination. Seventeen of the SNY
determinations (89.5%) were concluded within 360 days of first date of
the inspection. The national goal is 80%. The national average is
84.2%. California exceeded the national average.
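As a hedged illustration of how the metric 10a figures above can be derived, the sketch below applies the 360-day test and computes the resulting rate; the function name and example data are hypothetical, not drawn from EnviroStor or RCRAInfo.

    from datetime import date

    def concluded_within_360_days(first_inspection_date, conclusion_date):
        """True if SNC enforcement was concluded within 360 days of the first day of inspection."""
        return (conclusion_date - first_inspection_date).days <= 360

    # Example rate calculation: 17 of 19 SNY determinations concluded in time
    outcomes = [True] * 17 + [False] * 2
    rate = 100 * sum(outcomes) / len(outcomes)
    print(f"{rate:.1f}%")  # 89.5%, compared with the 80% national goal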
Relevant metrics
9a Enforcement that returns violators to compliance: Natl Goal 100%; Natl Avg N/A; State N 27; State D 28; State % 96.4%
10a Timely enforcement taken to address SNC: Natl Goal 80%; Natl Avg 86.4%; State N 17; State D 19; State % 89.5%
State response DTSC appreciates EPA's acknowledgement that it exceeds (89.5%) both
the national goal (80%) and the national average (86.4%) for timely
enforcement actions to address significant non-compliers and fell just
short (96.4%) of the national goal (100%) for enforcement that returns
violators to compliance. DTSC is adding enhancements to EnviroStor
that will help DTSC inspectors efficiently and timely complete
administrative and civil enforcement actions. The project management
tool will track enforcement milestones set in DTSC's policy on
enforcement [Enforcement Response (DTSC-OP-0006), dated 6/29/17].
Recommendation No further action is required.
RCRA Element 5 Penalties
Finding 5-1 Meets or Exceeds Expectations
Summary California includes gravity-based, multiday and economic benefit
components in their penalty calculation procedures.
Explanation Penalty related files are kept separately from the inspection and
enforcement files. Three formal penalty actions were reviewed. Each of
the penalty actions included a worksheet and justification memorandum
that applied each of the penalty components to each violation listed.
Files included differences between initial and final penalty, and also
included documentation that the penalties had been paid.
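A penalty built from these components can be thought of as a per-violation sum of gravity-based, multiday, and economic benefit amounts. The sketch below is a simplified, hypothetical illustration of that structure; the class, field names, and dollar figures are assumptions and do not reflect DTSC's actual worksheet.

    from dataclasses import dataclass

    @dataclass
    class ViolationPenalty:
        gravity: float           # gravity-based component for the violation
        multiday: float          # additional amount for days of continuing violation
        economic_benefit: float  # benefit gained from delayed or avoided compliance

        def total(self):
            return self.gravity + self.multiday + self.economic_benefit

    # Hypothetical worksheet with two violations
    violations = [ViolationPenalty(5000.0, 2000.0, 1500.0),
                  ViolationPenalty(3000.0, 0.0, 500.0)]
    initial_penalty = sum(v.total() for v in violations)
    print(initial_penalty)  # 12000.0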
Relevant metrics
11a Penalty calculations include gravity and economic benefit: Natl Goal 100%; Natl Avg N/A; State N 3; State D 3; State % 100%
12a Documentation on difference between initial and final penalty: Natl Goal 100%; Natl Avg N/A; State N 3; State D 3; State % 100%
12b Penalties collected: Natl Goal 100%; Natl Avg N/A; State N 3; State D 3; State % 100%
State response DTSC appreciates EPA's acknowledgment that it met the national goal
of 100% of enforcement cases with penalties calculated, documented and
collected.
During 2016/17, DTSC performed a Lean Six Sigma (L6S) project to
streamline the issuance of enforcement actions with administrative
penalties under $75,000. The performance goal for this project is to
assess and approve 95% of penalties at $75,000 or less for administrative
cases within 14 days of sending the inspection report to the operator (44
days after the first day of inspection). Before initiating the L6S project, it
took DTSC an average of 259 days to assess a proposed penalty. DTSC
started implementing the new process in September 2017. As DTSC
implemented this L6S project, DTSC did not achieve the desired
improvements and identified process issues that resulted in delays in issuing penalties. DTSC began an additional penalty assessment L6S
project in 2017/18 to address some of these process issues. This most
recent L6S project resulted in major changes in the way DTSC
calculates, reviews and approves administrative and civil penalties.
Starting in October 2018, DTSC began implementation of the new
processes and anticipates further improvements in calculation of
penalties.
Recommendation No further action is necessary.