

-------
Report Contributors:	Rudolph M. Brevard
Vincent Campbell
Eric Jackson Jr.
Christina Nelson
Teresa Richardson
Scott Sammons
Abbreviations
EPA	U.S. Environmental Protection Agency
FISMA Federal Information Security Modernization Act
FY	Fiscal Year
OIG	Office of Inspector General
READ	Registry of EPA Applications, Models and Databases
Cover image: Cybersecurity Framework Security Functions and FY 2016
Inspector General FISMA metric domains. (EPA OIG graphic)
Are you aware of fraud, waste or abuse in an
EPA program?
EPA Inspector General Hotline
1200 Pennsylvania Avenue, NW (2431T)
Washington, D.C. 20460
(888) 546-8740
(202) 566-2599 (fax)
OIG_Hotline@epa.gov
Learn more about our OIG Hotline.
EPA Office of Inspector General
1200 Pennsylvania Avenue, NW (2410T)
Washington, D.C. 20460
(202) 566-2391
www.epa.gov/oig
Subscribe to our Email Updates
Follow us on Twitter @EPAoig
Send us your Project Suggestions

-------
U.S. Environmental Protection Agency
Office of Inspector General
At a Glance
17-P-0044
November 14, 2016
Why We Did This Review
The Office of Inspector General
(OIG) conducted this audit to
evaluate the U.S.
Environmental Protection
Agency's (EPA's) compliance
with the Federal Information
Security Modernization Act of
2014 (FISMA) during fiscal year
2016.
A robust but agile information
security infrastructure is
paramount to combat constant
cybersecurity attacks. Security
officials must understand the
current status of their security
programs and risk factors that
could adversely affect
organizational operations,
assets, employees and external
partnerships.
We reported our audit results
using the CyberScope system
developed by the Department
of Homeland Security.
CyberScope calculates the
effectiveness of an agency's
information security program
based on the responses to the
FISMA reporting metrics.
This report addresses the
following EPA goal or
cross-agency strategy:
• Embracing EPA as a high-
performing organization.
Send all inquiries to our public
affairs office at (202) 566-2391
or visit www.epa.gov/oig.
Listing of OIG reports.
Improvements Needed in EPA's Information
Security Program
What We Found
The EPA's information security function areas
did not meet the defined requirements to be
considered effective. We assessed the following
five Cybersecurity Framework Function areas
and the corresponding metric domains as
specified by the fiscal year 2016 Inspector
General FISMA reporting metrics.
1.	Identify - Risk Management and Contractor Systems.
2.	Protect - Configuration Management, Identity and Access Management,
and Security and Privacy Training.
3.	Detect - Information Security Continuous Monitoring.
4.	Respond - Incident Response.
5.	Recover - Contingency Planning.
We evaluated each security function area using the maturity model. The maturity
model is a tool to summarize the status of an agency's information security
program and to outline what still needs to be done to improve the program.
The maturity model assesses each function area as: Level 1 - Ad-hoc, Level 2 -
Defined, Level 3 - Consistently Implemented, Level 4 - Managed and
Measurable, or Level 5 - Optimized.
The maturity model defines the requirements to meet a particular maturity level,
and the EPA must meet all the requirements of that level before it can progress
to the next higher level within the maturity model. The EPA would need to
achieve Level 4 (Managed and Measurable) for a function area to be considered
effective. The table below summarizes the maturity level the EPA achieved for each function area.
EPA's information security function area maturity

Security function areas                    Maturity level rating
Identify, Protect, Respond, and Recover    Level 3 - Consistently Implemented
Detect                                     Level 2 - Defined
Source: OIG testing results.

More work is needed by the EPA to achieve managed and measurable information security function areas to manage cybersecurity risks.
Appendix A contains the results for the fiscal year 2016 Inspector General FISMA
reporting metrics.
We worked closely with EPA officials and briefed them on the results. Where
appropriate, we updated our analysis and incorporated management's feedback.
EPA agreed with our results. We made no recommendations based on our
analysis.

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
THE INSPECTOR GENERAL
November 14, 2016
MEMORANDUM
SUBJECT: Improvements Needed in EPA's Information Security Program
Report No. 17-P-0044
FROM: Arthur A. Elkins Jr.
TO:	Gina McCarthy, Administrator
This is our final report on the subject audit conducted by the Office of Inspector General (OIG) of the
U.S. Environmental Protection Agency (EPA). The project number for this audit was OA-FY16-0104.
This report contains findings and conclusions that meet the Federal Information Security Modernization
Act of 2014 reporting requirements prescribed by the U.S. Office of Management and Budget and
U.S. Department of Homeland Security. This report represents the opinion of the OIG and does not
necessarily represent the final EPA position.
The EPA office having primary oversight for the areas evaluated in this report is the Office of
Environmental Information.
Action Required
You are not required to provide a written response to this final report. In accordance with Office of
Management and Budget Federal Information Security Modernization Act reporting instructions, we are
forwarding this report to you for submission, along with the agency's required information, to the
Director of the Office of Management and Budget.
We will post this report to our website at www.epa.gov/oig.

-------
Improvements Needed in EPA's
Information Security Program
17-P-0044
Table of Contents

Purpose
Background
Responsible Office
Scope and Methodology
Results of Review

Appendices
   A   Department of Homeland Security CyberScope Template
   B   Information Security Reports Issued in FYs 2016 and 2015
   C   Distribution

-------
Purpose
The U.S. Environmental Protection Agency (EPA), Office of Inspector General
(OIG), conducted this audit to evaluate the EPA's compliance with the Federal
Information Security Modernization Act (FISMA) of 2014 during fiscal year
(FY) 2016.
Background
Under FISMA, agency heads are responsible for providing information security
protections commensurate with the risk and magnitude of harm resulting from the
unauthorized access, use, disclosure, disruption, modification or destruction of
information and information systems.
Per the FY 2016 Inspector General FISMA reporting metrics, there are five levels
of maturity for each of the Cybersecurity Framework Security Functions; each
function could be assigned one of the following maturity levels.
Figure 1: Progression of maturity levels
•	Level 1 - Ad-Hoc: Agencies automatically receive points regardless of their achievements in this maturity level.
•	Level 2 - Defined: Meet all metrics designated in the "Ad-hoc" level and half or greater of the metrics designated in the "Defined" level.
•	Level 3 - Consistently Implemented: Meet all metrics designated at the "Defined" level and half or greater of the metrics designated in the "Consistently Implemented" level.
•	Level 4 - Managed and Measurable (the lowest level to be considered effective): Meet all metrics designated in the "Consistently Implemented" level and half or greater of the metrics designated in the "Managed and Measurable" level.
•	Level 5 - Optimized: Meet all metrics designated in the "Managed and Measurable" and "Optimized" levels.
Source: EPA OIG graphic.
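Read as a decision rule, the progression above can be sketched in a few lines of Python. This is an illustration only; the function and its inputs are our own framing of the rule as stated in the figure, not part of the official reporting metrics.

# Sketch of the Figure 1 progression rule. Input met[n] is the fraction of
# the level-n metrics rated "Met" (1.0 means all metrics at that level met).
def achieved_level(met):
    level = 1  # Level 1 (Ad-hoc) is credited automatically
    for n in (2, 3, 4):
        # Advance only if every metric of the level below is met and half
        # or greater of this level's metrics are met.
        if met[n - 1] == 1.0 and met[n] >= 0.5:
            level = n
        else:
            return level
    # Level 5 requires meeting all "Managed and Measurable" and all
    # "Optimized" metrics.
    if met[4] == 1.0 and met[5] == 1.0:
        level = 5
    return level

# Example: all Level 1-2 metrics met, 60% of Level 3, 20% of Level 4
# yields Level 3, consistent with the ratings reported for the EPA.
assert achieved_level({1: 1.0, 2: 1.0, 3: 0.6, 4: 0.2, 5: 0.0}) == 3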
Per the FY 2016 Inspector General FISMA reporting instructions, agencies are
allotted points for each Cybersecurity Framework Security Function based on
their achievement at each maturity level. For each Cybersecurity Framework
Security Function, a total of 20 points is possible. Table 1 illustrates the scoring
distribution:
Table 1: Maturity level scoring distribution

Maturity level                        Scoring distribution in points
Level 1 - Ad-hoc                      3
Level 2 - Defined                     4
Level 3 - Consistently Implemented    6
Level 4 - Managed and Measurable      5
Level 5 - Optimized                   2
Source: FY 2016 Inspector General FISMA of 2014 reporting metrics.
According to the reporting metrics: "Agencies with programs that score at or
above the Managed and Measureable [level] for a NIST [National Institute of
Standards and Technology] [Cybersecurity] Framework Function have "effective"
programs within that area in accordance with the effectiveness definition in NIST
SP 800-53, Rev. 4."
Thus, EPA would have to meet all of the Consistently Implemented (level 3),
Defined (level 2) and Ad-Hoc (level 1) metrics, and half or greater of the
Managed and Measurable (level 4) metrics, to have its information security
program rated as effective.
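The scoring arithmetic can likewise be illustrated with a minimal Python sketch. This is our illustration with hypothetical names, not the actual CyberScope implementation.

# Points per maturity level, from Table 1. A function area accumulates the
# points of every level up to and including the level it achieved.
POINTS = {1: 3, 2: 4, 3: 6, 4: 5, 5: 2}

def score(achieved_level):
    """Total points for a function area that achieved the given level (1-5)."""
    return sum(POINTS[n] for n in range(1, achieved_level + 1))

def is_effective(achieved_level):
    """Effective means Level 4 (Managed and Measurable) or above."""
    return achieved_level >= 4

# Level 2 yields 3 + 4 = 7 points, Level 3 yields 13, and Level 4 yields the
# 18 of 20 possible points that this report cites as the effectiveness threshold.
assert score(2) == 7 and score(3) == 13 and score(4) == 18 and score(5) == 20
assert is_effective(4) and not is_effective(3)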
A robust but agile information security infrastructure is paramount to combat
constant cybersecurity attacks. Security officials must understand the current
status of their security programs and risk factors that could adversely affect
organizational operations, assets, employees and external partnerships. As such,
proper care in selecting and implementing security controls is essential to
safeguard the confidentiality, integrity and availability of information that is
processed, stored and transmitted internally for managerial decisions and
externally for information sharing.
Responsible Office
The Office of Environmental Information leads the EPA's information
management and information technology programs to provide the information,
technology and services necessary to advance the protection of human health and
the environment. Within the Office of Environmental Information, the EPA's
Senior Agency Information Security Officer is responsible for the EPA's
information security program. Additionally, the Senior Agency Information
Security Officer ensures that the agencywide information security program is in
compliance with FISMA and related information security laws, regulations,
directives, policies and guidelines.
Scope and Methodology
We conducted our performance audit from March to November 2016 in
accordance with generally accepted government auditing standards. Those
standards require that we plan and perform the audit to obtain sufficient and
appropriate evidence to provide a reasonable basis for our findings and
conclusions based on our audit objectives. We believe that the evidence obtained
provides a reasonable basis for our conclusions based on our audit objectives.
The OIG is required to assess the agency's information security program for the
five Cybersecurity Framework Security Functions and corresponding metric
domains as specified in the FY 2016 Inspector General FISMA reporting metrics
version 1.1.3 (see Figure 2).
Figure 2: Cybersecurity Framework Security Functions to the FY 2016 Inspector General FISMA metric domains
•	Identify: Risk Management; Contractor Systems
•	Protect: Configuration Management; Identity and Access Management; Security and Privacy Training
•	Detect: Information Security Continuous Monitoring
•	Respond: Incident Response
•	Recover: Contingency Planning
Source: FY 2016 Inspector General FISMA of 2014 reporting metrics.
We conducted our testing through inquiries of agency personnel, inspection of
relevant documentation, and leveraging of current OIG information security audit
work related to the Cybersecurity Framework Security Functions and FISMA
metric domains.
We evaluated the Respond security function (incident response) against Level 1 -
Ad-hoc. We used the control self-assessment methodology1 to assess the Respond
security function for some of the maturity model levels. The control
self-assessment included collecting the EPA's responses to the following maturity
model levels: Defined, Consistently Implemented, and Managed and Measurable.
Our testing was limited to evaluating the veracity of the EPA's responses to each
FISMA metric, conducting follow-up with EPA officials to obtain clarification on
their responses on any issues related to the FISMA metrics, and reviewing
FY 2016 audit reports issued by the U.S. Government Accountability Office to
identify any issues related to the FISMA metrics. We believe using the control
self-assessment methodology provides a reasonable basis for our conclusion and
the information presented in this report for the Respond security function.
1 According to the Institute of Internal Auditors, control self-assessment is a technique that allows personnel directly
involved in the business process to participate in assessing the organization's risk management and control
processes. Audit teams can use control self-assessment results to gather relevant information about risk and controls.

-------
In FY 2015, the EPA achieved the Ad-hoc maturity model level for the
information security continuous monitoring program. We used the FY 2015
Ad-hoc results to answer this year's Level 1 (Ad-hoc) attributes. We evaluated
the Detect security function (information security continuous monitoring) only
against the defined maturity model level. The EPA was provided the opportunity
to complete a control self-assessment for the remaining maturity levels, but the
EPA did not provide any additional information.
For the Identify, Protect and Recover FISMA metric domains, we concluded that
the previous year's controls were still effectively implemented. We performed the
following procedures to validate our assumptions for each attribute within these
domains.
•	Reviewed U.S. Government Accountability Office and EPA OIG reports
issued during FY 2016 to determine whether any issues were identified for
the Identify, Protect and Recover FISMA metric domains.
•	Conducted follow-up with agency officials to determine whether any
significant process changes had occurred since the previous assessment.
•	Relied on FY 2015 results for those FISMA metrics that received a
passing rating.
The EPA OIG did not issue any recommendations in the FY 2015 FISMA audit
report; therefore, we did not conduct any audit follow-up regarding that report.
Results of Review
The EPA consistently implemented four of the five security function areas based on
the CyberScope system scoring, as shown in Table 2. The CyberScope system
awards a maximum of 20 points per security function area, and an area must score
at least 18 points (at or above the Level 4 - Managed and Measurable maturity
level) to be considered effective.
Table 2: Maturity level of EPA's information security function areas

Security function   Maturity level                      Points achieved     Minimum points needed
                                                        by function area    to be considered effective
1. Identify         Level 3: Consistently Implemented   13                  18
2. Protect          Level 3: Consistently Implemented   13                  18
3. Detect           Level 2: Defined                    7                   18
4. Respond          Level 3: Consistently Implemented   13                  18
5. Recover          Level 3: Consistently Implemented   13                  18
Source: OIG testing results.
Several function areas and corresponding metric domains within the EPA's
information security program were identified as receiving a Not Met response.
Table 3 highlights the areas for which the EPA did not receive a positive rating.
Table 3: Results of testing assessed as "Not Met"

Identify - Risk Management
•	EPA did not implement an insider threat detection and prevention program, including the development of comprehensive policies, procedures, guidance and governance structures, in accordance with Executive Order 13587 and the National Insider Threat Policy.

Identify - Contractor Systems
•	EPA did not establish or implement a process to ensure that contracts/statements of work/solicitations for systems and services include appropriate information security and privacy requirements and material disclosures; Federal Acquisition Regulation clauses; and clauses on protection, detection and reporting of information.
•	EPA did not obtain sufficient assurance that the security controls of systems operated on the organization's behalf by contractors or other entities and services provided on the organization's behalf meet FISMA requirements, Office of Management and Budget policy, and applicable National Institute of Standards and Technology guidelines.

Protect - Identity and Access Management
•	EPA did not ensure that all users are only granted access based on least privilege and separation-of-duties principles.
•	EPA did not ensure that accounts are terminated or deactivated once access is no longer required or after a period of inactivity, according to organizational policy.

Protect - Security and Privacy Training
•	EPA did not identify and track status of specialized security and privacy training for all personnel (including employees, contractors and other organization users) with significant information security and privacy responsibilities requiring specialized training.

Respond - Incident Response
•	EPA did not integrate incident response activities with organizational risk management, continuous monitoring, continuity of operations, and other mission/business areas, as appropriate.
•	EPA did not capture qualitative and quantitative performance metrics on the performance of its incident response program. The organization did not ensure that the data supporting the metrics was obtained accurately and in a reproducible format, or that data is analyzed and correlated in ways that are effective for risk management.
•	EPA did not implement its defined incident response technologies. Also, the tools are not interoperable to the extent practicable; do not cover all components of the organization's network; and have not been configured to collect and retain relevant and meaningful data consistent with the organization's incident response policy, procedures and plans.
•	EPA incident response stakeholders did not implement, monitor and analyze qualitative and quantitative performance measures across the organization and did not collect, analyze and report data on the effectiveness of the organization's incident response program.
•	EPA did not implement processes for consistently implementing, monitoring and analyzing qualitative and quantitative performance measures across the organization; and is not collecting, analyzing and reporting data on the effectiveness of its processes for performing incident response.
•	EPA data supporting incident response measures and metrics are not obtained accurately, consistently and in a reproducible format.
•	EPA uses technologies for consistently implementing, monitoring and analyzing qualitative and quantitative performance across the organization; however, the data are not consistently collected, analyzed and reported on the effectiveness of its technologies for performing incident response activities.
•	EPA has not defined or implemented incident response performance measures that include data on the implementation of its incident response program for all sections of the network.

Recover - Contingency Planning
•	EPA did not test its Business Continuity Plan and Disaster Recovery Plan for effectiveness and update plans as necessary.
•	EPA did not determine alternate processing and storage sites based upon risk assessments that ensure that the potential disruption of the organization's ability to initiate and sustain operations is minimized, and are not subject to the same physical and/or cybersecurity risks as the primary sites.

Source: OIG analysis.
Appendix A contains the detailed results of our analysis. Management agreed
with the conclusions reported in Appendix A; we collected management's
feedback on the analysis either verbally or through email. We worked closely
with the agency and briefed them on each portion of the U.S. Department of
Homeland Security FISMA reporting metrics as the results were completed. As
such, we updated our analysis and incorporated management feedback throughout
the audit.
The EPA should take actions to address the issues above to protect the availability
and integrity of environmental data from loss, alteration and destruction.

-------
Appendix A
Department of Homeland Security
CyberScope Template

-------
Inspector General
2016
Section Report

Environmental Protection Agency
17-P-0044

-------
Section 0: Overall
0.1 Please provide an overall narrative assessment of the agency's information security program. Please note that OMB will include this
information in the publicly available Annual FISMA Report to Congress to provide additional context for the Inspector General's
effectiveness rating of the agency's information security program. OMB may modify this response to conform with the grammatical
and narrative structure of the Annual Report.
The EPA's information security function areas did not meet the defined requirements to be considered effective. We assessed the following
five Cybersecurity Framework Function areas and the corresponding metric domains as specified by the fiscal year 2016 Inspector General
FISMA reporting metrics.
1.	Identify - Risk Management and Contractor Systems.
2.	Protect - Configuration Management, Identity and Access Management, and Security and Privacy Training.
3.	Detect - Information Security Continuous Monitoring.
4.	Respond - Incident Response.
5.	Recover - Contingency Planning.
We evaluated each security function area using the maturity model as a tool to summarize the status of an agency's information security
program. The maturity model assesses each function area as: Level 1 - Ad-hoc, Level 2 - Defined, Level 3 - Consistently Implemented, Level
4 - Managed and Measurable, or Level 5 - Optimized.
The EPA achieved the following maturity levels for each security function area:
Level 2 (Defined) - Detect
Level 3 (Consistently Implemented) - Identify, Protect, Respond and Recover
The metrics define the requirements to meet a particular maturity level and the EPA must meet all the requirements of that level before it can
progress to the next higher level within the maturity model. Based on the metrics, the EPA would need to achieve Level 4 (Managed and
Measurable) for a function area to be considered effective.

-------
Section 1: Identify
Risk Management (Identify)
1.1 Has the organization established a risk management program that includes comprehensive agency policies and procedures consistent with FISMA requirements, OMB policy, and applicable NIST guidelines? (Defined)
Met
1.1.1 Identifies and maintains an up-to-date system inventory, including organization- and contractor-operated systems, hosting environments, and systems residing in the public, hybrid, or private cloud. (2016 CIO FISMA Metrics, 1.1; NIST Cybersecurity Framework (CF) ID.AM-1, NIST 800-53: PM-5) (Defined)
Met
1.1.2 Develops a risk management function that is demonstrated through the development, implementation, and maintenance of a comprehensive governance structure and organization-wide risk management strategy as described in NIST SP 800-37, Rev. 1. (NIST SP 800-39) (Consistently Implemented)
Met
1.1.3 Incorporates mission and business process-related risks into risk-based decisions at the organizational perspective, as described in NIST SP 800-37, Rev. 1. (NIST SP 800-39) (Consistently Implemented)
Met
1.1.4 Conducts information system level risk assessments that integrate risk decisions from the organizational and mission/business process perspectives and take into account threats, vulnerabilities, likelihood, impact, and risks from external parties and common control providers. (NIST SP 800-37, Rev. 1, NIST SP 800-39, NIST SP 800-53: RA-3) (Consistently Implemented)
Met
1.1.5 Provides timely communication of specific risks at the information system, mission/business, and organization-level to appropriate levels of the organization. (Managed and Measureable)
Met
1.1.6 Performs comprehensive assessments to categorize information systems in accordance with Federal standards and applicable guidance. (FIPS 199, FIPS 200, FISMA, Cybersecurity Sprint, OMB M-16-04, President's Management Council (PMC) cybersecurity assessments) (Consistently Implemented)
Met
1.1.7 Selects an appropriately tailored set of baseline security controls based on mission/business requirements and policies and develops procedures to employ controls within the information system and its environment of operation. (Defined)
Met
1.1.8 Implements the tailored set of baseline security controls as described in 1.1.7. (Consistently Implemented)
Met
1.1.9 Identifies and manages risks with system interconnections, including through authorizing system interconnections, documenting interface characteristics and security requirements, and maintaining interconnection security agreements. (NIST SP 800-53: CA-3) (Managed and Measureable)
Met
1.1.10 Continuously assesses the security controls, including hybrid and shared controls, using appropriate assessment procedures to determine the extent to which the controls are implemented correctly, operating as intended, and producing the desired outcome with respect to meeting the security requirements for the system. (Consistently Implemented)
Met
1.1.11 Maintains ongoing information system authorizations based on a determination of the risk to organizational operations and assets, individuals, other organizations, and the Nation resulting from the operation of the information system and the decision that this risk is acceptable. (OMB M-14-03, NIST Supplemental Guidance on Ongoing Authorization) (Managed and Measureable)
Met
1.1.12 Security authorization package contains system security plan, security assessment report, and POA&M that are prepared and maintained in accordance with government policies. (SP 800-18, SP 800-37) (Managed and Measureable)
Met
1.1.13 POA&Ms are maintained and reviewed to ensure they are effective for correcting security weaknesses. (Consistently Implemented)
Met
1.1.14 Centrally tracks, maintains, and independently reviews/validates POA&M activities at least quarterly. (NIST SP 800-53: CA-5; OMB M-04-25) (Managed and Measureable)
Met
1.1.15 Prescribes the active involvement of information system owners and common control providers, chief information officers, senior information security officers, authorizing officials, and other roles as applicable in the ongoing management of information-system-related security risks. (Managed and Measureable)
Met

-------
Section 1: Identify
1.1.16 Implemented an insider threat detection and prevention program, including the development of comprehensive policies, procedures, guidance, and governance structures, in accordance with Executive Order 13587 and the National Insider Threat Policy. (PMC; NIST SP 800-53: PM-12) (Consistently Implemented)
Not Met
Comments: The EPA's insider threat policy was signed in September 2016; however, implementation of the insider threat detection and prevention program will occur after fiscal year 2016.
1.1.17 Provide any additional information on the effectiveness (positive or negative) of the organization's Risk Management program that was not noted in the questions above. Based on all testing performed, is the Risk Management program effective?
Effective
Comments: We did not assess this question.
Contractor Systems (Identify)
1.2 Has the organization established a program to oversee systems operated on its behalf by contractors or other entities, including other government agencies, managed hosting environments, and systems and services residing in a cloud external to the organization that is inclusive of policies and procedures consistent with FISMA requirements, OMB policy, and applicable NIST guidelines? (Defined)
Met
1.2.1 Establishes and implements a process to ensure that contracts/statements of work/solicitations for systems and services include appropriate information security and privacy requirements and material disclosures, FAR clauses, and clauses on protection, detection, and reporting of information. (FAR Case 2007-004, Common Security Configurations, FAR Sections 24.104, 39.101, 39.105, 39.106, 52.239-1; PMC, 2016 CIO Metrics 1.8, NIST 800-53, SA-4 FedRAMP standard contract clauses; Cloud Computing Contract Best Practices) (Consistently Implemented)
Not Met
Comments: The EPA indicated that new procedures were developed during fiscal year 2016; however, performance of the activities will not commence until fiscal year 2017.
1.2.2 Specifies within appropriate agreements how information security performance is measured, reported, and monitored on contractor- or other entity-operated systems. (CIO and CAO Council Best Practices Guide for Acquiring IT as a Service, NIST SP 800-35) (Consistently Implemented)
Met

-------
Section 1: Identify
1.2.3 Obtains sufficient assurance that the security controls of systems operated on the organization's behalf by contractors or other entities and services provided on the organization's behalf meet FISMA requirements, OMB policy, and applicable NIST guidelines. (NIST SP 800-53: CA-2, SA-9) (Consistently Implemented)
Not Met
Comments: The EPA indicated that new procedures were developed during fiscal year 2016; however, performance of the activities will not commence until fiscal year 2017.
1.2.4 Provide any additional information on the effectiveness (positive or negative) of the organization's Contractor Systems Program that was not noted in the questions above. Based on all testing performed, is the Contractor Systems Program effective?
Effective
Comments: We did not assess this question.

Level                                Score    Possible Score
LEVEL 3: Consistently Implemented    13       20

-------
Section 2: Protect
Configuration Management (Protect)
2.1 Has the organization established a configuration management program that is inclusive of comprehensive agency policies and procedures consistent with FISMA requirements, OMB policy, and applicable NIST guidelines? (Defined)
Met
2.1.1 Develops and maintains an up-to-date inventory of the hardware assets (i.e., endpoints, mobile assets, network devices, input/output assets, and SMART/NEST devices) connected to the organization's network with the detailed information necessary for tracking and reporting. (NIST CF ID.AM-1; 2016 CIO FISMA Metrics 1.5, 3.17; NIST 800-53: CM-8) (Defined)
Met
2.1.2 Develops and maintains an up-to-date inventory of software platforms and applications used within the organization and with the detailed information necessary for tracking and reporting. (NIST 800-53: CM-8, NIST CF ID.AM-2) (Defined)
Met
2.1.3 Implements baseline configurations for IT systems that are developed and maintained in accordance with documented procedures. (NIST SP 800-53: CM-2; NIST CF PR.IP-1) (Consistently Implemented)
Met
2.1.4 Implements and maintains standard security settings (also referred to as security configuration checklists or hardening guides) for IT systems in accordance with documented procedures. (NIST SP 800-53: CM-6; CIO 2016 FISMA Metrics, 2.3) (Consistently Implemented)
Met
2.1.5 Assesses configuration change control processes, including processes to manage configuration deviations across the enterprise that are implemented and maintained. (NIST SP 800-53: CM-3, NIST CF PR.IP-3) (Managed and Measureable)
Met
2.1.6 Identifies and documents deviations from configuration settings. Acceptable deviations are approved with business justification and risk acceptance. Where appropriate, automated means that enforce and redeploy configuration settings to systems at regularly scheduled intervals are deployed, while evidence of deviations is also maintained. (NIST SP 800-53: CM-6, Center for Internet Security Controls (CIS) 3.7) (Managed and Measureable)
Met
2.1.7 Implemented SCAP certified software assessing (scanning) capabilities against all systems on the network to assess both code-based and configuration-based vulnerabilities in accordance with risk management decisions. (NIST SP 800-53: RA-5, SI-2; CIO 2016 FISMA Metrics 2.2, CIS 4.1) (Managed and Measureable)
Met
2.1.8 Remediates configuration-related vulnerabilities, including scan findings, in a timely manner as specified in organization policy or standards. (NIST 800-53: CM-4, CM-6, RA-5, SI-2) (Consistently Implemented)
Met
2.1.9 Develops and implements a patch management process in accordance with organization policy or standards, including timely and secure installation of software patches. (NIST SP 800-53: CM-3, SI-2, OMB M-16-04, DHS Binding Operational Directive 15-01) (Managed and Measureable)
Met
2.1.10 Provide any additional information on the effectiveness (positive or negative) of the organization's Configuration Management Program that was not noted in the questions above. Based on all testing performed, is the Configuration Management Program effective?
Effective
Comments: We did not assess this question.
Identity and Access Management (Protect)
2.2 Has the organization established an identity and access management program, including policies and procedures consistent with FISMA requirements, OMB policy, and applicable NIST guidelines? (Defined)
Met
2.2.1 Ensures that individuals requiring access to organizational information and information systems sign appropriate access agreements, participate in required training prior to being granted access, and recertify access agreements on a predetermined interval. (NIST 800-53: PL-4, PS-6) (Consistently Implemented)
Met
2.2.2 Ensures that all users are only granted access based on least privilege and separation-of-duties principles. (Consistently Implemented)
Not Met
Comments: The EPA commenced efforts to ensure all users are only granted access based on least privilege and separation of duties. These efforts are scheduled to be completed by December 31, 2017.
2.2.3 Distinguishes hardware assets that have user accounts (e.g., desktops, laptops, servers) from those without user accounts (e.g. networking devices, such as load balancers and intrusion detection/prevention systems, and other input/output devices such as faxes and IP phones). (Consistently Implemented)
Met
2.2.4 Implements PIV for physical access in accordance with government policies. (HSPD 12, FIPS 201, OMB M-05-24, OMB M-07-06, OMB M-08-01, OMB M-11-11) (Consistently Implemented)
Met
2.2.5 Implements PIV or a NIST Level of Assurance (LOA) 4 credential for logical access by all privileged users (system, network, database administrators, and others responsible for system/application control, monitoring, or administration functions). (Cybersecurity Sprint, OMB M-16-04, PMC, 2016 CIO FISMA Metrics 2.5.1) (Consistently Implemented)
Met
2.2.6 Enforces PIV or a NIST LOA 4 credential for logical access for at least 85% of non-privileged users. (Cybersecurity Sprint, OMB M-16-04, PMC, 2016 CIO FISMA Metrics 2.4.1) (Consistently Implemented)
Met
2.2.7 Tracks and controls the use of administrative privileges and ensures that these privileges are periodically reviewed and adjusted in accordance with organizationally defined timeframes. (2016 CIO FISMA Metrics 2.9, 2.10; OMB M-16-04, CIS 5.2) (Managed and Measureable)
Met
2.2.8 Ensures that accounts are terminated or deactivated once access is no longer required or after a period of inactivity, according to organizational policy. (Managed and Measureable)
Not Met
Comments: The EPA has not completed efforts to ensure accounts are terminated once access is no longer needed. These efforts are scheduled to be completed by December 31, 2017.
2.2.9 Identifies, limits, and controls the use of shared accounts. (NIST SP 800-53: AC-2) (Consistently Implemented)
Met
2.2.10 All users are uniquely identified and authenticated for remote access using Strong Authentication (multi-factor), including PIV. (NIST SP 800-46, Section 4.2, Section 5.1, NIST SP 800-63) (Consistently Implemented)
Met

-------
Section 2: Protect
2.2.11 Protects against and detects unauthorized remote access connections or subversion of authorized remote access connections, including through remote scanning of host devices. (CIS 12.7, 12.8, FY 2016 CIO FISMA metrics 2.17.3, 2.17.4, 3.11, 3.11.1) (Consistently Implemented)
Met
2.2.12 Remote access sessions are timed-out after 30 minutes of inactivity, requiring user re-authentication, consistent with OMB M-07-16. (Managed and Measureable)
Met
2.2.13 Enforces a limit of consecutive invalid remote access logon attempts and automatically locks the account or delays the next logon prompt. (NIST 800-53: AC-7) (Consistently Implemented)
Met
2.2.14 Implements a risk-based approach to ensure that all agency public websites and services are accessible through a secure connection through the use and enforcement of https and strict transport security. (OMB M-15-13) (Consistently Implemented)
Met
2.2.15 Provide any additional information on the effectiveness (positive or negative) of the organization's Identity and Access Management Program that was not noted in the questions above. Based on all testing performed, is the Identity and Access Management Program effective?
Effective
Comments: We did not assess this question.
Security and Privacy Training (Protect)
2.3 Has the organization established a security and privacy awareness and training program, including comprehensive agency policies and procedures consistent with FISMA requirements, OMB policy, and applicable NIST guidelines? (Defined)
Met
2.3.1 Develops training material for security and privacy awareness training containing appropriate content for the organization, including anti-phishing, malware defense, social engineering, and insider threat topics. (NIST SP 800-50, 800-53: AR-5, OMB M-15-01, 2016 CIO Metrics, PMC, National Insider Threat Policy (NITP)) (Consistently Implemented)
Met
2.3.2 Evaluates the skills of individuals with significant security and privacy responsibilities and provides additional security and privacy training content or implements human capital strategies to close identified gaps. (NIST SP 800-50) (Consistently Implemented)
Met
2.3.3 Identifies and tracks status of security and privacy awareness training for all information system users (including employees, contractors, and other organization users) requiring security awareness training with appropriate internal processes to detect and correct deficiencies. (NIST 800-53: AT-2) (Consistently Implemented)
Met
2.3.4 Identifies and tracks status of specialized security and privacy training for all personnel (including employees, contractors, and other organization users) with significant information security and privacy responsibilities requiring specialized training. (Consistently Implemented)
Not Met
Comments: The EPA does not identify and track status of specialized security training for contractors with significant information security responsibilities that required specialized training.
2.3.5 Measures the effectiveness of its security and privacy awareness and training programs, including through social engineering and phishing exercises. (PMC, 2016 CIO FISMA Metrics 2.19, NIST SP 800-50, NIST SP 800-55) (Managed and Measureable)
Met
2.3.6 Provide any additional information on the effectiveness (positive or negative) of the organization's Security and Privacy Training Program that was not noted in the questions above. Based on all testing performed, is the Security and Privacy Training Program effective?
Effective
Comments: We did not assess this question.

Level                                Score    Possible Score
LEVEL 3: Consistently Implemented    13       20

-------
Section 3: Detect
Level 1
Definition
3.1.1 ISCM program is not formalized and ISCM activities are performed in a reactive manner resulting in an ad hoc program that does not meet Level 2 requirements for a defined program consistent with NIST SP 800-53, SP 800-137, OMB M-14-03, and the CIO ISCM CONOPS.
People
3.1.1.1 ISCM stakeholders and their responsibilities have not been fully defined and communicated across the organization. (Ad Hoc)
Met
3.1.1.2 The organization has not performed an assessment of the skills, knowledge, and resources needed to effectively implement an ISCM program. Key personnel do not possess knowledge, skills, and abilities to successfully implement an effective ISCM program. (Ad Hoc)
Met
3.1.1.3 The organization has not defined how ISCM information will be shared with individuals with significant security responsibilities and used to make risk based decisions. (Ad Hoc)
Met
3.1.1.4 The organization has not defined how it will integrate ISCM activities with organizational risk tolerance, the threat environment, and business/mission requirements. (Ad Hoc)
Met
Processes
3.1.1.5 ISCM processes have not been fully defined and are performed in an ad-hoc, reactive manner for the following areas: ongoing assessments and monitoring of security controls; performing hardware asset management, software asset management, configuration setting management, and common vulnerability management; collecting security related information required for metrics, assessments, and reporting; analyzing ISCM data, reporting findings, and determining the appropriate risk responses; and reviewing and updating the ISCM program. (Ad Hoc)
Met
3.1.1.6 ISCM results vary depending on who performs the activity, when it is performed, and the methods and tools used. (Ad Hoc)
Met
3.1.1.7 The organization has not identified and defined the qualitative and quantitative performance measures that will be used to assess the effectiveness of its ISCM program, achieve situational awareness, and control ongoing risk. (Ad Hoc)
Met
3.1.1.8 The organization has not defined its processes for collecting and considering lessons learned to improve ISCM processes. (Ad Hoc)
Met
Technology
3.1.1.9 The organization has not identified and defined the ISCM technologies needed in one or more of the following automation areas and relies on manual/procedural methods in instances where automation would be more effective. Use of ISCM technologies in the following areas is ad-hoc. (Ad Hoc)
-	Patch management
-	License management
-	Information management
-	Software assurance
-	Vulnerability management
-	Event management
-	Malware detection
-	Asset management
-	Configuration management
-	Network management
-	Incident management
Met
3.1.1.10 The organization has not defined how it will use automation to produce an accurate point-in-time inventory of the authorized and unauthorized devices and software on its network and the security configuration of these devices and software. (Ad Hoc)
Met
Level 2
Definition
3.2.1 The organization has formalized its ISCM program through the development of comprehensive ISCM policies, procedures, and strategies consistent with NIST SP 800-53, SP 800-137, OMB M-14-03, and the CIO ISCM CONOPS. However, ISCM policies, procedures, and strategies are not consistently implemented organization-wide.
People
3.2.1.1 ISCM stakeholders and their responsibilities have been defined and communicated across the organization. However, stakeholders may not have adequate resources (people, processes, and technology) to effectively implement ISCM activities. (Defined)
Met
3.2.1.2 The organization has performed an assessment of the skills, knowledge, and resources needed to effectively implement an ISCM program. In addition, the organization has developed a plan for closing any gaps identified. However, key personnel may still lack the knowledge, skills, and abilities to successfully implement an effective ISCM program. (Defined)
Met
3.2.1.3 The organization has defined how ISCM information will be shared with individuals with significant security responsibilities and used to make risk-based decisions. However, ISCM information is not always shared with individuals with significant security responsibilities in a timely manner with which to make risk-based decisions. (Defined)
Met
3.2.1.4 The organization has defined how it will integrate ISCM activities with organizational risk tolerance, the threat environment, and business/mission requirements. However, ISCM activities are not consistently integrated with the organization's risk management program. (Defined)
Met
Processes
3.2.1.5 ISCM processes have been fully defined for the following areas: ongoing assessments and monitoring of security controls; performing hardware asset management, software asset management, configuration setting management, and common vulnerability management; collecting security related information required for metrics, assessments, and reporting; analyzing ISCM data, reporting findings, and determining the appropriate risk responses; and reviewing and updating the ISCM program. However, these processes are inconsistently implemented across the organization. (Defined)
Met
3.2.1.6 ISCM results vary depending on who performs the activity, when it is performed, and the methods and tools used. (Defined)
Met
3.2.1.7 The organization has identified and defined the performance measures and requirements that will be used to assess the effectiveness of its ISCM program, achieve situational awareness, and control ongoing risk. However, these measures are not consistently collected, analyzed, and used across the organization. (Defined)
Met
3.2.1.8 The organization has a defined process for capturing lessons learned on the effectiveness of its ISCM program and making necessary improvements. However, lessons learned are not consistently shared across the organization and used to make timely improvements to the ISCM program. (Defined)
Met
Technology
3.2.1.9 The organization has identified and fully defined the ISCM technologies it plans to utilize in the following automation areas. In addition, the organization has developed a plan for implementing ISCM technologies in these areas: patch management, license management, information management, software assurance, vulnerability management, event management, malware detection, asset management, configuration management, network management, and incident management. However, the organization has not fully implemented technology in these automation areas and continues to rely on manual/procedural methods in instances where automation would be more effective. In addition, while automated tools are implemented to support some ISCM activities, the tools may not be interoperable. (Defined)
Met
3.2.1.10 The organization has defined how it will use automation to produce an accurate point-in-time inventory of the authorized and unauthorized devices and software on its network and the security configuration of these devices and software. However, the organization does not consistently implement the technologies that will enable it to manage an accurate point-in-time inventory of the authorized and unauthorized devices and software on its network and the security configuration of these devices and software. (Defined)
Met
Level 3
Definition
3.3.1 In addition to the formalization and definition of its ISCM program (Level 2), the organization consistently implements its ISCM program across the agency. However, qualitative and quantitative measures and data on the effectiveness of the ISCM program across the organization are not captured and utilized to make risk-based decisions, consistent with NIST SP 800-53, SP 800-137, OMB M-14-03, and the CIO ISCM CONOPS.
People
3.3.1.1 ISCM stakeholders and their responsibilities have been identified and communicated across the organization, and stakeholders have adequate resources (people, processes, and technology) to effectively implement ISCM activities. (Consistently Implemented)
Not Met
Comments: The OIG DID NOT assess Level 3 (Consistently Implemented).

-------
Section 3: Detect
3.3.1.2 The organization has fully implemented its plans to close any gaps in skills, knowledge, and resources required to successfully implement an ISCM program. Personnel possess the required knowledge, skills, and abilities to effectively implement the organization's ISCM program. (Consistently Implemented)
Not Met
Comments: The OIG DID NOT assess Level 3 (Consistently Implemented).
3.3.1.3 ISCM information is shared with individuals with significant security responsibilities in a consistent and timely manner with which to make risk-based decisions and support ongoing system authorizations. (Consistently Implemented)
Not Met
Comments: The OIG DID NOT assess Level 3 (Consistently Implemented).
3.3.1.4 ISCM activities are fully integrated with organizational risk tolerance, the threat environment, and business/mission requirements. (Consistently Implemented)
Not Met
Comments: The OIG DID NOT assess Level 3 (Consistently Implemented).
Processes
3.3.1.5 ISCM processes are consistently performed across the organization in the following areas: ongoing assessments and monitoring of security controls; performing hardware asset management, software asset management, configuration setting management, and common vulnerability management; collecting security related information required for metrics, assessments, and reporting; analyzing ISCM data, reporting findings, and determining the appropriate risk responses; and reviewing and updating the ISCM program. (Consistently Implemented)
Not Met
Comments: The OIG DID NOT assess Level 3 (Consistently Implemented).
3.3.1.6 The rigor, intensity, scope, and results of ISCM activities are comparable and predictable across the organization. (Consistently Implemented)
Not Met
Comments: The OIG DID NOT assess Level 3 (Consistently Implemented).
3.3.1.7 The organization is consistently capturing qualitative and quantitative performance measures on the performance of its ISCM program in accordance with established requirements for data collection, storage, analysis, retrieval, and reporting. ISCM measures provide information on the effectiveness of ISCM processes and activities. (Consistently Implemented)
Not Met
Comments: The OIG DID NOT assess Level 3 (Consistently Implemented).
3.3.1.8 The organization is consistently capturing and sharing lessons learned on the effectiveness of ISCM processes and activities. Lessons learned serve as a key input to making regular updates to ISCM processes. (Consistently Implemented)
Not Met
Comments: The OIG DID NOT assess Level 3 (Consistently Implemented).
3.3.1.9 The organization has consistently implemented its defined technologies in all of the following ISCM automation areas. ISCM tools are interoperable to the extent practicable. (Consistently Implemented)
-	Patch management
-	License management
-	Information management
-	Software assurance
-	Vulnerability management
-	Event management
-	Malware detection
-	Asset management
-	Configuration management
-	Network management
-	Incident management
Not Met
Comments: The OIG DID NOT assess Level 3 (Consistently Implemented).
Technology
3.3.1.10 The organization can produce an accurate point-in-time inventory of the authorized and unauthorized devices and software on its network and the security configuration of these devices and software. (Consistently Implemented)
Not Met
Comments: The OIG DID NOT assess Level 3 (Consistently Implemented).

-------
Section 3: Detect
Level 4
Definition
3.4.1 In addition to being consistently implemented (Level 3), ISCM activities are repeatable and metrics are used to measure and manage the implementation of the ISCM program, achieve situational awareness, control ongoing risk, and perform ongoing system authorizations.
People
3.4.1.1 The organization's staff is consistently implementing, monitoring, and analyzing qualitative and quantitative performance measures across the organization and is collecting, analyzing, and reporting data on the effectiveness of the organization's ISCM program. (Managed and Measureable)
Not Met
Comments: The OIG DID NOT assess Level 4 (Managed and Measurable).
3.4.1.2 Skilled personnel have been hired and/or existing staff trained to develop the appropriate metrics to measure the success of the ISCM program. (Managed and Measureable)
Not Met
Comments: The OIG DID NOT assess Level 4 (Managed and Measurable).
3.4.1.3 Staff are assigned responsibilities for developing and monitoring ISCM metrics, as well as updating and revising metrics as needed based on organization risk tolerance, the threat environment, business/mission requirements, and the results of the ISCM program. (Managed and Measureable)
Not Met
Comments: The OIG DID NOT assess Level 4 (Managed and Measurable).
Processes
3.4.1.4 The organization has processes for consistently implementing, monitoring, and analyzing qualitative and quantitative performance measures across the organization and is collecting, analyzing, and reporting data on the effectiveness of its processes for performing ISCM. (Managed and Measureable)
Not Met
Comments: The OIG DID NOT assess Level 4 (Managed and Measurable).
3.4.1.5 Data supporting ISCM metrics are obtained accurately, consistently, and in a reproducible format. (Managed and Measureable)
Not Met
Comments: The OIG DID NOT assess Level 4 (Managed and Measurable).

-------
Section 3: Detect
3.4.1.6 The organization is able to integrate metrics on the effectiveness of its ISCM program to deliver persistent situational awareness across the organization, explain the environment from both a threat/vulnerability and risk/impact perspective, and cover mission areas of operations and security domains. (Managed and Measureable)
Not Met
Comments: The OIG DID NOT assess Level 4 (Managed and Measurable).
3.4.1.7 The organization uses its ISCM metrics for determining risk response actions including risk acceptance, avoidance/rejection, or transfer. (Managed and Measureable)
Not Met
Comments: The OIG DID NOT assess Level 4 (Managed and Measurable).
3.4.1.8 ISCM metrics are reported to the organizational officials charged with correlating and analyzing the metrics in ways that are relevant for risk management activities. (Managed and Measureable)
Not Met
Comments: The OIG DID NOT assess Level 4 (Managed and Measurable).
3.4.1.9 ISCM is used to maintain ongoing authorizations of information systems and the environments in which those systems operate, including common controls, and keep required system information and data (i.e., System Security Plan, Risk Assessment Report, Security Assessment Report, and POA&M) up to date on an ongoing basis. (Managed and Measureable)
Not Met
Comments: The OIG DID NOT assess Level 4 (Managed and Measurable).
Technology
3.4.1.10 The organization uses technologies for consistently implementing, monitoring, and analyzing qualitative and quantitative performance across the organization and is collecting, analyzing, and reporting data on the effectiveness of its technologies for performing ISCM. (Managed and Measureable)
Not Met
Comments: The OIG DID NOT assess Level 4 (Managed and Measurable).
3.4.1.11 The organization's ISCM performance measures include data on the implementation of its ISCM program for all sections of the network from the implementation of technologies that provide standard calculations, comparisons, and presentations. (Managed and Measureable)
Not Met
Comments: The OIG DID NOT assess Level 4 (Managed and Measurable).

-------
Section 3: Detect
3.4.1.12
The organization utilizes
and manage risk
Not Met
a SIEM tool to collect, maintain, monitor, and analyze IT security information, achieve situational awareness,
Managed and
Measureable

Comments:
The OIG DID NOT assess Level 4 (Managed and Measurable).
Level 5
Definition
3.5.1 In addition to being managed and measurable (Level 4), the organization's ISCM program is institutionalized, repeatable, self-regenerating, and updated on a near real-time basis based on changes in business/mission requirements and a changing threat and technology landscape.
People
3.5.1.1 The organization's assigned personnel collectively possess a high skill level to perform and update ISCM activities on a near real-time basis to make any changes needed to address ISCM results based on organization risk tolerance, the threat environment, and business/mission requirements. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
Processes
3.5.1.2 The organization has institutionalized a process of continuous improvement incorporating advanced cybersecurity practices. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
3.5.1.3 On a near real-time basis, the organization actively adapts its ISCM program to a changing cybersecurity landscape and responds to evolving and sophisticated threats in a timely manner. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
3.5.1.4 The ISCM program is fully integrated with strategic planning, enterprise architecture and capital planning and investment control processes, and other mission/business areas, as appropriate. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
3.5.1.5 The ISCM program achieves cost-effective IT security objectives and goals and influences decision making that is based on cost, risk, and mission impact. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
Technology
3.5.1.6 The organization has institutionalized the implementation of advanced cybersecurity technologies in near real-time. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
3.5.1.7 The organization has institutionalized the use of advanced technologies for analysis of trends and performance against benchmarks to continuously improve its ISCM program. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
Level               Score   Possible Score
LEVEL 2: Defined    7       20
Section 4: Respond
Level 1
Definition
4.1.1 Incident response program is not formalized and incident response activities are performed in a reactive manner resulting in
an ad-hoc program that does not meet Level 2 requirements for a defined program consistent with FISMA (including
guidance from NIST SP 800-83, NIST SP 800-61 Rev. 2, NIST SP 800-53, OMB M-16-03, OMB M-16-04, and
US-CERT Federal Incident Notification Guidelines).
People
4.1.1.1 Incident response team structures/models, stakeholders, and their roles, responsibilities, levels of authority, and dependencies have not been fully defined and communicated across the organization, including the designation of a principal security operations center or equivalent organization that is accountable to agency leadership, DHS, and OMB for all incident response activities. (Ad Hoc)
Met
4.1.1.2 The organization has not performed an assessment of the skills, knowledge, and resources needed to effectively implement an incident response program. Key personnel do not possess the knowledge, skills, and abilities to successfully implement an effective incident response program. (Ad Hoc)
Met
4.1.1.3 The organization has not defined a common threat vector taxonomy and defined how incident response information will be shared with individuals with significant security responsibilities and other stakeholders, and used to make timely, risk-based decisions. (Ad Hoc)
Met
4.1.1.4 The organization has not defined how it will integrate incident response activities with organizational risk management, continuous monitoring, continuity of operations, and other mission/business areas, as appropriate. (Ad Hoc)
Met
Processes
4.1.1.5 Incident response processes have not been fully defined and are performed in an ad-hoc, reactive manner for the following areas: incident response planning; incident response training and testing; incident detection and analysis; incident containment, eradication, and recovery; and incident coordination, information sharing, and reporting to internal and external stakeholders using standard data elements and impact classifications within timeframes established by US-CERT. (Ad Hoc)
Met
4.1.1.6 The organization has not fully defined how it will collaborate with DHS and other parties, as appropriate, to provide on-site, technical assistance/surge resources/special capabilities for quickly responding to incidents. (Ad Hoc)
Met
4.1.1.7 The organization has not identified and defined the qualitative and quantitative performance measures that will be used to assess the effectiveness of its incident response program, perform trend analysis, achieve situational awareness, and control ongoing risk. (Ad Hoc)
Met
4.1.1.8 The organization has not defined its processes for collecting and considering lessons learned and incident data to improve security controls and incident response processes. (Ad Hoc)
Met
Technology
4.1.1.9 The organization has not identified and defined the incident response technologies needed in one or more of the following areas and relies on manual/procedural methods in instances where automation would be more effective. Use of incident response technologies in the following areas is ad-hoc. (Ad Hoc)
- Web application protections, such as web application firewalls
- Event and incident management, such as intrusion detection and prevention tools, and incident tracking and reporting tools
- Aggregation and analysis, such as security information and event management (SIEM) products
- Malware detection, such as anti-virus and antispam software technologies
- Information management, such as data loss prevention
- File integrity and endpoint and server security tools
Met
4.1.1.10 The organization has not defined how it will meet the defined Trusted Internet Connection (TIC) security controls and ensure that all agency traffic, including mobile and cloud, is routed through defined access points, as appropriate. (Ad Hoc)
Met
4.1.1.11 The organization has not defined how it plans to utilize DHS' Einstein program for intrusion detection/prevention capabilities for traffic entering and leaving the organization's networks. (Ad Hoc)
Met
4.1.1.12 The organization has not defined how it plans to utilize technology to develop and maintain a baseline of network operations and expected data flows for users and systems. (Ad Hoc)
Met
Level 2
Definition
4.2.1 The organization has formalized its incident response program through the development of comprehensive incident response policies, plans, and procedures consistent with FISMA (including guidance from NIST SP 800-83, NIST SP 800-61 Rev. 2, NIST SP 800-53, OMB M-16-03, OMB M-16-04, and US-CERT Federal Incident Notification Guidelines). However, incident response policies, plans, and procedures are not consistently implemented organization-wide.
People
4.2.1.1 Incident response team structures/models, stakeholders, and their roles, responsibilities, levels of authority, and dependencies have been fully defined and communicated across the organization, including the designation of a principal security operations center or equivalent organization that is accountable to agency leadership, DHS, and OMB for all incident response activities. However, stakeholders may not have adequate resources (people, processes, and technology) to effectively implement incident response activities. Further, the organization has not verified roles and responsibilities as part of incident response testing. (Defined)
Met
4.2.1.2 The organization has performed an assessment of the skills, knowledge, and resources needed to effectively implement an incident response program. In addition, the organization has developed a plan for closing any gaps identified. However, key personnel may still lack the knowledge, skills, and abilities to successfully implement an effective incident response program. (Defined)
Met
4.2.1.3 The organization has defined a common threat vector taxonomy and defined how incident response information will be shared with individuals with significant security responsibilities and other stakeholders, and used to make timely, risk-based decisions. However, the organization does not consistently utilize its threat vector taxonomy, and incident response information is not always shared with individuals with significant security responsibilities and other stakeholders in a timely manner. (Defined)
Met
4.2.1.4 The organization has defined how it will integrate incident response activities with organizational risk management, continuous monitoring, continuity of operations, and other mission/business areas, as appropriate. However, incident response activities are not consistently integrated with these areas. (Defined)
Met
Processes
4.2.1.5 Incident response processes have been fully defined for the following areas: incident response planning; incident response training and testing; incident detection and analysis; incident containment, eradication, and recovery; and incident coordination, information sharing, and reporting using standard data elements and impact classifications within timeframes established by US-CERT. However, these processes are inconsistently implemented across the organization. (Defined)
Met
4.2.1.6 The organization has fully defined, but not consistently implemented, its processes to collaborate with DHS and other parties, as appropriate, to provide on-site, technical assistance/surge resources/special capabilities for quickly responding to incidents. (Defined)
Met
4.2.1.7 The organization has identified and defined the qualitative and quantitative performance measures that will be used to assess the effectiveness of its incident response program, perform trend analysis, achieve situational awareness, and control ongoing risk. However, these measures are not consistently collected, analyzed, and used across the organization. (Defined)
Met
4.2.1.8 The organization has defined its processes for collecting and considering lessons learned and incident data to improve security controls and incident response processes. However, lessons learned are not consistently captured and shared across the organization and used to make timely improvements to security controls and the incident response program. (Defined)
Met
Technology
4.2.1.9 The organization has identified and fully defined the incident response technologies it plans to utilize in the following areas: (Defined)
- Web application protections, such as web application firewalls
- Event and incident management, such as intrusion detection and prevention tools, and incident tracking and reporting tools
- Aggregation and analysis, such as security information and event management (SIEM) products. However, the organization has not ensured that security and event data are aggregated and correlated from all relevant sources and sensors.
- Malware detection, such as anti-virus and antispam software technologies
- Information management, such as data loss prevention
- File integrity and endpoint and server security tools
However, the organization has not fully implemented technologies in these areas and continues to rely on manual/procedural methods in instances where automation would be more effective. In addition, while tools are implemented to support some incident response activities, the tools are not interoperable to the extent practicable, do not cover all components of the organization's network, and/or have not been configured to collect and retain relevant and meaningful data consistent with the organization's incident response policy, plans, and procedures.
Met
4.2.1.10 The organization has defined how it will meet the defined TIC security controls and ensure that all agency traffic, including mobile and cloud, is routed through defined access points, as appropriate. However, the organization has not ensured that the TIC 2.0 provider and agency-managed capabilities are consistently implemented. (Defined)
Met
4.2.1.11 The organization has defined how it plans to utilize DHS' Einstein program for intrusion detection/prevention capabilities for traffic entering and leaving its networks. (Defined)
Met
4.2.1.12 The organization has defined how it plans to utilize technology to develop and maintain a baseline of network operations and expected data flows for users and systems. However, the organization has not established, and does not consistently maintain, a comprehensive baseline of network operations and expected data flows for users and systems. (Defined)
Met
Level 3
Definition
4.3.1 In addition to the formalization and definition of its incident response program (Level 2), the organization consistently implements its incident response program across the agency, in accordance with FISMA (including guidance from NIST SP 800-83, NIST SP 800-61 Rev. 2, NIST SP 800-53, OMB M-16-03, OMB M-16-04, and US-CERT Federal Incident Notification Guidelines). However, data supporting metrics on the effectiveness of the incident response program across the organization are not verified, analyzed, and correlated.
People
4.3.1.1 Incident response team structures/models, stakeholders, and their roles, responsibilities, levels of authority, and dependencies have been fully defined, communicated, and consistently implemented across the organization (Level 2). Further, the organization has verified roles and responsibilities of incident response stakeholders as part of incident response testing. (Consistently Implemented)
Met
4.3.1.2 The organization has fully implemented its plans to close any gaps in the skills, knowledge, and resources needed to effectively implement its incident response program. Incident response teams are periodically trained to ensure that knowledge, skills, and abilities are maintained. (Consistently Implemented)
Met
4.3.1.3 The organization consistently utilizes its defined threat vector taxonomy and shares information with individuals with significant security responsibilities and other stakeholders in a timely fashion to support risk-based decision making. (Consistently Implemented)
Met
4.3.1.4 Incident response activities are integrated with organizational risk management, continuous monitoring, continuity of operations, and other mission/business areas, as appropriate. (Consistently Implemented)
Not Met
Comments: The EPA indicated in its self-assessment that it has not fully integrated incident response activities with organizational risk management, continuous monitoring, continuity of operations, and other mission/business areas. However, integration (automated and manual) with limited continuous monitoring technologies and processes is implemented.
Processes
4.3.1.5 Incident response processes are consistently implemented across the organization for the following areas: incident response planning; incident response training and testing; incident detection and analysis; incident containment, eradication, and recovery; and incident coordination, information sharing, and reporting using standard data elements and impact classifications within timeframes established by US-CERT. (Consistently Implemented)
Met
4.3.1.6 The organization has ensured that processes to collaborate with DHS and other parties, as appropriate, to provide on-site, technical assistance/surge resources/special capabilities for quickly responding to incidents are implemented consistently across the organization. (Consistently Implemented)
Met
4.3.1.7 The organization is consistently capturing qualitative and quantitative performance metrics on the performance of its incident response program. However, the organization has not ensured that the data supporting the metrics was obtained accurately and in a reproducible format or that the data is analyzed and correlated in ways that are effective for risk management. (Consistently Implemented)
Not Met
Comments: The EPA indicated in its self-assessment that the incident response team does not consistently capture qualitative and quantitative metrics for performance measures. Trend analysis is not consistently performed or documented in a manner that would optimize situational awareness or control ongoing risk, and these measures are not yet consistently collected, analyzed, and used across the organization.
4.3.1.8 The organization is consistently collecting and capturing lessons learned and incident data on the effectiveness of its incident response program and activities. However, lessons learned may not be shared across the organization in a timely manner and used to make timely improvements to the incident response program and security measures. (Consistently Implemented)
Met
4.3.1.9 The rigor, intensity, scope, and results of incident response activities (i.e., preparation, detection, analysis, containment, eradication and recovery, reporting, and post-incident) are comparable and predictable across the organization. (Consistently Implemented)
Met
Technology
4.3.1.10 The organization has consistently implemented its defined incident response technologies in the following areas: (Consistently Implemented)
- Web application protections, such as web application firewalls
- Event and incident management, such as intrusion detection and prevention tools, and incident tracking and reporting tools
- Aggregation and analysis, such as security information and event management (SIEM) products. The organization ensures that security and event data are aggregated and correlated from all relevant sources and sensors.
- Malware detection, such as anti-virus and antispam software technologies
- Information management, such as data loss prevention
- File integrity and endpoint and server security tools
In addition, the tools are interoperable to the extent practicable, cover all components of the organization's network, and have been configured to collect and retain relevant and meaningful data consistent with the organization's incident response policy, procedures, and plans.
Not Met
Comments: The EPA indicated in its self-assessment that it has partially implemented its defined incident response technologies and that interoperability and integration in most cases have not yet been determined or implemented to the extent practical; do not cover all components of the network; and have not been configured to collect and retain all relevant and meaningful data consistent with the organization's incident response policy, procedures, and plans.
4.3.1.11 The organization has consistently implemented defined TIC security controls and implemented actions to ensure that all agency traffic, including mobile and cloud, is routed through defined access points, as appropriate. (Consistently Implemented)
Met
4.3.1.12 The organization is utilizing DHS' Einstein program for intrusion detection/prevention capabilities for traffic entering and leaving its networks. (Consistently Implemented)
Met
4.3.1.13 The organization has fully implemented technologies to develop and maintain a baseline of network operations and expected data flows for users and systems. (Consistently Implemented)
Met
Level 4
Definition
4.4.1 In addition to being consistently implemented (Level 3), incident response activities are repeatable, and metrics are used to measure and manage the implementation of the incident response program, achieve situational awareness, and control ongoing risk. In addition, the incident response program adapts to new requirements and government-wide priorities.
People
4.4.1.1 Incident response stakeholders are consistently implementing, monitoring, and analyzing qualitative and quantitative performance measures across the organization and are collecting, analyzing, and reporting data on the effectiveness of the organization's incident response program. (Managed and Measurable)
Not Met
Comments: The EPA indicated in its self-assessment that incident response stakeholders do not consistently implement, monitor, and analyze qualitative and quantitative performance measures across the organization and do not consistently collect, analyze, and report data on the effectiveness of the organization's incident response program.
4.4.1.2 Skilled personnel have been hired and/or existing staff trained to develop the appropriate metrics to measure the success of the incident response program. (Managed and Measurable)
Met
4.4.1.3 Incident response stakeholders are assigned responsibilities for developing and monitoring incident response metrics, as well as updating and revising metrics as needed based on organization risk tolerance, the threat environment, business/mission requirements, and the results of the incident response program. (Managed and Measurable)
Met
Processes
4.4.1.4 The organization has processes for consistently implementing, monitoring, and analyzing qualitative and quantitative performance measures across the organization and is collecting, analyzing, and reporting data on the effectiveness of its processes for performing incident response. (Managed and Measurable)
Not Met
Comments: The EPA indicated in its self-assessment that it has not documented processes for consistently implementing, monitoring, and analyzing qualitative and quantitative performance measures across the organization, or for collecting, analyzing, and reporting data on the effectiveness of its processes for performing incident response.
4.4.1.5 Data supporting incident response measures and metrics are obtained accurately, consistently, and in a reproducible format. (Managed and Measurable)
Not Met
Comments: The EPA indicated in its self-assessment that data supporting incident response measures and metrics are not obtained accurately, consistently, and in a reproducible format.
4.4.1.6 Incident response data, measures, and metrics are analyzed, collected, and presented using standard calculations, comparisons, and presentations. (Managed and Measurable)
Met
4.4.1.7 Incident response metrics are reported to organizational officials charged with correlating and analyzing the metrics in ways that are relevant for risk management activities. (Managed and Measurable)
Met
Technology
4.4.1.8 The organization uses technologies for consistently implementing, monitoring, and analyzing qualitative and quantitative performance across the organization and is collecting, analyzing, and reporting data on the effectiveness of its technologies for performing incident response activities. (Managed and Measurable)
Not Met
Comments: The EPA indicated in its self-assessment that it uses various technologies for implementing, monitoring, and analyzing qualitative and quantitative performance across the organization; however, the data is not consistently collected, analyzed, and reported to properly measure the effectiveness of its technologies for performing incident response activities.
4.4.1.9 The organization's incident response performance measures include data on the implementation of its incident response program for all sections of the network. (Managed and Measurable)
Not Met
Comments: The EPA indicated in its self-assessment that it has not defined or implemented incident response performance measures that include data on the implementation of its incident response program for all sections of the network.
Level 5
Definition
4.5.1 In addition to being managed and measurable (Level 4), the organization's incident response program is institutionalized, repeatable, self-regenerating, and updated on a near real-time basis based on changes in business/mission requirements and a changing threat and technology landscape.
People
4.5.1.1 The organization's assigned personnel collectively possess a high skill level to perform and update incident response activities on a near real-time basis to make any changes needed to address incident response results based on organization risk tolerance, the threat environment, and business/mission requirements. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
Processes
4.5.1.2 The organization has institutionalized a process of continuous improvement incorporating advanced cybersecurity practices. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
4.5.1.3 On a near real-time basis, the organization actively adapts its incident response program to a changing cybersecurity landscape and responds to evolving and sophisticated threats in a near real-time manner. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
4.5.1.4 The incident response program is fully integrated with organizational risk management, continuous monitoring, continuity of operations, and other mission/business areas, as appropriate. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
4.5.1.5 The incident response program achieves cost-effective IT security objectives and goals and influences decision making that is based on cost, risk, and mission impact. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
Technology
4.5.1.6 The organization has institutionalized the implementation of advanced incident response technologies in near real-time. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
4.5.1.7 The organization has institutionalized the use of advanced technologies for analysis of trends and performance against benchmarks to continuously improve its incident response program. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
4.5.1.8 The organization uses simulation-based technologies to continuously determine the impact of potential security incidents on its IT assets and adjusts incident response processes and security measures accordingly. (Optimized)
Not Met
Comments: The OIG did not assess Level 5 (Optimized).
Level                               Score   Possible Score
LEVEL 3: Consistently Implemented   13      20
Section 5: Recover
Contingency Planning (Recover)
5.1 Has the organization established an enterprise-wide business continuity/disaster recovery program, including policies and procedures consistent with FISMA requirements, OMB policy, and applicable NIST guidelines? (Defined)
Met
5.1.1 Develops and facilitates recovery testing, training, and exercise (TT&E) programs. (FCD1, NIST SP 800-34, NIST SP 800-53) (Consistently Implemented)
Met
5.1.2 Incorporates the system's Business Impact Analysis and Business Process Analysis into analysis and strategy toward development of the organization's Continuity of Operations Plan, Business Continuity Plan (BCP), and Disaster Recovery Plan (DRP). (NIST SP 800-34) (Consistently Implemented)
Met
5.1.3 Develops and maintains documented recovery strategies, plans, and procedures at the division, component, and IT infrastructure levels. (NIST SP 800-34) (Consistently Implemented)
Met
5.1.4 BCP and DRP are in place and ready to be executed upon if necessary. (FCD1, NIST SP 800-34, 2016 CIO FISMA Metrics 5.3, PMC) (Consistently Implemented)
Met
5.1.5 Tests BCP and DRP for effectiveness and updates plans as necessary. (2016 CIO FISMA Metrics 5.4) (Managed and Measurable)
Not Met
Comments: The EPA did not update its disaster recovery plan to include the critical application that is needed to restore the agency's hosting environment at an alternate site.
5.1.6 Tests system-specific contingency plans, in accordance with organizationally defined timeframes, to determine the effectiveness of the plans as well as readiness to execute the plans if necessary. (NIST SP 800-53: CP-4) (Consistently Implemented)
Met
5.1.7 Develops after-action reports that address issues identified during contingency/disaster recovery exercises in order to improve contingency/disaster recovery processes. (FCD1, NIST SP 800-34) (Managed and Measurable)
Met
5.1.8 Determines alternate processing and storage sites based upon risk assessments which ensure the potential disruption of the organization's ability to initiate and sustain operations is minimized, and are not subject to the same physical and/or cybersecurity risks as the primary sites. (FCD1, NIST SP 800-34, NIST SP 800-53: CP-6, CP-7) (Consistently Implemented)
Not Met
Comments: The EPA does not have an alternate data storage site for key financial applications, and the responsible office had not obtained the authorization required by EPA policy.
5.1.9 Conducts backups of information at the user- and system-levels and protects the confidentiality, integrity, and availability of backup information at storage sites. (FCD1, NIST SP 800-34, NIST SP 800-53: CP-9, NIST CF, PR.IP-4, NARA guidance on information systems security records) (Managed and Measurable)
Met
5.1.10 Contingency planning that considers supply chain threats. (Defined)
Met
5.1.11 Provide any additional information on the effectiveness (positive or negative) of the organization's Contingency Planning Program that was not noted in the questions above. Based on all testing performed, is the Contingency Planning Program effective?
Effective
Comments: We did not assess this question.
Level                               Score   Possible Score
LEVEL 3: Consistently Implemented   13      20
APPENDIX A: Maturity Model Scoring
Maturity Levels by Section
Section                Level                               Score   Possible Score
Section 1: Identify    LEVEL 3: Consistently Implemented   13      20
Section 2: Protect     LEVEL 3: Consistently Implemented   13      20
Section 3: Detect      LEVEL 2: Defined                    7       20
Section 4: Respond     LEVEL 3: Consistently Implemented   13      20
Section 5: Recover     LEVEL 3: Consistently Implemented   13      20
TOTAL                                                      59      100
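The report does not state the formula behind the TOTAL, but the figures are consistent with a straight sum of the five section scores. A minimal sketch of that reading, using the values from the table above (the variable names are ours, not the report's):

# Illustrative arithmetic only: section scores from the table above.
section_scores = {
    "Section 1: Identify": 13,
    "Section 2: Protect": 13,
    "Section 3: Detect": 7,
    "Section 4: Respond": 13,
    "Section 5: Recover": 13,
}
total = sum(section_scores.values())  # 13 + 13 + 7 + 13 + 13
print(f"TOTAL: {total} of {20 * len(section_scores)}")  # TOTAL: 59 of 100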
Section 1: Identify
Model Indicator            Met   Not Met   Total   %      Points Assigned   Possible Points
Ad-Hoc                     0     0         0       100%   3                 3
Defined                    4     0         4       100%   4                 4
Consistently Implemented   8     3         11      73%    6                 6
Managed and Measurable     6     0         6       100%   0                 5
Optimized                  0     0         0       100%   0                 2
Section 2: Protect
Model Indicator            Met   Not Met   Total   %      Points Assigned   Possible Points
Ad-Hoc                     0     0         0       100%   3                 3
Defined                    5     0         5       100%   4                 4
Consistently Implemented   16    2         18      89%    6                 6
Managed and Measurable     7     1         8       88%    0                 5
Optimized                  0     0         0       100%   0                 2
Section 3: Detect
Model Indicator            Met   Not Met   Total   %      Points Assigned   Possible Points
Ad-Hoc                     10    0         10      100%   3                 3
Defined                    10    0         10      100%   4                 4
Consistently Implemented   0     10        10      0%     0                 6
Managed and Measurable     0     12        12      0%     0                 5
Optimized                  0     7         7       0%     0                 2
Section 4: Respond
Model Indicator            Met   Not Met   Total   %      Points Assigned   Possible Points
Ad-Hoc                     12    0         12      100%   3                 3
Defined                    12    0         12      100%   4                 4
Consistently Implemented   10    3         13      77%    6                 6
Managed and Measurable     4     5         9       44%    0                 5
Optimized                  0     8         8       0%     0                 2
Section 5: Recover
Model Indicator            Met   Not Met   Total   %      Points Assigned   Possible Points
Ad-Hoc                     0     0         0       100%   3                 3
Defined                    2     0         2       100%   4                 4
Consistently Implemented   5     1         6       83%    6                 6
Managed and Measurable     2     1         3       67%    0                 5
Optimized                  0     0         0       100%   0                 2
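The tables above pair each maturity level with a fixed point weight (Ad-Hoc 3, Defined 4, Consistently Implemented 6, Managed and Measurable 5, and Optimized 2, for 20 possible points per section). The report does not spell out the assignment rule, but the data are consistent with a section earning a level's full weight only when the OIG credits that level, and with a section's rating being the highest level credited without a gap below it. The following sketch illustrates that reading; the function and variable names are ours, and the rule itself is an inference from the tables, not a formula stated in the report.

# Inferred scoring roll-up; the weights come from the "Possible Points"
# column above, while the bottom-up crediting rule is our interpretation.
WEIGHTS = [
    ("Ad-Hoc", 3),
    ("Defined", 4),
    ("Consistently Implemented", 6),
    ("Managed and Measurable", 5),
    ("Optimized", 2),
]
LEVEL_NAMES = [
    "LEVEL 1: Ad-Hoc",
    "LEVEL 2: Defined",
    "LEVEL 3: Consistently Implemented",
    "LEVEL 4: Managed and Measurable",
    "LEVEL 5: Optimized",
]

def section_rating(points_assigned):
    # Credit levels bottom-up; the rating is the last level whose full
    # weight was assigned before the first gap.
    score, rating = 0, "LEVEL 1: Ad-Hoc"
    for (level, weight), name in zip(WEIGHTS, LEVEL_NAMES):
        score += points_assigned.get(level, 0)
        if points_assigned.get(level, 0) < weight:
            break
        rating = name
    return rating, score

# Section 3 (Detect): only Ad-Hoc and Defined were fully credited.
print(section_rating({"Ad-Hoc": 3, "Defined": 4}))
# -> ('LEVEL 2: Defined', 7)

# Section 4 (Respond): credited through Consistently Implemented.
print(section_rating({"Ad-Hoc": 3, "Defined": 4, "Consistently Implemented": 6}))
# -> ('LEVEL 3: Consistently Implemented', 13)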
Appendix B
Information Security Reports Issued in FYs 2016 and 2015
The EPA OIG issued the following reports in FYs 2016 and 2015 that included
recommendations regarding different areas within the EPA's information security program:
•	Report No. 16-P-0006, EPA Needs to Improve Security Planning and Remediation of Identified Weaknesses in Systems Used to Protect Human Health and the Environment, dated October 14, 2015. We reported that the EPA's Xacta system (the EPA's official system for recording and maintaining information about the agency's compliance with mandated information system security requirements) was placed into service without complete and properly approved information system documentation. Additionally, EPA security personnel were not developing a required Plan of Action and Milestones in a timely manner to manage the remediation of known vulnerabilities, as required by agency guidance. We made five recommendations; EPA officials agreed with all five and completed corrective actions for four of them. The EPA plans to complete the last recommendation by December 31, 2016.
•	Report No. 15-P-0295, EPA Needs to Improve the Recognition and Administration of Cloud Services for the Office of Water's Permit Management Oversight System, dated September 24, 2015. We reported that the EPA's Office of Water did not follow EPA procedures for adopting cloud computing services when implementing the Permit Management Oversight System. We also reported that the lack of oversight of the Office of Water's Permit Management Oversight System contractor resulted in the oversight system being hosted in a cloud service provider's environment that did not comply with federal security requirements. We reported that there was no assurance that the EPA had access to the service provider's cloud environment for audit and investigative purposes. We also reported that the service provider's terms of service were not compliant with the Federal Risk and Authorization Management Program. We made seven recommendations, and the EPA agreed with them and indicated it would complete all corrective actions by May 2016. The agency's Management Audit Tracking System shows corrective actions completed for six of the seven recommendations; it did not identify the remaining corrective action as completed.
•	Report No. 15-P-0290, Incomplete Contractor Systems Inventory and a Lack of Oversight Limit EPA's Ability to Facilitate IT Governance, dated September 21, 2015. We reported that agency officials were unaware of which systems or services are required by the System Life Cycle Management Procedure to be included in the EPA's authoritative information system database, known as the Registry of EPA Applications, Models and Databases (READ). The READ inventory is important because it provides the tracking mechanism to ensure information technology investments receive the appropriate level of oversight. We reported that officials were unaware of which stage of the system life cycle to enter contractor systems into READ and, in cases where multiple offices manage separate components of the same contractor system, which program office is responsible for updating READ. As a result, we noted that:
o READ did not contain information on 22 contractor systems that are owned or operated on behalf of the EPA and are located outside of the agency's network.
o READ also lacked information on 81 internal EPA contractor-supported systems.
o Personnel with oversight responsibilities for contractor systems were not aware of the requirements outlined in EPA information security procedures.
We made five recommendations, and EPA officials agreed with all of them. The EPA completed corrective actions on four of the recommendations and plans to implement the last corrective action during FY 2017.
Appendix C
Distribution
Office of the Administrator
Chief Information Officer, Office of Environmental Information
Agency Follow-Up Official (the CFO)
Agency Follow-Up Coordinator
General Counsel
Associate Administrator for Congressional and Intergovernmental Relations
Associate Administrator for Public Affairs
Principal Deputy Assistant Administrator, Office of Environmental Information
Senior Agency Information Security Officer, Office of Environmental Information
Director, Office of Information Technology Operations, Office of Environmental Information
Audit Follow-Up Coordinator, Office of Environmental Information