U.S. ENVIRONMENTAL PROTECTION AGENCY (EPA) & MAJOR PARTNERS'
    LESSONS LEARNED FROM IMPLEMENTING EPA's PORTION OF THE
           AMERICAN RECOVERY AND REINVESTMENT ACT:
    FACTORS AFFECTING IMPLEMENTATION AND PROGRAM SUCCESS
      INFORMATION SYSTEMS DEVELOPMENT AND ENHANCEMENT
                         SEPTEMBER 2013
                         EPA-100-K-13-007
                          PREPARED FOR
                  U.S. ENVIRONMENTAL PROTECTION AGENCY
                   OFFICE OF THE CHIEF FINANCIAL OFFICER
                         WASHINGTON, DC

                                  ACKNOWLEDGEMENTS
        This study would not have been possible without the help and cooperation of the many
        U.S. Environmental Protection Agency (EPA) employees at Headquarters and Regional
        offices who agreed to be interviewed, state staff and funding recipients who
        participated in lively focus group sessions, and the many other EPA and state staff who
        graciously provided answers to follow-up questions after the interviews and focus
        groups were completed. The Science Applications International Corporation (SAIC) Team
        appreciates the time these individuals gave to share their experiences, on top of all the
        other audits and questions they had already responded to. The recollections of those
        'working in the trenches' during the intense period of American Recovery and
        Reinvestment Act (ARRA) implementation were invaluable to this study.
                               TABLE OF CONTENTS
EXECUTIVE SUMMARY	1
  PURPOSE	1
  METHODOLOGY	1
  FINDINGS	1
  RECOMMENDATIONS	1
SECTION 1.  INTRODUCTION	3
  1.1    PURPOSE/OBJECTIVES OF THIS STUDY	3
  1.2    BACKGROUND	4
    1.2.1    Data Flow between Stakeholders	6
    1.2.2    Data Quality Review Process	7
    1.2.3    Revising EPA Data Systems to Address ARRA	9
  1.3    STUDY QUESTIONS	10
SECTION 2.  METHODOLOGY	13
  2.1    DATA GATHERING	13
    2.1.1    Step 1: Collect, Review and Analyze Documents and Information Systems	13
    2.1.2    Step 2: Conduct Interviews with EPA Personnel	14
    2.1.3    Step 3: Analyze Collected Information	14
  2.2    STUDY LIMITATIONS	15
SECTION 3.  FINDINGS	17
  3.1    CHALLENGES	20
  3.2    SUCCESSFUL APPROACHES	24
  3.3    RECOMMENDATIONS/LESSONS LEARNED	27
  3.4    ESTABLISHED CHANGES MOVING FORWARD	28
  3.5    SAIC OBSERVATIONS	29
REFERENCES	31
                                       TABLES
TABLE 1. EPA INFORMATION SYSTEMS ENHANCED OR DEVELOPED TO SUPPORT ARRA	4
TABLE 2. EPA MANAGEMENT AND OVERSIGHT EXPENDITURES FOR ARRA SYSTEMS DEVELOPMENT AND
       ENHANCEMENT EFFORTS	10
TABLE 3. SYSTEM DEVELOPMENT AND ENHANCEMENT STUDY QUESTIONS	11
TABLE 4. NUMBER OF EPA INTERVIEWEES BY EPA INFORMATION SYSTEM	14
TABLE 5. SYSTEM DEVELOPMENT AND ENHANCEMENT STUDY QUESTIONS WITH BIG PICTURE FINDINGS	17
TABLE 6. SUMMARY OF CHALLENGES	22
TABLE 7. SUMMARY OF SUCCESSFUL APPROACHES	26



                                       FIGURES
FIGURE 1. GENERAL ARRA DATA FLOW CONCEPT CHART	7
FIGURE 2. REPORTING TIMELINE AND DATA QUALITY REVIEW ACTIVITIES FOR ARRA SECTION 1512 REPORTING
        TO FEDERALREPORTING.GOV	8
FIGURE 3. NUMBER OF SYSTEMS IMPACTED BY CHALLENGES ACCORDING TO INTERVIEWEES	23
FIGURE 4. NUMBER OF SYSTEMS IMPLEMENTING SUCCESSFUL APPROACHES ACCORDING TO INTERVIEWEES	27
FIGURE 5. MAP ILLUSTRATING EPA's PUBLICLY-AVAILABLE GEOSPATIAL DATA	29




APPENDIX 1: SYSTEMS ENHANCEMENT AND DEVELOPMENT EFFORTS TO EPA INFORMATION SYSTEMS
           SUPPORTING ARRA	APPENDIX 1-1
EXECUTIVE SUMMARY
PURPOSE

The American Recovery and Reinvestment Act (ARRA), enacted in 2009, required the U.S. Environmental
Protection Agency (EPA) to rapidly obligate its ARRA funds to many hundreds of projects in six
environmental programs across all fifty states, tribes, and territories and to satisfy the special reporting
requirements for oversight, accountability, and transparency. EPA contracted with Science Applications
International Corporation (SAIC) to assess EPA's information systems development and enhancement
efforts to implement ARRA.

METHODOLOGY

SAIC used interviews of EPA senior staff combined with additional information from literature and
databases to capture, verify, and analyze the critical lessons learned and successful approaches related to
EPA's system development and enhancement efforts. During the interviews, EPA respondents identified a
number of challenges, successful approaches, recommendations/lessons learned, and systems and
process changes that were initiated due to ARRA, but have had more expansive impacts to EPA's
programs beyond ARRA.

FINDINGS

More than half of the respondents identified the following two challenges: 1) unclear and evolving Office
of Management and Budget (OMB) and Recovery Accountability and Transparency Board (RAT Board) guidance, and 2)
the time constraints for ARRA implementation. The majority of the respondents identified successful
approaches that helped them meet the deadlines for ARRA implementation: 1) use existing funding or a
simple process for obtaining funding for systems modification, 2) use existing staff, 3) modify existing
systems, and 4) use existing contracts. Using existing personnel and infrastructure made it possible for
EPA to meet the aggressive ARRA implementation schedule.

Respondents described the process and system changes they made as a result of the ARRA program and
plan to maintain on a permanent basis. One EPA program established a systematic
process for future system updates. Other programs expanded system capabilities to improve data
management for the entire program, not just ARRA-related functions (e.g., providing states direct data
entry, expanding the use of funding recommendation templates, improving accounting consistency between
multiple funds management systems, and adding data elements to enable storage of estimated and actual
data). In addition to these process and system changes, which EPA offices made permanent, three
respondents noted that ARRA brought an increased focus on geospatial elements of data reporting.

RECOMMENDATIONS

Two of the recommendations proposed by the EPA interviewees were directly related to challenges that
they faced and might be possible to implement during future large-scale funding efforts.

    •    Respondents recommended providing additional lead time to allow for strategic planning related
        to systems changes. The short implementation schedule did not allow time for strategic planning.
    •    One EPA respondent strongly recommended that the responsible entities at all levels - from the
         highest management level to the lowest level (individual EPA offices) - define the systems
         requirements before starting the effort (and not change them when implementing the
         requirements). (During ARRA, defining the system requirements up front was not possible due to
         the ongoing changes to the OMB and RAT Board guidance and the changes made to the data
         requirements only a few weeks before the first reporting period.)
SECTION 1.    INTRODUCTION
In February of 2009, Congress passed the American Recovery and Reinvestment Act, aimed primarily at
creating new jobs and saving existing ones, stimulating economic activity and long-term growth, and fostering
accountability and transparency in government spending. Of the $787 billion authorized in the Recovery
Act, EPA was given $7.2 billion. EPA distributed the majority of its ARRA funds to states in grants and
contracts to support clean water and drinking water projects, diesel emissions reductions, leaking
underground storage tank clean-ups, Brownfields development, and Superfund clean-ups. This was a
massive undertaking for EPA. The administration of the funds, which were to be injected into the
economy at an unprecedented pace, required that EPA develop or revise policies, processes, and
automated information systems. In the Fall of 2011, EPA tasked Science Applications International
Corporation (SAIC), and its subcontractor Toeroek Associates, Inc., to design and conduct a study to
examine several components of EPA's implementation of ARRA. The SAIC Team studied three
management topics - Cost Estimating processes, Funds Management processes, and Systems
Enhancement and Development. The Team also looked at three topics geared more towards outcomes
than management processes: the Green Project Reserve initiative, the use of ARRA funds to
spur Innovative Technologies, and the use of ARRA funds to Leverage Local Economic Benefits. After
completion of the research phase, the SAIC Team produced a series of six reports, each covering one of
the six topics noted above. The Team also prepared a separate overarching summary report with an
Executive Summary, containing highlights of each of the six reports, as well as a  description of the goals
and methodology for the entire study.

1.1     PURPOSE/OBJECTIVES OF THIS STUDY

EPA tasked the SAIC Team to assess EPA's information systems development and enhancement efforts to
implement ARRA. The assessment focused on EPA-managed information systems that were enhanced or
developed to aid the Agency in the implementation of ARRA requirements. Information systems
developed by other entities, outside of EPA, were not included as part of the assessment.

The primary objective of this task was to capture, verify, and analyze the critical lessons learned,
successful approaches, and successful strategies related to EPA's system development and enhancement
efforts.

This report, one of six in a series, presents the SAIC Team's findings. The report is presented in the
following sections:

    •   Section 1. Introduction - provides an overview of what systems were included in the study and
        why EPA had to modify or develop systems in response to ARRA. This section also lists the study
        questions used by the SAIC Team to  frame and guide the study.
    •   Section 2. Methodology - provides the approach for the data collection and analysis used in the
        study. This section also describes what limitations were considered in the study and their impact
        on the study's findings.
    •   Section 3. Findings - provides the lessons learned and successful approaches as communicated
        by EPA staff.
1.2     BACKGROUND
  When Congress enacted ARRA and EPA was appropriated $7.2 billion in stimulus resources, it nearly
  doubled the Agency's annual budget. The challenge for EPA was to rapidly obligate its ARRA funds to
  many hundreds of projects in six environmental programs across all fifty states, tribes and territories. To
  accomplish this, EPA put in place an agency-wide executive level Stimulus Steering Committee (SSC) that
  provided a governance structure to ensure ARRA requirements were met. The SSC met frequently and
  reached out to all affected EPA Program Offices and stakeholders to adopt a number of measures, tools,
  and business process changes to ensure the timely obligation and expenditures of stimulus funds and to
  satisfy the special reporting requirements for oversight, accountability, and transparency (EPA, 2010a).

  It was recognized at the passing of ARRA that modifying existing information systems within EPA was key
  to meeting the spending, oversight, accountability, and reporting requirement  deadlines of ARRA. Given
  the tight timelines, when possible, EPA modified and enhanced existing information systems and
  accompanying guidance materials. In a few cases, EPA also developed new database tools as well as
  related guidance materials to best meet their oversight, accountability, and reporting requirements.

  Fourteen main EPA information systems were either developed or enhanced to accomplish the
  implementation of ARRA (see Table 1). A description of the general modifications made to the
  information systems is included in Appendix 1.
 TABLE 1. EPA INFORMATION SYSTEMS ENHANCED OR DEVELOPED TO SUPPORT ARRA

Office of Water Systems

  Drinking Water State Revolving Fund (Project and Benefits Reporting) (PBR) System - The Drinking
  Water State Revolving Fund Project and Benefits Reporting (PBR) system is used by EPA and State
  Drinking Water State Revolving Fund (DWSRF) programs to track and report on the environmental
  progress of the DWSRF program.

  Clean Water State Revolving Fund Benefits Reporting (CBR) System - The CBR database contains data
  provided by Clean Water State Revolving Fund (CWSRF) programs on the environmental benefits achieved
  by CWSRF assistance. The CBR system is used by EPA and states to track and report on the
  environmental progress of the CWSRF program.

  Watershed Assessment Tracking Environmental Results (WATERS) - The EPA Office of Water manages
  numerous programs in support of the Agency's water quality efforts. Many of these programs collect
  and store water quality related data in databases. These databases are managed by the individual
  Water Programs, and this separation often inhibits the integrated application of the data they
  contain. Under WATERS, the Water Program databases are connected to a larger framework.

Office of Environmental Information (OEI) System

  Central Data Exchange (CDX) 1512 Data Warehouse - Section 1512 of the Recovery Act requires
  recipient reporting to the Office of Management and Budget (OMB). OMB established a specific
  website, FederalReporting.gov, for receipt of all ARRA information. EPA extracts recipient reported
  data from FederalReporting.gov to its CDX "1512 Data Warehouse". EPA's CDX is the point of entry on
  the Environmental Information Exchange Network (Exchange Network) for environmental data
  submissions to the Agency.

Office of Air and Radiation (OAR) System

  Database for Reporting Innovative Vehicle Emission Reductions (DRIVER) - EPA established this new
  Oracle system to manage, analyze, and report Diesel Emission Reduction Program (DERA) programmatic
  and Recovery Act data.

Office of Superfund Remediation and Technology Innovation Systems

  Comprehensive Environmental Response, Compensation and Liability Information System (CERCLIS) -
  CERCLIS is the official reporting system for the Superfund program. CERCLIS identifies ARRA sites,
  the funding for ARRA sites, and ARRA activities within these sites.

  eFacts - eFacts is an internal data reporting and servicing resource tool for organizing and
  reporting data about specific Superfund sites. ARRA information from CERCLIS and EPA's financial
  system, the IFMS database (see below under Office of Chief Financial Officer systems), is extracted,
  tabulated, sorted, and presented in a variety of report formats.

Office of Underground Storage Tanks (OUST) System

  Leaking Underground Storage Tanks (LUST4) - LUST4, a new system, provides the ability for OUST to
  receive, store, process, and report its Recovery Act performance measures data as well as perform
  timely quarterly reporting. Data are provided by states in this web-based Oracle database.

Office of Solid Waste and Emergency Response (OSWER) Systems

  Assessment, Cleanup and Redevelopment Exchange System (ACRES) - ACRES is an analytical system used
  to support the Brownfields Program in meeting its mission by storing, tracking, reporting/querying,
  and sharing information related to Brownfields with other environmental programs throughout the
  Agency. ACRES is an online database for Brownfields grantees to electronically submit data directly
  to EPA through CDX.

  Performance Assessment Tool (PAT) - PAT pulls/reads ARRA data from OSWER's major systems (ACRES,
  RCRAInfo, CERCLIS) needed for specific performance measures. It allows manual input of some measures
  data that are not currently stored in an OSWER database. It also pulls/reads additional data from
  Agency-wide grant and financial systems and stores these data for analysis, tracking, comparison,
  and ad hoc reporting.

Office of Grants and Debarment (OGD) System

  Integrated Grants Management System (IGMS) - The Integrated Grants Management System (IGMS) is EPA's
  management information system for all EPA grant programs. This national system is used by
  Headquarters, Regions, and states to track, award, administer, and monitor grants.

Office of Chief Financial Officer (OCFO) Systems

  OCFO Reporting and Business Intelligence Tool (ORBIT), Executive Management Dashboard - EPA designed
  and maintains this enterprise-wide web-based interface to provide access to multiple data sources
  containing financial, budget, human resources, and performance information. Data are pulled from
  Agency budget and financial sources. This system provides reports to OMB and Agency managers for
  accountability and transparency of use of Recovery Act funds. This system provides reconciliation
  with recipient reported funding, reports to meet OMB's reporting requirements, quality assurance
  reports, and performance measures reports. ORBIT was also modified to provide data for offices to
  use to double check recipient reporting.

  Annual Commitment System (ACS) and Budget and Accounting System (BAS) - ACS is a performance module
  in the Budget and Accounting System (BAS) that tracks annual headquarters and Regional performance
  commitment information and results, Senior Management Measures, and Regionally-created measures.

  The Integrated Financial Management System (IFMS) - EPA designed the Integrated Financial Management
  System (IFMS) expressly for government financial accounting; it supports Government Accountability
  Office (GAO) requirements and OMB internal control requirements. It has recently (October 1, 2011)
  been replaced by a new EPA financial management system named COMPASS. These systems perform funds
  control from commitments through payment; update all ledgers and tables as transactions are
  processed; provide a standard means of data entry, edit, and inquiry; and provide a single set of
  reference and control files.



1.2.1   DATA FLOW BETWEEN STAKEHOLDERS

Information management needs for ARRA included entering, transferring, tracking, and managing data
between multiple stakeholder entities - EPA Program Offices and Regions, ARRA Prime Recipients (e.g.,
States) and Sub Recipients (e.g., contractors, municipalities), the Office of Management and Budget
(OMB), Congress, and the general public. EPA data systems were used to track and manage:

    •   Funds Distribution - the distribution of ARRA funds from EPA Program Offices and Regions to
        States and Tribes and to Recipients.

    •   Funds Management - Recovery Act obligations, expenditures, project requirements, performance
        measures, and project performance for both internal management use and for reporting to
        outside entities such as OMB, Congress, and the general public.

    •   Reporting and Verification - ensuring the accuracy and quality of data reported by ARRA Prime
        and Sub Recipients to OMB per Section 1512 requirements as well as internal and external
        metrics reporting. Transparency was a key requirement of ARRA.

Figure 1 provides a general data flow concept chart between EPA and outside stakeholders. ARRA funds
distributions were tracked using EPA Program Databases (e.g., IGMS). Many ARRA grant recipients directly
entered, or otherwise provided to  EPA, grant project performance data into EPA program databases. For
some ARRA recipients these data could also be retrieved from EPA program databases for upload to
FederalReporting.gov, avoiding duplicate data entry. EPA extracted Section 1512 recipient reported data
from FederalReporting.gov as part of its oversight and data verification responsibilities under ARRA (see
Data Quality Review Process discussion below). EPA Program Databases provided ARRA output to the
public through the EPA's web pages. In addition, EPA Program Databases provided ARRA data, including
project performance, funds obligations, and funds expenditures to other federal entities including OMB, the
General Services Administration (GSA), Congress, and the White House.
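To make the flow above concrete, the sketch below is a minimal, hypothetical illustration (not EPA code) of
how a single ARRA award record might pass through the stages shown in Figure 1: performance data entered
into an EPA program database, a quarterly Section 1512 report to FederalReporting.gov, and an EPA extract
of the recipient-reported data for verification. All class names, field names, and values are assumptions
made for illustration.

# Hypothetical sketch of the ARRA data flow in Figure 1 (illustrative only, not an EPA system).
from dataclasses import dataclass, field

@dataclass
class AwardRecord:
    """One ARRA award as tracked in an EPA program database (fields are assumed)."""
    award_id: str
    program_db: str          # e.g., "ACRES", "CERCLIS", "LUST4"
    obligated: float
    expended: float
    performance_notes: list = field(default_factory=list)

def recipient_enters_performance(record: AwardRecord, note: str) -> None:
    # Stage 1: recipients enter (or provide) performance data into EPA program databases.
    record.performance_notes.append(note)

def report_to_federalreporting(record: AwardRecord) -> dict:
    # Stage 2: quarterly Section 1512 report submitted to FederalReporting.gov
    # (web form, Excel spreadsheet, or XML upload in the actual process).
    return {
        "award_id": record.award_id,
        "total_obligated": record.obligated,
        "total_expended": record.expended,
        "project_status": record.performance_notes[-1] if record.performance_notes else "",
    }

def epa_extract_for_verification(submitted: dict, record: AwardRecord) -> bool:
    # Stage 3: EPA extracts the recipient-reported data and compares it with its
    # own program database as part of the data quality review (see Section 1.2.2).
    return abs(submitted["total_obligated"] - record.obligated) < 0.01

if __name__ == "__main__":
    rec = AwardRecord("EPA-ARRA-0001", "ACRES", 500000.0, 125000.0)
    recipient_enters_performance(rec, "Phase I site assessment complete")
    submission = report_to_federalreporting(rec)
    print("Matches EPA records:", epa_extract_for_verification(submission, rec))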
                FIGURE 1. GENERAL ARRA DATA FLOW CONCEPT CHART

[Figure 1 (flow chart) shows ARRA Prime and Sub Recipients entering performance data (in various formats,
including web data entry and Excel spreadsheets) into EPA program databases (CBR, WATERS, CDX/1512 Data
Warehouse, DRIVER, ACRES, CERCLIS, eFacts, LUST4, IGMS, ORBIT/Executive Management Dashboard, ACS, BAS,
and IFMS). Recipients also report quarterly per Section 1512 to FederalReporting.gov (via online web form,
Excel spreadsheet, or XML file upload), which feeds Recovery.gov for public access to the quarterly
reported data. EPA extracts the Section 1512 recipient-reported data from FederalReporting.gov for
verification, and the EPA program databases provide ARRA data to other federal entities (GSA, OMB,
Congress, and the White House).]
1.2.2  DATA QUALITY REVIEW PROCESS

The Recovery Act required recipients of ARRA funds to report information on funded projects and
activities on a quarterly basis. Recipient reporting was performed through data entry or import into an
OMB hosted website - FederalReporting.gov, with public access to the reported data through
Recovery.gov. The Recovery Act required EPA to conduct a limited review of recipient reported
information each quarter in accordance with Section 1512 of ARRA. EPA established a detailed procedure
for performing the review. The procedure applied to EPA organizations that administered and oversaw
the following EPA programs: Brownfields cooperative agreements, Leaking Underground Storage Tanks
assistance agreements, interagency agreements and contracts, Brownfields interagency agreements and
contracts, Diesel Emissions Reduction grants, Clean Water State Revolving Fund grants, Drinking Water
State Revolving Fund grants, Water Quality Management Planning grants, and Superfund contracts,
interagency agreements and state cooperative agreements. See Figure 2 for the quarterly reporting and
data quality review timeline (EPA 2010a).
       FIGURE 2. REPORTING TIMELINE AND DATA QUALITY REVIEW ACTIVITIES
          FOR ARRA SECTION 1512 REPORTING TO FEDERALREPORTING.GOV

[Figure 2 (timeline) shows the continuous reporting phase: prime recipients and sub recipients enter draft
reporting data during days 1-16 after the end of the quarter; prime recipients review the data submitted
by their sub recipients during days 17-19; this is followed by an agency daily extract, an agency "view
only" period, the agency review period, and the agency comment period, during which recipient report
adjustments are possible. Reports are unlocked during days 33-75, with no late submissions accepted during
that window.]

  Source: EPA 2010a

The procedure for EPA's limited review of Section 1512 recipient reported data focused on significant
errors (e.g., missing one or more of four specific data elements of major concern to OMB) and material
omissions (e.g., data provided is not responsive to a specific data element). Data were extracted by the
Tracking and Reporting Subcommittee (a subcommittee of EPA's Stimulus Steering Committee) from
FederalReporting.gov via an Extensible Markup Language (XML) data feed and stored in a data repository
commonly referred to as the "Section 1512 data warehouse". Macro (agency-wide) reviews of recipient
reported data  across all programs were performed, as were local reviews by National Program Managers
(NPMs) and Regions. Comparisons were made between the recipient reported data extracted from
FederalReporting.gov and EPA's master list of awards provided to OMB. Specific data elements (e.g.,
Amount of Award, Award date, Project Description, Total ARRA funds received/invoiced) were also
compared between what recipients reported against information housed in EPA's program databases
September 2013

-------
(e.g., IFMS, IGMS, ACRES, CBR/PBR, etc.) EPA and NMPs worked together to assist recipients in correcting
errors to ensure the highest quality data was reported to the public.
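For illustration only, the sketch below shows how such a limited review might be expressed in code: it
flags missing values in a few key data elements (in the spirit of the "significant error" check) and
compares recipient-reported values against the corresponding EPA program database record. The element
names, record formats, and rules are assumptions for illustration, not EPA's actual review criteria.

# Illustrative sketch of a limited data quality check (assumed field names and rules).

# Key data elements (assumed for illustration; Section 1.2.2 mentions elements such as
# Amount of Award, Award Date, Project Description, and Total ARRA funds received/invoiced).
KEY_ELEMENTS = ["amount_of_award", "award_date", "project_description", "funds_invoiced"]

def find_significant_errors(recipient_report: dict) -> list:
    """Return the key data elements that are missing or blank in a recipient report."""
    return [e for e in KEY_ELEMENTS
            if recipient_report.get(e) in (None, "", "N/A")]

def compare_with_program_database(recipient_report: dict, epa_record: dict) -> list:
    """Return key elements whose recipient-reported value differs from EPA's record."""
    return [e for e in KEY_ELEMENTS
            if e in epa_record and recipient_report.get(e) != epa_record[e]]

if __name__ == "__main__":
    reported = {"amount_of_award": 1_000_000, "award_date": "2009-06-30",
                "project_description": "", "funds_invoiced": 250_000}
    epa_db = {"amount_of_award": 1_000_000, "award_date": "2009-06-30",
              "project_description": "Drinking water main replacement",
              "funds_invoiced": 240_000}
    print("Missing key elements:", find_significant_errors(reported))
    print("Mismatched elements:", compare_with_program_database(reported, epa_db))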
1.2.3   REVISING  EPA DATA SYSTEMS TO ADDRESS ARRA

EPA Program Office subject matter experts (SMEs) working with the Stimulus Steering Committee and
using guidance provided by the Office of Management and Budget, and the Recovery Accountability and
Transparency Board (RAT Board), determined what data elements, data definitions, and system
modifications would be needed for each EPA program information system. The Recovery Act included a
Management and Oversight (M&O) budget for EPA to administer the implementation of ARRA. Some of
the M&O funds were used to enhance existing information systems or to develop new ones.

EPA's Office of Chief Financial Officer (OCFO) had direct oversight of the M&O funds. OCFO implemented
a proposal process for EPA offices to request and receive funds for their systems development or
enhancement efforts. Each Program  Office submitted its proposal funding request to OCFO. The proposal
requests included:

    •    Responsible National Program Manager
    •    Systems (IT) Program Name
    •    Existing or New Funding Vehicle
    •    Type of Spending (e.g., contract, grant)
    •    Recovery Act Purpose/System Need
    •    Description of Systems Development
    •    Contact Person(s)
    •    Funding Amount Requested (Excel format).
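As a purely illustrative aside, the proposal elements listed above could be captured in a simple
structured record such as the sketch below; the field names and sample values are assumptions and do not
reproduce OCFO's actual form.

# Hypothetical representation of an M&O funding proposal (illustrative field names only).
from dataclasses import dataclass

@dataclass
class SystemFundingProposal:
    national_program_manager: str   # Responsible National Program Manager
    it_program_name: str            # Systems (IT) Program Name
    funding_vehicle: str            # Existing or New Funding Vehicle
    spending_type: str              # Type of Spending (e.g., contract, grant)
    recovery_act_purpose: str       # Recovery Act Purpose / System Need
    development_description: str    # Description of Systems Development
    contacts: list                  # Contact Person(s)
    amount_requested: float         # Funding Amount Requested

# Example values drawn loosely from Table 2 (CBR received $97,500 of M&O funds).
example = SystemFundingProposal(
    national_program_manager="Office of Water",
    it_program_name="CWSRF Benefits Reporting (CBR)",
    funding_vehicle="Existing",
    spending_type="contract",
    recovery_act_purpose="Track ARRA project and benefits data",
    development_description="Add ARRA data elements and reports",
    contacts=["Program system manager"],
    amount_requested=97500.0,
)
print(example.it_program_name, example.amount_requested)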

EPA invested more than $1.6 million of M&O funds in modifying and developing information management
systems to effectively implement ARRA. Table 2  provides a breakdown of the M&O funds invested in each
system.
 TABLE 2. EPA MANAGEMENT AND OVERSIGHT EXPENDITURES FOR ARRA SYSTEMS
                     DEVELOPMENT AND ENHANCEMENT EFFORTS

NPM OFFICE   RECOVERY ACT PROGRAM / SYSTEM                                    BUDGETED AMOUNT

OW           ARRA SRF Reporting & Data Quality Monitoring System
                 PBR                                                          $195,000
                 CBR                                                          $97,500
                 Subtotal                                                     $292,500

OSWER        Superfund Remedial
                 CERCLIS (OSWER)                                              $125,000
                 e-Facts                                                      $50,000
                 Subtotal                                                     $175,000

OSWER        LUST
                 LUST4 (part of OSWER's Performance Assessment Tool [PAT])    $50,000
                 Subtotal                                                     $50,000

OAR          DERA - Diesel Grants
                 DRIVER                                                       $200,000
                 Subtotal                                                     $200,000

OCFO         Overall Reporting & Management
                 ORBIT, Executive Management Dashboard                        $370,000
                 Subtotal                                                     $370,000

OCFO         Overall Reporting & Management
                 Budget Automation System / Annual Commitment System          $42,000
                 Subtotal                                                     $42,000

OEI          Overall Reporting & Management
                 Central Data Exchange (CDX) / Section 1512 Data Warehouse    $500,000
                 Subtotal                                                     $500,000

Total                                                                         $1,629,500
Most of the system development and enhancement efforts used existing IT support contract vehicles to
implement system revisions. Standard EPA protocols for modifying systems were followed (e.g., updating
configuration management manuals, preparing data schemas, and updating data element dictionaries).
Appendix 1 provides a summary of the system changes that were performed for each database system.

1.3     STUDY QUESTIONS

Based on the purpose and objectives of the study as described in Section 1.1, the SAIC Team and EPA
developed questions to frame and guide the effort. Table 3 presents the study questions. The
findings are presented in Section 3 of this report.
       TABLE 3. SYSTEM DEVELOPMENT AND ENHANCEMENT STUDY QUESTIONS

OVERARCHING STUDY QUESTION - FACTORS FOR SUCCESS
Factors for Success: What factors contributed to developing or modifying systems that met the needed
data requirements of ARRA?

  Focus Area: Responsibility for System Changes
    Staffing: How did the characteristics of the staffing team(s) (e.g., program expertise, seniority,
    authority, program office representativeness) responsible for developing and modifying data systems
    facilitate or impact the outcome of meeting the data requirements of ARRA?
    Coordination: How did responsible parties and program offices within EPA coordinate system
    development or modification efforts?

  Focus Area: Processes to Implement Systems Changes
    Process: What were EPA's processes for implementing system development or modification efforts that
    allowed EPA to meet the schedule and requirements of ARRA?

OVERARCHING STUDY QUESTION - SYSTEM DEVELOPMENT/MODIFICATION CHALLENGES
System Development/Modification Challenges: What were the challenges to developing or modifying EPA's
information management systems? What factors created challenges to developing or modifying systems?

  Focus Area: Identification of Needed Systems and System Modifications
    Identification Process: What were the challenges in identifying what information systems needed to
    be modified or developed to meet ARRA requirements?
    Identification Challenges: What were the challenges to meeting internal deadlines for identifying
    system revisions or development of new systems?

  Focus Area: Funding Mechanisms
    Funding Source: How were system development/modification efforts funded?
    Funding Process: What was the process for securing funding?
    Funding Challenges: What were the challenges, if any, to obtaining funding?

  Focus Area: Implementation of System Development/Modifications
    Responsibility for Implementation: What skills and authority level were required of the staff
    responsible for implementing changes to EPA's information systems? What additional skills may have
    made the process easier?
    Implementation Schedule: What were the challenges of meeting the implementation schedule for
    ensuring ARRA data needs were met?
    Implementation Challenges: What types of problems were encountered when modifying existing
    information systems and why? How did EPA overcome them? How did EPA manage conflicting or evolving
    data system requirements that delayed or otherwise impacted system development/enhancement efforts?

OVERARCHING STUDY QUESTION - LEVERAGING FOR FUTURE SYSTEMS DEVELOPMENT
Leveraging for Future Systems Development: How well did systems development and modification efforts
succeed in meeting their ARRA objectives? Which ARRA information management process changes have been
and should be leveraged for future use? Are any system development changes applicable to multiple EPA
program systems?

  Focus Area: Benefits of System Development/Modifications
    Leveraging for Other Purposes (i.e., non-ARRA)1: How might system modifications that were made to
    support ARRA implementation be useful for other purposes? Were the modifications unattainable
    without the recovery funds?
    Other Benefits (i.e., non-ARRA): What other beneficial changes attained through the system
    modifications (e.g., new views on data available due to enhanced data reporting, EPA/State data
    sharing, or increased transparency of data to public/state/regional/headquarters entities) should be
    made permanent?
    Successful Approaches: With regard to ARRA system development and modification efforts, what
    strategies (e.g., staffing approach, communication strategy, planning strategy) do staff consider
    successful approaches?

1 EPA did not use ARRA funds for any non-ARRA purposes. This question addresses whether systems
changes made to implement ARRA also facilitated the implementation of non-ARRA EPA programs.
SECTION 2.   METHODOLOGY
The methodology for this study primarily focused on interviews of EPA senior staff combined with
additional information from literature and databases to address the study questions. The majority of the
information was gathered from EPA staff interviews.

The SAIC Team implemented the study methodology in the following steps, which are described in more
detail below:

    1.   Gathered, compiled, and analyzed existing information such as previous studies, system
        documentation (e.g., data element dictionaries, data verification procedures).
    2.   Interviewed EPA personnel representing all fourteen information systems that were either
        modified or developed in response to ARRA.
    3.   Analyzed the results of the interviews and integrated these with information found in the
        analysis of existing data.
    4.   Prepared results from the above analysis (see Section 3. Findings).

2.1     DATA GATHERING

The SAIC Team started the study by reviewing background information and examining databases, systems
documentation, data quality procedures, and guidance materials developed by EPA. A
list of reference documents for this study is included at the end of this report.
2.1.1   STEP 1:  COLLECT,  REVIEW AND ANALYZE  DOCUMENTS AND INFORMATION
SYSTEMS
The SAIC Team reviewed and analyzed information and data from the following sources:
    •   Information system supporting documentation: The SAIC Team reviewed many information
        systems documents to include systems descriptions, data element dictionaries, data schemas,
        and reports/outputs from databases. Other documentation was also reviewed and analyzed,
        such as funding requests/proposals submitted to OCFO for system development/enhancement
        efforts specific to ARRA. Other supporting documents included EPA procedures for verifying
        recipient reported data.
    •   Access to  EPA databases: The SAIC team was granted access to and was provided data from
        many of the EPA data systems included in this study. The SAIC Team was able to view and
        generate reports relevant to ARRA funding distribution, project management, and data
        verification to gain an understanding of the system  and the revisions made to accommodate
        ARRA data requirements.
    •   Existing studies: EPA's activities related to ARRA,  including ensuring data quality, have been
        reviewed within EPA by the Office of Inspector General (OIG) and by outside agencies such as the
        U.S. Government Accountability Office. The SAIC Team reviewed these reports and extracted
        relevant information on improvements, lessons learned, and successful approaches as they relate
        to systems development and enhancement efforts, and integrated that information into this report as
        appropriate.
2.1.2   STEP 2:  CONDUCT INTERVIEWS WITH EPA PERSONNEL
The primary source of information that the SAIC Team used for this study was interviews with the EPA
Subject Matter Experts (SMEs) who were responsible for the overall oversight and management of the
system modification or development efforts. The SAIC Team first developed a list of key EPA personnel to
interview, to cover the fourteen EPA information systems (listed  in Table 1) that were modified or
developed to meet ARRA data needs. Participation was voluntary and some staff were not available for
interviews due to scheduling difficulties. In most cases, prior to an interview, the SAIC Team provided the
interviewee with the study questions and  provided a brief overview of the purpose of the study. Most
interviews were approximately one hour in length. During a few EPA interviews, an EPA Office of the Chief
Financial Officer representative was present. The interviewees were specifically asked and agreed to the
OCFO representative's presence.

In all, the SAIC Team interviewed 27 EPA personnel who had a role in enhancing or developing information
systems to meet the data requirements and needs of ARRA (see Table 4). The EPA staff interviewed served in
various support roles, from management to staff level support, in modifying or developing the database
systems that were included in the study.

       TABLE 4. NUMBER OF EPA INTERVIEWEES BY EPA INFORMATION SYSTEM

EPA INFORMATION SYSTEM / TOPIC        INTERVIEWS CONDUCTED
PBR/CBR/WATERS                        1
CDX                                   1
DRIVER                                3
ACRES                                 1
CERCLIS/eFacts                        5
LUST4                                 5
PAT                                   2
IGMS                                  1
ORBIT                                 1
ACS/BAS                               4
IFMS                                  1
QA Process                            1
Funding Process                       1
Total                                 27
2.1.3   STEP 3: ANALYZE COLLECTED INFORMATION

The SAIC Team analyzed the information collected in Steps 1 and 2 above. Information from existing
studies as well as interview responses was aggregated, summarized, categorized and analyzed to identify
challenges, lessons learned, or successful approaches. Specifically, the SAIC Team reviewed the
interviewees' responses that identified:

    •   Commonalities in interview responses with regards to implementation challenges for system
        modifications
    •   Accuracy and helpfulness of guidance materials from OMB, OEI, or other oversight entities (for
        the program offices that needed to modify existing systems)
    •   Similarities in problems encountered with regard to:
            o   Funding information system development/modification efforts;
            o   Meeting internal deadlines for information system roll-out;
            o   Coordination of data and related guidance to Regions and states

    •   Benefits resulting from system modifications/development efforts that could be leveraged for
        future Agency-wide programs.
2.2     STUDY LIMITATIONS

The following limitations are noted with regard to this study:

    •   Inherent uncertainty in the collection of subjective information. The interview process
        introduces uncertainty because it relies on subjective information provided by individuals
        relaying recollections of activities conducted three years earlier. However, these recollections
        are part of the institutional knowledge created during ARRA implementation.
    •   Staff turnover. The interviews of some key staff were not possible due to a change in their
        position or employment status (e.g., retirement). Thus, in a few cases, the interviewee did  not
        have first-hand knowledge of all of the initial processes involved in modifying or developing
        systems for ARRA implementation.
    •   Information systems outside of the scope of study. This study, by design, focused only on
        information systems within EPA. Other information systems owned by states, tribes, or other
        federal agencies were not considered.
SECTION 3.   FINDINGS
During the interviews, EPA respondents identified a number of challenges, successful approaches,
recommendations/lessons learned, and systems and process changes that were initiated due to ARRA,
but have had more permanent and expansive impacts to EPA's programs.

In addition to information gleaned from the interviews, this section contains references and quotes from
previous studies and reviews conducted by EPA, which are included as examples to illustrate how other
studies and reviews corroborate the findings from the interviews conducted as part of this study.

Table 5 summarizes big picture findings for each study question. The table mirrors the study questions
table in Section 1 (Table 3) and adds the big picture finding for each question. The big picture
findings are based on interviews with EPA Headquarters staff responsible for managing and implementing
changes to EPA systems in response to ARRA data needs. The text following the table provides a more
thorough discussion of the findings and is presented in five sections - challenges, successful approaches,
recommendations/lessons learned, established changes moving forward, and observations made by the
SAIC Team.

TABLE 5. SYSTEM DEVELOPMENT AND ENHANCEMENT STUDY QUESTIONS WITH BIG
                                    PICTURE FINDINGS

OVERARCHING STUDY QUESTION - FACTORS FOR SUCCESS
Factors for Success: What factors contributed to developing or modifying systems that met the needed
data requirements of ARRA?

Staffing: How did the characteristics of the staffing team(s) (e.g., program expertise, seniority,
authority, program office representativeness) responsible for developing and modifying data systems
facilitate or impact the outcome of meeting the data requirements of ARRA?
    Big picture finding: EPA used existing staff and existing contractors who already had the knowledge
    and expertise in their systems to facilitate and ensure systems modifications were done effectively
    and in the required timeframe.

Coordination: How did responsible parties and program offices within EPA coordinate system development
or modification efforts?
    Big picture finding: EPA program offices maintained communications primarily through frequent
    meetings and coordination with all the stakeholders. EPA staff used the meetings, which involved
    staff across multiple disciplines and multiple offices, as a mechanism to discuss issues and
    solutions. The meetings also resulted in collaborative efforts and relationship building across the
    agency, which were needed for the systems modifications.

Process: What were EPA's processes for implementing system development or modification efforts that
allowed EPA to meet the schedule and requirements of ARRA?
    Big picture finding: Use of existing processes, staff, and contractors, modification of existing
    systems (rather than developing new systems), and constant and regular communication and
    collaboration were the common factors that allowed EPA to meet the schedule and system requirements
    of ARRA.

OVERARCHING STUDY QUESTION - SYSTEM DEVELOPMENT/MODIFICATION CHALLENGES
System Development/Modification Challenges: What were the challenges to developing or modifying EPA's
information management systems? What factors created challenges to developing or modifying systems?

Identification Process: What were the challenges in identifying what information systems needed to be
modified or developed to meet ARRA requirements?
    Big picture finding: This study did not find challenges in identifying the systems to be modified.

Identification Challenges: What were the challenges to meeting internal deadlines for identifying system
revisions or development of new systems?
    Big picture finding: The challenges in identifying the systems modifications that were needed were a
    result of ongoing changes to the data needs and schema from the RAT Board. In the early stages of
    determining ARRA data requirements, additional coordination efforts could have been undertaken
    between data stakeholders (OMB, EPA program offices, OCFO, Regions, and States) to examine
    longer-term ARRA data needs.

Funding Source: How were system development/modification efforts funded?
    Big picture finding: Funding came from the ARRA Management and Oversight funds administered by the
    OCFO.

Funding Process: What was the process for securing funding?
    Big picture finding: The OCFO implemented its standard proposal process, with which EPA program staff
    were familiar. Program offices submitted funding requests that included the system
    enhancements/modifications needed.

Funding Challenges: What were the challenges, if any, to obtaining funding?
    Big picture finding: Generally there were no challenges to obtaining the funding, although one EPA
    office thought the process could have been more transparent.

Responsibility for Implementation: What skills and authority level were required of the staff
responsible for implementing changes to EPA's information systems? What additional skills may have made
the process easier?
    Big picture finding: Overall, EPA used existing staff and existing contractors who already had the
    knowledge and necessary authority to facilitate and ensure systems modifications were done
    effectively and in the required timeframe. This study did not find additional skills that would have
    made the process easier.

Implementation Schedule: What were the challenges of meeting the implementation schedule for ensuring
ARRA data needs were met?
    Big picture finding: Challenges included the short timeframe to implement modifications to meet the
    deadlines (time constraints); unclear and evolving guidance from the RAT Board and OMB made the
    deadlines more difficult to meet since clarifications were needed before the systems could be
    revised. EPA addressed these challenges through the dedication and increased workload of the
    existing staff assigned to ARRA implementation and through the continuous communication and
    collaboration efforts (described under Factors for Success above).

Implementation Challenges: What types of problems were encountered when modifying existing information
systems and why?
    Big picture finding: Taking existing staff from their existing responsibilities and dedicating them
    to ARRA implementation created a challenge of having sufficient staff to carry out the regular
    program responsibilities. In addition, data requirements were not in place with sufficient lead
    time; the RAT Board was slow to provide EPA with needed data requirements, and often those
    requirements were evolving while system modifications were ongoing.

How did EPA overcome them?
    Big picture finding: Some EPA programs addressed staffing challenges by using other (sometimes less
    experienced) staff to implement the regular program responsibilities.

How did EPA manage conflicting or evolving data system requirements that delayed or otherwise impacted
system development/enhancement efforts?
    Big picture finding: To address evolving data requirements, EPA staff worked long hours to meet an
    already short deadline to implement system revisions.

OVERARCHING STUDY QUESTION - LEVERAGING FOR FUTURE SYSTEMS DEVELOPMENT
Leveraging for Future Systems Development: How well did systems development and modification efforts
succeed in meeting their ARRA objectives? Which ARRA information management process changes have been
and should be leveraged for future use? Are any system development changes applicable to multiple EPA
program systems?

Leveraging for Other Purposes (i.e., non-ARRA)1: How might system modifications that were made to
support ARRA implementation be useful for other purposes?
    Big picture finding: Grant programs will benefit from the development of a standard funding
    recommendation template. All programs could benefit from web-based data entry by states into program
    information systems.

Were the modifications unattainable without the recovery funds?
    Big picture finding: Although it is possible that these new processes would have been attainable
    without ARRA, they were primarily undertaken to expedite the obligation and expenditure of ARRA
    funds.

Other Benefits (i.e., non-ARRA): What beneficial changes attained through the system modifications
(e.g., new views on data available due to enhanced data reporting, EPA/State data sharing, or increased
transparency of data to public/state/regional/headquarters entities) should be made permanent?
    Big picture finding: Three EPA offices implemented processes that became permanent: an established
    process for making future systems modifications; an established process for State entry of data; and
    an established funding recommendation template.

Successful Approaches: With regard to ARRA system development and modification efforts, what strategies
(e.g., staffing approach, communication strategy, planning strategy) do staff consider successful
approaches?
    Big picture finding: As described above, the factors considered to be successful approaches are the
    use of existing staff, contractors, and systems, and continual communication and collaboration.

1 EPA did not use ARRA funds for any non-ARRA purposes. This question addresses whether systems
changes made to implement ARRA also facilitated the implementation of non-ARRA EPA programs.
3.1     CHALLENGES

The following provides an overview of the major challenges that the SAIC Team heard consistently from the
EPA systems interviewees or that were specifically mentioned by an individual EPA interviewee:
      "A challenge  all programs faced in administering ARRA  resources was ensuring  the
      success of our state  partners and  recipients in meeting newly established  deadlines,
      policies and business rules. The fast creation and an "on-the-fly" approach to  rolling out
      FederalReporting.gov help desk support and guidance in general created an unnecessary
      tension with our recipients. If given the opportunity for a re-start, EPA would advocate
      for early interaction with recipients by OMB and the RATB."

                  Quoted from EPA New Ways Evaluation Responses, October 2010
Data Requirements Not in Place with Sufficient Lead Time: One EPA respondent reported that ARRA data
requirements were not in place in time to prepare and test procedures for receiving and manipulating
ARRA data. He noted that it was unclear what data would be on the federal reporting web site, how the
data would be organized, and how a person would access the data until about four weeks prior to the first
data reporting period. EPA did not receive the finalized data schema (the format in which the data would
be supplied) until just two weeks before the first reporting period. Moreover, EPA did not receive the
test schema from the RAT Board, which EPA needed in order to test its procedures for receiving and
analyzing the data, until just a few days before the first reporting period started.

Unclear and Evolving RAT Board and OMB Guidance: The Office of Management and Budget and the RAT
Board issued guidance for recipients and the EPA  programs. Several EPA respondents indicated that the
guidance from OMB and the RAT Board contained issues that resulted in implementation  inefficiencies.
The EPA respondents also noted that the RAT Board and OMB guidance was unclear and was continually
evolving, even while the EPA program staff were working on the system modifications. Respondents also
indicated that OMB's priorities changed once implementation began (e.g., a Financial Operations Report
was only required for two reporting quarters, and environmental performance results were not updated on
the OMB website).

Ongoing Changes to the Data Needs and Schema: Several EPA respondents agreed that the changing RAT
Board data requests created a large challenge. One EPA respondent described a situation in which EPA
had prepared its systems, and then, without prior notice, the RAT Board changed the data schema.3 For up
to three reporting periods, changes to the data schema caused problems with the EPA systems. EPA's
systems semi-automatically pull data from the FederalReporting.gov system. EPA did not know that
changes had been made to the schema until the data pull failed. EPA then reviewed the data linkages to
determine the problem. During this review, EPA discovered that the data schema had changed.

3 A database schema is the structure of a database system described in a formal language supported by the
database management system (DBMS); it refers to the organization of data and serves as a blueprint of how
the database is constructed (divided into tables).

Nontransparent Process to Obtain Funding for Systems Modifications: One EPA respondent stated that
the process for funding the system modifications implemented by the Office of the Chief Financial Officer
was not transparent. Although OCFO did provide the form for submitting funding proposals, they did not
provide specific information or criteria as to how the funding would be allocated. Further, another
respondent indicated that in the early stages of identifying system changes and data needs, the finance
and program staff would have benefitted from considering long-term data needs beyond the immediate
data requirements as determined by the RAT Board.

One EPA respondent had a misunderstanding about the contract policy, and the existing systems support
contract was undergoing a re-compete, which would have complicated the process. Therefore, this EPA
program office opted not to request funding because it believed it would need to develop a new
contract and the timeframe would not allow that. As a further note, this EPA program office respondent
stated that the dollars needed for implementing ARRA-related modifications were inconsequential
compared to the normal budgets for systems maintenance and modification.

Time Constraints to Implement Changes: Most respondents described how the short ARRA
implementation deadlines affected their systems enhancement efforts. One respondent stated that the
time pressure was the biggest problem. Another respondent noted that with the short timeframe, staff did
not have sufficient time to think through the changes that needed to be made, and many changes were
done 'on the fly.' Some of the decisions made in haste had to be revisited later, and because these types of
changes had never been made before, there were no predictable outcomes to the process. A third
respondent noted that, due to the short timeframe, their system had quite a few bugs and problems; as a
result, the program office spent a large amount of money in the past fiscal year (i.e., Fiscal Year 2012) to
address these problems and make improvements. In another office, the EPA employee who established
the policies and groundwork for the systems changes left, and his successor had to live with his
predecessor's mistakes, unable to revisit or revise those decisions due to the time constraints.

Another respondent noted that the time constraints for ARRA implementation required that ARRA system
changes be made at the same time as the roll-out of the new EPA Acquisition System (EAS). During FY
2010, EPA was migrating data from the Integrated Contracts Management System (ICMS) to EAS. Like its
predecessor ICMS, EAS provided summary procurement data to the Federal Procurement Data System
Next Generation (FPDS-NG), including information on ARRA-funded actions. The rollout of the EAS at this
time greatly complicated reconciliations among data in the EAS, EPA's Integrated Financial Management
System (IFMS), and FPDS-NG.
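
For illustration, the sketch below compares obligation totals for the same contract action across two
extracts, the kind of reconciliation described above; the record layout, field names, and document
numbers are hypothetical and greatly simplified relative to the actual EAS, IFMS, and FPDS-NG data.

    # Sketch: reconcile obligation totals reported for the same contract action by two
    # systems. The record layout and document numbers below are hypothetical.
    from collections import defaultdict

    def totals_by_document(records):
        """Sum obligation amounts per contract/document number."""
        totals = defaultdict(float)
        for rec in records:
            totals[rec["document_number"]] += rec["obligation"]
        return totals

    def reconcile(extract_a, extract_b, tolerance=0.01):
        """Return document numbers whose totals disagree between the two extracts."""
        a, b = totals_by_document(extract_a), totals_by_document(extract_b)
        mismatches = {}
        for doc in set(a) | set(b):
            if abs(a.get(doc, 0.0) - b.get(doc, 0.0)) > tolerance:
                mismatches[doc] = (a.get(doc, 0.0), b.get(doc, 0.0))
        return mismatches

    eas_extract = [{"document_number": "EP-10-C-0001", "obligation": 125000.00}]
    fpds_extract = [{"document_number": "EP-10-C-0001", "obligation": 120000.00}]
    print(reconcile(eas_extract, fpds_extract))  # {'EP-10-C-0001': (125000.0, 120000.0)}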

Increased Staff Workload: Seven respondents noted that the responsibilities associated with
implementing ARRA systems requirements affected their existing staff's ability to complete their regularly
assigned responsibilities. For example, one program respondent noted that creating the ARRA reports
came at the expense of creating other reports that the program may have needed. Another EPA
respondent noted that their staff, who worked on the ARRA systems enhancement efforts, also managed
the program hotline, and because of the intense effort needed for the ARRA systems enhancement, they
probably took longer to address hotline issues. Finally, another EPA program respondent noted that the
most difficult challenge associated with the ARRA systems enhancement was coordinating the system
revision deadlines with other program priorities, such as data reporting, new initiatives, and other job
functions. In response to this challenge, several respondents stated that EPA used less experienced staff
to cover the workload of more experienced staff.

Inefficiency Due to Higher-Level Decision-Making: EPA respondents noted that the level of decision-
making related to ARRA implementation was higher than usual, creating a challenge for staff. As
examples, EPA respondents provided the following comments:

    •   The process for enhancing program systems involved a higher management level of decision-making
        authority than was necessary, primarily because of the potential risk of Inspector General
        involvement. Senior management personnel perceived career risks, so decision-making authority
        was elevated several organizational levels above the typical level. This higher level of decision-making
        sometimes affected the ability of EPA staff to move forward and implement the required
        modifications.
    •   The level of management oversight of the system runs for the quarterly reporting data was
        burdensome to EPA program management staff, EPA Regions, and grantees.
    •   Because ARRA was a highly visible, highly politicized effort, everyone was very reactive and did
        not communicate effectively at the Federal agency level (i.e., Federal agency to agency).

Table 6 summarizes the challenges described by the EPA systems interviewees that the SAIC Team heard
consistently or that were specifically mentioned by an EPA interviewee. Figure 3 indicates how many of
the 14 total systems were impacted by each challenge according to the EPA interviewees.

                          TABLE 6. SUMMARY OF CHALLENGES
    •   Data Requirements Not in Place with Sufficient Lead Time
    •   Unclear and Evolving RAT Board and OMB Guidance
    •   Ongoing Changes to the Data Needs and Schema
    •   Nontransparent Process to Obtain Funding for Systems Modifications
    •   Time Constraints to Implement Changes
    •   Increased Staff Workload
    •   Inefficiency Due to Higher-Level Decision-Making
             FIGURE 3. NUMBER OF SYSTEMS IMPACTED BY CHALLENGES
                             ACCORDING TO INTERVIEWEES
[Bar chart omitted: for each challenge listed in Table 6, the chart shows the number of impacted systems
(x-axis: Number of Impacted Systems, scale 0 to 14).]
The following two comments were provided as follow-up to one of the EPA interviews. The comments
both deal with inconsistent or missing data elements within non-EPA systems that caused challenges for
EPA staff. These data element issues are not represented in the graph above because they were problems
with non-EPA systems.

Problems with the Federal Procurement Data System - Next Generation: The EPA and all other Federal
agencies must submit detailed information regarding contract actions to the General Services
Administration's FPDS-NG. The data elements used to identify which contract actions were ARRA-funded
were inconsistent. Three data element problems are described below:

    •   The data identification method used at the start of ARRA required the Description of
        Requirement data element to begin with a prescribed format (e.g., TAS::688195::TAS). If that
        format was not followed to the exact letter, contract actions would not show up in the FPDS-NG's
        ARRA Report (a format check along these lines is sketched after this list).
    •   In March 2010, OMB began requiring all agencies to complete the Treasury Account Symbol
        (TAS) Code data element for all contract actions. However, the TAS Code (68 0108) given to EPA
        to track ARRA M&O funds had already been used for over a decade for non-ARRA M&O funds.
        The Agency used manual fixes for about three months to ensure that EPA's ARRA-funded M&O
        contract actions were coded properly. Other Federal agencies also found this to be a problem.
    •   Starting in July 2010, OMB added a new data element, the Initiative field, which needed to be
        marked "ARRA" for contract actions with ARRA funding and left blank on other actions. This
        particular modification posed a problem for EPA because the EPA Acquisition System (EAS),
        which is used to upload data to FPDS-NG, did not have the capability to allow a particular TAS
        Code to be coded as ARRA for one action but non-ARRA for another. Some other Federal
        agencies also had problems with this.
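
The sketch below illustrates how unforgiving a prefix rule of this kind is. The regular expression is an
assumption based only on the example prefix quoted above; it is not the actual FPDS-NG validation logic.

    # Sketch: check whether the Description of Requirement begins with the prescribed
    # "TAS::<code>::TAS" prefix. The pattern is inferred from the example in this report
    # and is not the actual FPDS-NG rule.
    import re

    PREFIX_PATTERN = re.compile(r"^TAS::\s*\d{2}\s*\d{4}\s*::TAS")

    def flagged_as_arra(description_of_requirement):
        """Return True only if the prescribed ARRA prefix is present verbatim."""
        return bool(PREFIX_PATTERN.match(description_of_requirement))

    print(flagged_as_arra("TAS::68 0108::TAS Recovery Act cleanup services"))   # True
    print(flagged_as_arra("TAS: 68 0108 :TAS Recovery Act cleanup services"))   # False - drops out of the ARRA Report
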
Problems with the FederalReporting.gov System: One respondent noted that every oversight/audit
group that reviewed EPA's Section 1512 Recipient Reporting process concluded that the
FederalReporting.gov system should have had the capability (i.e., a data element) to separately document
that a Contracting Officer Representative (COR), Contracting Officer (CO), and ARRA coordinator had
reviewed a specific report. The only capability was to either mark a report as reviewed without comment
or post written comments. EPA spent additional time and effort reaching out to the required reviewers to
ensure they were aware of their responsibilities. Although this manual method of communicating with the
reviewers worked, the respondent believed that it would have been preferable to program the system to
allow documentation for each level of review.
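
A minimal sketch of the kind of per-reviewer data element the respondent described is shown below. The
reviewer roles come from the paragraph above, but the structure and field names are hypothetical and are
not part of FederalReporting.gov.

    # Sketch: a per-role review record of the kind respondents wished
    # FederalReporting.gov had supported. Field names are hypothetical.
    from dataclasses import dataclass, field
    from datetime import date

    REQUIRED_REVIEWERS = ("COR", "CO", "ARRA Coordinator")

    @dataclass
    class RecipientReportReview:
        report_id: str
        reviews: dict = field(default_factory=lambda: {r: None for r in REQUIRED_REVIEWERS})

        def record_review(self, role, when):
            if role not in self.reviews:
                raise ValueError("Unknown reviewer role: " + role)
            self.reviews[role] = when

        def fully_reviewed(self):
            return all(self.reviews.values())

    review = RecipientReportReview("1512-2009-Q4-0001")
    review.record_review("COR", date(2009, 10, 12))
    print(review.fully_reviewed())  # False until the CO and ARRA Coordinator also sign off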

3.2     SUCCESSFUL APPROACHES

The following provides an overview of the successful approaches described by the EPA interviewees.

Use Existing Funding or a Simple Process for Obtaining Funding for Systems Modifications: Most EPA
respondents noted that they had no difficulties with the OCFO process (described in Section 1) to obtain
funding and received the amounts they requested for systems development. The EPA respondents'
reasons for this were the following:

    •   The OCFO process was similar to the process they typically use for requesting funding for system
        modifications.
    •   All of the interviewees responded that sufficient funds were available to make their systems
        changes.
    •   Several associated and necessary system activities and services (training, dashboard
        development, data integration services, and behind-the-scenes wiring) were paid for within the
        Working Capital Fund.

Use Existing Staff: All of the EPA respondents stated that their program offices used existing Full Time
Equivalents (FTEs) to support the systems development and enhancement efforts needed for ARRA. This
was essential to successful implementation because existing staff brought knowledge, expertise, and
established working relationships. The EPA respondents described the different ways in which their
offices used existing staff to accomplish the systems modifications in the required timeframe:

    •   One EPA program office awarded only ARRA grants in 2009, the first year of ARRA. Then this
        program office combined the 2009 and 2010 program funds and awarded both program and
        ARRA funds in 2010.
    •   Another EPA program office, which had multiple systems that needed to be modified, had a small
        core team that worked on their office's systems modifications. Existing staff managers took the
        leads for the modifications for their already-assigned systems.
    •   An EPA office, which is accustomed to shifting its focus based on the priorities of the
        administration, assigned approximately three existing FTEs to support ARRA. The minor staffing
        changes did not affect other activities in this office.
    •   Another EPA program office was able to easily and quickly update its system because of the close
        relationships among the existing staff that needed to work together. These same EPA staff were
        involved in all aspects of the system modifications including the formulation, control and
        execution.

Although most interviewees found this to be a successful approach, the use of existing staff created an
associated challenge: staff working on ARRA implementation could not complete their regularly assigned
program responsibilities (see Increased Staff Workload in Section 3.1, Challenges).

Modify Existing Systems (rather than Create New Systems): Most EPA respondents stated that EPA's
success in meeting ARRA's timely obligation of funds and subsequent data management and reporting
requirements was largely due to EPA management's early decision to use existing program office data
systems rather than creating new ones. For example, one respondent stated that modifying existing
systems allowed the data to flow more quickly, while another stated that it is better not to start from
scratch. In addition, EPA program offices, Regions, and grantees were familiar with the existing systems
and related business processes, so guidance and training needs were reduced. One respondent indicated
that the main challenge with modifying existing systems arises during the testing stages, because testing
is required to ensure that none of the underlying data relationships are damaged. Although this is a
challenge associated with modifying existing systems, the respondent agreed that there are many more
challenges associated with developing an entirely new system.
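
The testing concern above is essentially about preserving the relationships between records when an
existing system is modified. A minimal sketch of one such check is shown below; the two-table layout is
illustrative and is not an actual EPA schema.

    # Sketch: after modifying an existing system, confirm that a basic data relationship
    # still holds (every award row references an existing project). Illustrative layout only.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE project (project_id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE award (award_id INTEGER PRIMARY KEY, project_id INTEGER, arra_flag TEXT);
        INSERT INTO project VALUES (1, 'Water infrastructure upgrade');
        INSERT INTO award VALUES (100, 1, 'Y');  -- valid reference
        INSERT INTO award VALUES (101, 2, 'Y');  -- orphaned: project 2 does not exist
    """)

    orphans = conn.execute("""
        SELECT a.award_id FROM award a
        LEFT JOIN project p ON p.project_id = a.project_id
        WHERE p.project_id IS NULL
    """).fetchall()

    print("Orphaned awards:", orphans)  # [(101,)] -> a data relationship was damaged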

Many EPA respondents specifically mentioned the benefits of using the existing Executive Management
Dashboard to create internal and public ARRA reports. The reports facilitated quality review of the data by
the Regions and Program Offices and simplified  reporting to the RAT Board.
               "[The] electronic dashboard and standardized reports for senior
               management and analysts to track Recovery Act obligations,
               expenditures and recipient reported information...enabled the agency
               to maintain obligation/outlay progress, improve data quality of
               recipient information and keep non-reporting to a minimum."

                Quoted from EPA New Ways Evaluation Responses, October 2010
Use Existing Contractors: Seven respondents stated that they used existing contracts/contractors to
modify their systems to meet ARRA requirements. They cited reasons why modifying their existing data
systems using existing contract vehicles was successful:
    •   A longstanding contract was already in place and the contractor had knowledge of the systems
        so they could modify them more quickly.
    •   EPA could simply increase funding and modify an existing contract (as long as the contract had
        capacity) to perform the work versus going through a time-consuming bid and proposal process.

Maintain Communication and Encourage Collaboration: Several EPA respondents described the initial
efforts that they felt contributed to implementing the systems modifications. Many respondents
participated in frequent and numerous meetings to discuss topics such as the data to collect, ways to
display the data, and development of performance measures. The various efforts described by the EPA
respondents included the following:

    •   One EPA program office was very effective in working with its Congressional counterparts. The
        office worked with Congressional staff ahead of the curve to ensure a common understanding of
        the information management needs that would be in the Act.
    •   Several EPA respondents felt that the frequent internal meetings, especially at the beginning of
        the process, were very useful. Responses from the EPA staff described the meetings as follows:
            o   Meeting organizers showed vision and kept staff engaged.
            o   Meetings helped to develop relationships that continue and are a source of support
                when needed.
            o   Meetings provided a way for EPA to coordinate across offices with different goals and
                different information systems.
    •   Multiple respondents noted that coordination with other stakeholders was also important. One
        program office had previously established information coordinators in the Regions, which were
        critical in communicating with grantees throughout the process.

Set Clear Goals: Many respondents noted that the leadership shown by EPA management and staff
facilitated ARRA implementation. One respondent stated that it was important to have a clear goal and
that both EPA Headquarters and Regional staff stepped up. Another respondent noted that the overall
community is needed to make things happen and that EPA had experienced and dedicated staff who did
this. Another respondent commended management for rallying the troops.

Table 7 lists the successful approaches described by the EPA systems interviewees that the SAIC Team
heard consistently or that were specifically mentioned by an EPA interviewee. Figure 4 indicates how many
of the 14 total systems implemented each successful approach according to the EPA interviewees.

                   TABLE 7. SUMMARY OF SUCCESSFUL APPROACHES
    •   Use Existing Funding or a Simple Process for Obtaining Funding for Systems Modifications
    •   Use Existing Staff
    •   Modify Existing Systems (rather than Create New Systems)
    •   Use Existing Contractors
    •   Maintain Communication and Encourage Collaboration
    •   Set Clear Goals
     FIGURE 4. NUMBER OF SYSTEMS IMPLEMENTING SUCCESSFUL APPROACHES
                             ACCORDING TO INTERVIEWEES
[Bar chart omitted: for each successful approach listed in Table 7, the chart shows the number of
implementing systems (x-axis: Number of Implementing Systems, scale 0 to 14).]
3.3     RECOMMENDATIONS/LESSONS LEARNED
The SAIC Team asked the EPA respondents if they had any recommendations or lessons learned for
moving forward in the event that they would receive another large influx of funding with specified
requirements and deadlines. Below is a list of the recommendations from the EPA interviews.

Sufficient Lead Time/Strategic Planning: Although the comments differed slightly, these two topics are
grouped together as one recommendation because sufficient lead time is needed to do appropriate
planning for identifying and implementing systems changes. One EPA respondent recommended that, if
EPA were to once again receive an infusion of funding, the Agency first conduct strategic assessment and
planning before jumping into determining and implementing the systems modifications (see the
corresponding challenge, Time Constraints to Implement Changes, in Section 3.1).

Data Requirements in Place: One EPA respondent strongly recommended that the responsible entities at
all levels, from the highest management level to the lowest level (individual EPA offices), define the
systems requirements before starting the effort and not change them during implementation (see the
corresponding challenges, Unclear and Evolving RAT Board and OMB Guidance and Data Requirements
Not in Place with Sufficient Lead Time, in Section 3.1).

Planning Process/Use Different Tools: Associated with the Strategic Planning recommendation, two EPA
respondents stated that they would have used different data system tools if given the time to review and
analyze. Further, one respondent noted that if EPA were to receive funding similar to the ARRA stimulus
funds in the future, EPA would have an easier time managing the financial data using the Agency's new
Compass financial system, as the Compass tool has improved reporting capabilities. Another EPA
respondent stated that he would pick a different platform for the Business Intelligence Tool (i.e., Oracle
Business), use a SharePoint dashboard, and use Microsoft Excel for statistical presentations.

3.4     ESTABLISHED CHANGES MOVING FORWARD
During interviews, respondents described the process and systems changes they made as a result of the
ARRA program and that they plan to implement and maintain on a permanent basis. These process and
systems changes are described in detail below.

Established Process for System Modifications: The ARRA experience helped one EPA program office
establish a systematic process for updating its system, which the office now follows. System users submit
their issues, which are documented; the office then reviews, ranks, and categorizes the issues based on
priority or functional similarity and uses this analysis to make deliberate choices about system
enhancements. The issues are presented to the contractor in purposeful bundles, resulting in more
streamlined releases of system updates.
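
A minimal sketch of the ranking-and-bundling step described above is shown below; the issue categories,
priority scale, and example issues are hypothetical.

    # Sketch: rank user-submitted system issues and bundle them by functional area before
    # presenting them to the contractor. Categories, priorities, and issues are hypothetical.
    from collections import defaultdict

    issues = [
        {"id": 1, "area": "reporting",  "priority": 1, "summary": "Quarterly report totals wrong"},
        {"id": 2, "area": "data entry", "priority": 3, "summary": "Typo on entry screen"},
        {"id": 3, "area": "reporting",  "priority": 2, "summary": "Add ARRA flag to export"},
    ]

    bundles = defaultdict(list)
    for issue in sorted(issues, key=lambda i: i["priority"]):  # 1 = highest priority
        bundles[issue["area"]].append(issue)

    for area, items in bundles.items():
        print("Release bundle '%s':" % area, [i["id"] for i in items])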

Established Process for State Data Entry: One EPA program office had been trying for several years to
give states access to its data system so that they could enter their own data directly. When the
opportunity arose through the ARRA funding, the program created a web-based system, designed mainly
for the ARRA program, into which the states can now enter their data. The states are also responsible for
performing quality assurance review of their data, and the EPA Regions oversee what the states are
reporting; because of this change, EPA now receives the data more quickly. In addition, the program office
designed the data system so that program office staff can open and close the window during which data
are to be entered, controlling the period when the states can enter their data.
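
The sketch below illustrates the data-entry window control described above, in which program office staff
decide when states may enter data; the dates and structure are hypothetical.

    # Sketch: program-office control of the state data-entry window. Dates are hypothetical.
    from datetime import date

    ENTRY_WINDOW = {"opens": date(2010, 1, 1), "closes": date(2010, 1, 10)}

    def entry_allowed(today):
        """States may enter data only while the program office holds the window open."""
        return ENTRY_WINDOW["opens"] <= today <= ENTRY_WINDOW["closes"]

    print(entry_allowed(date(2010, 1, 5)))   # True  - window open, state can submit
    print(entry_allowed(date(2010, 1, 15)))  # False - window closed by the program office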

Established Funding Recommendation Template: One EPA office respondent stated that they are re-
inventing one of the products that they implemented for the ARRA funds. Prior to ARRA, when an EPA
program office issued a solicitation and made its selection for an award, EPA staff prepared a funding
package to submit to the EPA grants office. The package included a form justifying the decision and the
commitment of funds; the funding package was lengthy and referred to policy and other types of in-depth
information. For ARRA, the EPA office developed a template for each of the ARRA programs that
generated responses for many questions on the funding recommendation form. This shortened the time
required for EPA staff to complete the form, review the data, and distribute funding. The EPA office
currently uses this funding recommendation template for several EPA programs.
           "Program offices  worked  with legal counsel to  develop standard
           program   funding  recommendation  templates.   EPA's   Grants
           organization posted the templates on the Agency's electronic grants
           management system for programs to access electronically and entered
           the  specific award information  for processing  -  this improved
           processing time."

               Quoted from EPA New Ways Evaluation Responses, October 2010
Improved Accounting Consistency: ARRA resulted in system changes that ensure accounting consistency
between multiple funds management systems, especially in tracing unliquidated obligations (e.g., funds
that remain unused after a project is complete). The system changes go beyond ensuring accounting
consistency for ARRA funds and provide greater consistency for a broader range of EPA grants.
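
As a simple illustration of the consistency being checked, the sketch below computes each grant's
unliquidated balance (obligations minus disbursements) in two systems and flags disagreements; the grant
number and record layout are hypothetical.

    # Sketch: compute unliquidated obligations (obligated minus disbursed) per grant and
    # flag grants where two funds-management systems disagree. Grant numbers are hypothetical.
    def unliquidated(records):
        """records: {grant_id: {"obligated": x, "disbursed": y}} -> remaining balance per grant."""
        return {g: r["obligated"] - r["disbursed"] for g, r in records.items()}

    system_a = {"2W-96300001": {"obligated": 1000000.00, "disbursed": 960000.00}}
    system_b = {"2W-96300001": {"obligated": 1000000.00, "disbursed": 940000.00}}

    balances_a, balances_b = unliquidated(system_a), unliquidated(system_b)
    for grant in balances_a.keys() | balances_b.keys():
        if balances_a.get(grant) != balances_b.get(grant):
            print(grant, "unliquidated balance differs:", balances_a.get(grant), "vs", balances_b.get(grant))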

Expanded Systems Capabilities: During ARRA, one program system added the ability to easily store both
estimated and actual environmental progress data based on a grantee's original work plan and final
report. This ability has improved EPA's reporting and has supported the writing of a Report to Congress.

In addition to these changes, which EPA offices made permanent, three respondents noted that ARRA
brought an increased focus on the geospatial elements of data reporting. One EPA program focused
considerable effort on translating its financial, emissions, and environmental benefits data into geospatial
presentations. This effort supported EPA's GeoGrants Initiative, in which the EPA program serves as a key
data provider. The EPA respondent stated that this effort and the resulting database modifications would
not have been attainable without ARRA funds. Figure 5 shows an example map of EPA's publicly available
geospatial data.
    FIGURE 5. MAP ILLUSTRATING EPA'S PUBLICLY-AVAILABLE  GEOSPATIAL DATA
[Map graphic omitted: screenshot of EPA's publicly available geospatial data, showing mapped layers such
as National Emissions (AIRS/AFS) and Clean Diesel.]

3.5     SAIC OBSERVATIONS

Although the existing staff had experience and knowledge of what needed to be accomplished, convening
multi-disciplinary, multi-organizational staff working groups to solve particular challenges allowed the
separate EPA offices to more quickly and effectively manage major new requirements.

However, the aggressive implementation schedule resulted in a lack of strategic assessment regarding
necessary system outputs and ways to implement those outputs. The focus on ensuring that RAT Board
requirements were followed prevented organizations from looking at their own data needs for longer-
term program assessment. Collaborative involvement of program offices and OCFO in determining data
requirements and funding requirements could have led to more robust systems changes with more lasting
programmatic impacts.
EPA. "EPA Procedure for Review of ARRA Section 1512 Recipient Reported Information." September
   2010a.

EPA. "EPA New Ways Evaluation Responses." EPA Stimulus Tracking and Reporting Subcommittee,
   Washington, D.C. October 2010b.

EPA. "EPA Effectively Reviewed Recovery Act Recipient Data but Opportunities for Improvement Exist."
   EPA Report No. 10-R-0234. USEPA Office of Inspector General, Washington, D.C. September 27, 2010c.

EPA. "The American Recovery and Reinvestment Act of 2009, Environmental Protection Agency,
   Reference Guide for Recipient Reporting." Revised September 2010d.

EPA. "EPA Recovery Act Recipient Reporting and Data Review Process." EPA Report No. 10-R-0020. USEPA
   Office of Inspector General, Washington D.C. October 29, 2009.

EPA. "EPA Recovery Act Systems Development Requirements Descriptions and Explanation of Proposed IT
   Systems." Office of Air and Radiation. 2009.

EPA. "EPA Recovery Act Systems Development Requirements Descriptions and Explanation of Proposed IT
   Systems." Office of the Chief Financial Officer. 2009.

EPA. "EPA Recovery Act Systems Development Requirements Descriptions and Explanation of Proposed IT
   Systems." Office of the Chief Financial Officer/Office of Planning, Analysis and Accountability and
   Office of Budget. 2009.

EPA. "EPA Recovery Act Systems Development Requirements Descriptions and Explanation of Proposed IT
   Systems." Office of Environmental Information. 2009.

EPA. "EPA Recovery Act Systems Development Requirements Descriptions and Explanation of Proposed IT
   Systems." Office of Underground Storage Tanks. 2009.

EPA. "EPA Recovery Act Systems Development Requirements Descriptions and Explanation of Proposed IT
   Systems." Office of Water. 2009.

EPA. "EPA Recovery Act Systems Development Requirements Descriptions and Explanation of Proposed IT
   Systems." Superfund (Office of Superfund Remediation and Technology Innovation). 2009.

EPA. "EPA Recovery Act Systems Development Requirements Financial Summary of Proposed Systems."
   Office of Air and Radiation. 2009.

EPA. "EPA Recovery Act Systems Development Requirements Financial Summary of Proposed Systems."
   Office of the Chief Financial Officer and Office of the Chief Financial Officer - Office of Planning,
   Analysis and Accountability/Office of Budget. 2009.

EPA. "EPA Recovery Act Systems Development Requirements Financial Summary of Proposed Systems."
   Office of Environmental Information. 2009.

EPA. "EPA Recovery Act Systems Development Requirements Financial Summary of Proposed Systems."
   Office of Underground Storage Tanks. 2009.
EPA. "EPA Recovery Act Systems Development Requirements Financial Summary of Proposed Systems."
   Superfund. 2009.

EPA. "EPA Recovery Act Systems Development Funding Form." Office of Air and Radiation. 2009.

EPA. "EPA Recovery Act Systems Development Funding Form." Office of Water. 2009.

APPENDIX 1: SYSTEMS ENHANCEMENT AND DEVELOPMENT EFFORTS
TO EPA INFORMATION SYSTEMS SUPPORTING ARRA

NAME OF SYSTEM / DESCRIPTION OF SYSTEM MODIFICATION OR DEVELOPMENT EFFORTS

Office of Water Systems

Drinking Water State Revolving Fund (Project and Benefits Reporting) (PBR) System
Clean Water State Revolving Fund Benefits Reporting (CBR) System
    •   Modification of the CWSRF and DWSRF applications to meet final ARRA reporting standards.
    •   Creation of a spreadsheet and XML 1512 generator to leverage already-reported data to
        facilitate recipient reporting.
    •   Creation of an Office of Water (OW) Reporting Data Mart.
    •   Creation of linkages between the CWSRF and DWSRF applications and the OW Reporting Data
        Mart.
    •   Creation of a linkage between the OW Business Intelligence data mart and the OW web
        publishing database to facilitate transparency.
Office of Environmental Information (OEI) System

Central Data Exchange (CDX)
    •   Systems development includes building out an infrastructure for Really Simple Syndication (RSS)
        and ATOM (an enhanced RSS) feeds to support existing OMB reporting requirements, developing
        workflow capabilities to emulate the recipient reporting process, facilitating the proof of concept,
        and documenting the results.
    •   OEI will also support integration of shared services with EPA program office and financial grant
        reporting systems to support greater automation of reporting to Recovery.gov.
Office of Air and Radiation (OAR) System

Diesel Emissions Reduction Program Database for Reporting Innovative Vehicle Emission Reductions
(DRIVER)
    •   The DERA program created DRIVER to interact with the Recoveryreporting.gov system (view
        reports, certify recipient data, run reports) to ensure timely review of recipient reports. DRIVER
        can consume the XML format for viewing and aggregating recipient data to accommodate a more
        thorough and expeditious review.
Office of Superfund Remediation and Technology Innovation (OSRTI) Systems

Comprehensive Environmental Response, Compensation and Liability Information System (CERCLIS)
    •   CERCLIS required the addition of program priority flags for ARRA, new ARRA budget accounting
        codes, and modified action codes to designate ARRA sites, the funding for ARRA sites, and ARRA
        activities within these sites. Development of a web module for the Regions to input, transmit,
        and store ARRA-specific site progress information into CERCLIS, including review of this
        information by headquarters staff, was also required.

eFacts
    •   e-Facts reports were required in order to provide information about the eight program measures
        EPA proposed to report for the Superfund program ARRA activities, including six new measures
        for ARRA purposes and two measures based on existing GPRA measures. The report logic is
        available for viewing for quality control purposes, and report archiving is envisioned to allow
        these reports on program measures to be available for discrete time intervals to assess ARRA
        progress.
Office of Underground Storage Tanks (OUST) System

Leaking Underground Storage Tanks (LUST4)
    •   Creation of a set of database tables in the Agency's standard database management system,
        Oracle Database; creation of extraction, transformation, and load (ETL) specifications for use with
        the Agency's standard ETL tool, Informatica PowerCenter, to populate the LUST4 database from
        data previously entered in the LUST3 legacy application; development of specifications for the
        use of Oracle Application Express, a commercial-off-the-shelf (COTS) component of the Oracle
        Database, to provide data entry capability over the Agency's intranet; and development of
        specifications for the use of the Oracle Business Intelligence tool, provided under the Agency's
        Working Capital Fund, to perform reporting and presentation of the data.
Office of Chief Financial Officer (OCFO) Systems

OCFO Reporting and Business Intelligence Tool (ORBIT), Executive Management Dashboard
    •   Using the existing ORBIT system, EPA developed reports to meet OMB's reporting requirements,
        quality assurance reports, and performance measures reports. ORBIT was also modified to
        provide data for offices to use to double-check recipient reporting. (Note: EPA converted to a
        new financial system on October 1, 2011.)

Annual Commitment System (ACS) and Budget and Accounting System (BAS)
    •   Proposed systems developments and enhancements include modifying existing ACS reports to
        include the ARRA flag; developing spreadsheets/pull reports and QA/QC data and reports;
        supporting importation of data from program systems (e.g., PAT); including data quality
        information and context in ACS; and providing support for ARRA-related performance reporting
        in the budget and Performance Accountability Report (e.g., creation of new tabs or reports).