U.S. ENVIRONMENTAL PROTECTION AGENCY
OFFICE OF INSPECTOR GENERAL

Quality Control Review of
EPA Office of
Inspector General Reports
Issued in Fiscal Year 2013
Report No. 14-N-0358
September 25, 2014

Report Contributors:	Kevin Chaffin
Carolyn J. Hicks
John Trefry
Abbreviations

EPA	U.S. Environmental Protection Agency
FY	Fiscal Year
GAGAS	Generally Accepted Government Auditing Standards
OA	Office of Audit
OIG	Office of Inspector General
OPE	Office of Program Evaluation
PLD	Product Line Director
PM	Project Manager
PMH	Project Management Handbook
Are you aware of fraud, waste or abuse in an
EPA program?
EPA Inspector General Hotline
1200 Pennsylvania Avenue, NW (2431T)
Washington, DC 20460
(888) 546-8740
(202) 566-2599 (fax)
OIG_Hotline@epa.gov
More information at www.epa.gov/oig/hotline.html.
EPA, Office of Inspector General
1200 Pennsylvania Avenue, NW (2410T)
Washington, DC 20460
(202) 566-2391
www.epa.gov/oig
Subscribe to our Email Updates
Follow us on Twitter @EPAoig
Send us your Report Suggestions

U.S. Environmental Protection Agency
Office of Inspector General
At a Glance
14-N-0358
September 25, 2014
Why We Did This Review
The purpose of this review was
to report on compliance with
the set of criteria the Office of
Inspector General (OIG) of the
U.S. Environmental Protection
Agency (EPA) uses to ensure
quality in reports issued by its
Office of Audit and Office of
Program Evaluation for
consistency with generally
accepted government auditing
standards. We also sought to
assess any trends or issues
related to possible non-
compliance with quality
standards and identify areas in
which quality processes can be
improved.
This report addresses the
following EPA goal or
cross-agency strategy:
• Embracing EPA as a high-
performing organization.
Quality Control Review of EPA Office of
Inspector General Reports Issued in Fiscal Year 2013
What We Found
Monitoring of quality controls is an ongoing, periodic assessment of work to
ensure compliance with the OIG's system of quality control.

During fiscal year 2013, the OIG continued to
make improvements regarding documentation of
workpaper reviews. Supervisory reviews were
better documented, including the supporting
workpapers for the draft and final reports. In
addition, staff are responding to the Product Line
Director and Project Manager comments, and
clearance by the Product Line Director/Project Manager is documented in the
review sheets and notes.
Nonetheless, we noted the following areas where improvements should be
made:
•	Workpapers should not be unnecessarily lengthy.
•	Indexing should be updated at various stages.
•	Use of draft agency documents should be better managed and attributed.
•	Dates used to define the scope of work should be more standardized.
Suggestions for Improvement
We suggest that the OIG reinforce to staff the Project Management Handbook
requirements to:
•	Include as part of the workpaper preparation and review processes that
each workpaper addresses only one audit or evaluation step or sub-step.
•	Include the proper elements on indexing.
•	Specify that reports should clearly attribute draft sources and verify that
the sources contain the most up-to-date information.
•	Properly report the beginning and end dates for all reports.
Send all inquiries to our public
affairs office at (202) 566-2391
or visit www.epa.gov/oig.
The full report is at:
www.epa.gov/oig/reports/2014/
20140925-14-N-0358.pdf

UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460

OFFICE OF INSPECTOR GENERAL
September 25, 2014
MEMORANDUM

SUBJECT:	Quality Control Review of EPA Office of Inspector General Reports
		Issued in Fiscal Year 2013
		Report No. 14-N-0358

FROM:		Aracely Nunez-Mattocks, Chief of Staff

TO:		Charles Sheehan, Deputy Inspector General
This is our report on assessing adherence to quality control elements in fiscal year 2013 reports issued by
the U.S. Environmental Protection Agency (EPA) Office of Inspector General (OIG) in terms of
compliance with generally accepted government auditing standards (GAGAS). This report covers
reports issued by the OIG's Office of Audit and Office of Program Evaluation.
This report, as with prior quality control review reports, offers observations and suggestions for
improvement that will enhance and strengthen the OIG's project execution processes and provide
opportunities for improving adherence to quality control elements within the OIG. The reports scored
during this fiscal year 2013 review are included in appendices A through D. The focus of this report was
on quality control elements of Planning (Preliminary Research), Field Work, Evidence, Supervision and
Reporting (Timeliness and Readability).
cc: Kevin Christensen, Acting Assistant Inspector General for Audit
Carolyn Copper, Assistant Inspector General for Program Evaluation

[ ] Deputy Inspector General Agrees with Suggestions for Improvement

[ ] Deputy Inspector General Disagrees with Suggestions for Improvement

Quality Control Review of EPA Office of Inspector General
Reports Issued in Fiscal Year 2013
14-N-0358
Table of Contents

Chapters

1	Introduction
	Purpose
	Background
	Measuring Adherence to Quality Control Elements of OIG Reports
	Scope and Methodology
	Scoring the Results

2	Notable Improvements Made, But Further Opportunities Exist
	Many Improvements Made Since Last Quality Assurance Review
	Additional Opportunities for Improvement Exist
	Workpaper Preparation
	Report Indexing
	Use of Draft Agency Documents
	Scope of Work
	Other Considerations

Appendices

A	OIG Reports Reviewed With Project Quality Scorecards - FY 2013
B	OIG Project Quality Scorecard Results - FY 2013
C	OIG Reports Reviewed With CMR - FY 2013
D	OIG CMR Results - FY 2013

Chapter 1
Introduction
Purpose
The purpose of this review is to report on compliance with the set of criteria the
Office of Inspector General (OIG) of the U.S. Environmental Protection Agency
(EPA) uses to measure adherence to quality control elements in reports issued by
its Office of Audit (OA) and Office of Program Evaluation (OPE) for consistency
with generally accepted government auditing standards (GAGAS). We also
sought to assess any trends or issues related to possible noncompliance with quality
standards and identify areas in which quality processes can be improved. OIG quality
control criteria were applied to 40 OA and 22 OPE reports issued from October 1, 2012,
through September 30, 2013.[1]
Background
The Inspector General Act of 1978, as amended by the Inspector General Reform
Act of 2008, requires that federal Inspectors General comply with standards
established by the Comptroller General of the United States for audits of federal
establishments, organizations, programs, activities and functions. The EPA OIG
conducts audits and evaluations in accordance with GAGAS and maintains a
system of quality controls to provide the audit organization with reasonable
assurance that the organization's products and services, and its personnel, comply
with professional standards and applicable legal and regulatory requirements.
In our quality assurance report Measuring the Quality of Inspector General
Reports Issued in Fiscal Years 2008 and 2009 (Report No. 10-N-0134, issued
June 2, 2010), we concluded that several recommendations from a February 2008
quality assurance report had been implemented and helped to improve the quality
of reports and work processes.
Prior quality assurance reports also include Assessing the Quality of the
Independent Referencing Process During Fiscal Year 2011 (Report No. 12-N-
0416, issued April 19, 2012) and Analysis of Office of Inspector General Policies
and Procedures Addressing CIGIE Quality Standards (Report No. 12-N-0516,
issued June 4, 2012). These reports identified various recommendations for
improvement to ensure the OIG policies and procedures are current based on the
expected review date and improvements to our referencing process that focus on
consistency and timeliness. With this current quality assurance report, the OIG has
resumed annual reporting on systemic issues identified during referencing, along
with suggestions for improvement.
[1] There were 45 OA and 23 OPE reports issued during FY 2013, but only 40 OA and 22 OPE reports were scored.
In November 2012, the quality assurance staff, previously located in the
Immediate Office of Inspector General, transferred to the OA and OPE. The
quality assurance staff (known organizationally as the independent referencers)
report to the Deputy Assistant Inspector General and the Assistant Inspector
General in their respective offices. The Office of the Chief of Staff has a Planning
and Quality Assurance Lead who is responsible for coordinating with the referencers,
reporting on systemic issues identified, and serving as the OIG liaison during
external peer reviews.
Measuring Adherence to Quality Control Elements of OIG Reports
As noted in the Government Auditing Standards (December 2011), an
"... audit organization should analyze and summarize the results of its monitoring
processes at least annually, with identification of any systemic issues needing
improvement, along with recommendations for corrective action."
A measuring process should provide a mechanism to evaluate individual products
against specific quality criteria. The measuring process should also present the
information in a manner that, over time, will allow the OIG to assess trends in
adherence to quality control elements so that necessary adjustments can be made
to policies, procedures and activities. In December 2012, the Inspector General
signed the revised OIG Policy and Procedure 101, OIG Project Management
Handbook (PMH). The PMH is the EPA OIG's guidebook for complying with
the Inspector General Act of 1978, as amended, and the Government Auditing
Standards.
The quality control standards used in this project were:
•	Documentary reliability of evidence.
•	Supervisory reviews of workpapers.
•	Readability of reports.
•	The December 2012 PMH Revision.
With the revision of the PMH in December 2012, two evaluation forms were used
to measure and score the above characteristics: the Quality Scorecard and the
Compliance Monitoring Review (CMR). Projects started prior to January 30,
2013, were scored with the quality scorecard. Projects initiated after January 30,
2013, were scored with the CMR. The reports scored with the project quality
scorecard are listed in appendix A, and the specific manner in which we calculated
points for the project quality scorecard is shown in appendix B. The projects
scored using the CMR are listed in appendix C, and the specific manner in
which we calculated points for the CMR is shown in appendix D.
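To make that selection rule concrete, the sketch below (ours, for illustration; the function name and the treatment of the cutoff date itself, which the text does not specify, are assumptions) shows how a project's start date maps to the evaluation form:

```python
from datetime import date

# January 30, 2013, cutoff described in the text above.
PMH_CUTOFF = date(2013, 1, 30)

def evaluation_form(project_start: date) -> str:
    """Return the form used to score a project.

    Projects started prior to January 30, 2013, were scored with the
    quality scorecard; projects initiated after that date, with the CMR.
    (Hypothetical helper; handling of the cutoff date itself is assumed.)
    """
    return "Quality Scorecard" if project_start < PMH_CUTOFF else "CMR"

assert evaluation_form(date(2012, 11, 15)) == "Quality Scorecard"
assert evaluation_form(date(2013, 3, 1)) == "CMR"
```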
The project quality scorecard reflects the OIG's process for monitoring OIG
products' adherence to most, but not all, of GAGAS. This process is part of the
OIG's overall quality control system. All OIG audits, program evaluations and
other reviews are conducted in accordance with GAGAS unless otherwise noted.
The PMH is the OIG's guide for conducting all reviews in accordance with most,
but not all, of GAGAS and other professional standards.
The scoring process encompasses an evaluation of activities from the start of
preliminary research (the "kickoff" meeting) to the point that an OIG team
submits a draft report to the OIG's Office of Congressional and Public Affairs for
edit. The process includes a measurement for report communication that
encompasses the readability, completeness, conciseness and presentation of draft
reports.
The project quality scorecard and CMR do not examine compliance with the
General Standards such as independence, professional judgment, competence and
adherence to Continuing Professional Education requirements. In addition, the
project quality scorecard and CMR exclude confirmation of
compliance with the sections on Recommendations, Reporting Views of
Responsible Officials, and Reporting Confidential and Sensitive Information
under the Reporting Standards for Performance Audits.
The scoring categories associated with the quality scorecard are:
Planning	3 points
Field Work	4 points
Evidence	4 points
Supervision	5 points
Draft Report Preparation and Timeliness 8 points
Report Communication	9 points
The categories associated with CMR are:
Planning and Execution	15 points
Evidence	20 points
Supervision	30 points
Reporting	20 points
Post Report/Data Accuracy	15 points
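As a quick arithmetic check (ours, not part of the OIG's methodology), the category maxima above sum to the maximum achievable scores cited under "Scoring the Results" below: 33 points for the quality scorecard and 100 points for the CMR.

```python
# Category maxima transcribed from the two lists above.
scorecard_max = {
    "Planning": 3,
    "Field Work": 4,
    "Evidence": 4,
    "Supervision": 5,
    "Draft Report Preparation and Timeliness": 8,
    "Report Communication": 9,
}
cmr_max = {
    "Planning and Execution": 15,
    "Evidence": 20,
    "Supervision": 30,
    "Reporting": 20,
    "Post Report/Data Accuracy": 15,
}

assert sum(scorecard_max.values()) == 33   # maximum quality scorecard score
assert sum(cmr_max.values()) == 100        # maximum CMR score
```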
Quality should also be viewed from the perspective of the customer, client or
stakeholder. A report that complies with GAGAS and receives an excellent score
may not necessarily be useful to the customers, clients or stakeholders. Section
1.05 of GAGAS says that "Audits performed in accordance with GAGAS provide
information used for oversight, accountability, transparency and improvements of
government programs and operations."
Currently, there is no method of validating the scoring of reports, such as testing
the correlation between the total score and the perceived value and effectiveness
of an audit or evaluation. We suggest that the quality of the scoring itself be
validated by factors of report usability such as the percentage of recommendations
or dollar amounts sustained or acted upon. If in fact reports are properly planned
with supportable findings and actionable recommendations, the ultimate evidence
of the report quality will be in the recognized value and usability of the report in
accomplishing the objectives.
Scope and Methodology
We reviewed cost and time data stored in the Inspector General Enterprise
Management System (known as "IGEMS") for each of the OIG audit and
evaluation projects that were scored for quality. We then reviewed the
assignment workpapers in the OIG's AutoAudit® workpaper systems and the
final reports using the scoring form. During the scoring process, we also contacted
supervisors as needed on each assignment to obtain additional information. The
scoring form measured each assignment as to Planning (Preliminary Research),
Field Work, Evidence, Supervision, and Reporting (Timeliness and Readability).
The work performed in this review does not constitute an audit conducted in
accordance with generally accepted government auditing standards issued by the
Comptroller General of the United States. We believe these scorecards can be
applied to all OIG assignments conducted in accordance with GAGAS. The
scorecards should allow for enough variety in quality measurement to
cover all of our work. However, the limitations of the scorecard in relation to the
full spectrum of GAGAS should be noted.
Our scope covered final GAGAS-compliant reports issued by OA and OPE from
October 1, 2012, to September 30, 2013, that were reviewed and scored by the
OIG's quality assurance staff. We did not include reports for which the work was
performed by external auditors.
Scoring the Results
The total quality scores are shown in appendices B and D. Each total quality score
measures project and report quality characteristics, including Planning
(Preliminary Research), Field Work, Evidence, Supervision, and Reporting
(Timeliness and Readability). For the scorecard, the maximum number of points
achievable for a draft report issued to the agency is 33 points. For the CMR, the
maximum number of points achievable is 100.
During fiscal year (FY) 2013, the Supervision quality characteristics in the OIG
project management scorecard remained similar to the quality characteristics
identified during FYs 2008 and 2009. The average total project score for FY 2013
was 31.6 points for quality scorecards and 93.0 points for CMRs. The average
project quality scorecard scores for Supervision and Evidence during FY 2013
were 4.8 and 3.6, respectively. The average CMR ratings for Supervision and
Evidence during FY 2013 were 28.6 and 18.5, respectively.
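These averages can be reproduced from the totals reported in appendices B and D; a minimal check (ours), using the Total and No. of Reports rows from those appendices:

```python
# Appendix B: 55 reports scored, total assignment score of 1736.0.
# Appendix D: 7 reports scored, total compliance review score of 651.1.
print(round(1736.0 / 55, 1))  # 31.6 -- average quality scorecard total
print(round(651.1 / 7, 1))    # 93.0 -- average CMR total
```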
Product Line Directors (PLDs), for example, routinely documented their approval
of the project guide prior to the entrance conference. This represents their
approval of the project's objectives and scope and methodology. Supervisors also
approved their team members' workpapers within 30 days of staff completion.
The OIG teams used the discussion document process and held meetings with
agency management and staff to discuss the reports, ensure accuracy and tone,
and present proposed recommendations. The 40 OA and 22 OPE reports scored in
FY 2013 contained 265 recommendations made to the agency, and the agency had
accepted 101 of those recommendations (38 percent) as of the final report dates.
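The 38 percent figure follows directly from those counts; a one-line check (ours):

```python
accepted, total = 101, 265
print(f"{accepted / total:.0%}")  # prints "38%" -- recommendations accepted as of the final report dates
```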
Chapter 2
Notable Improvements Made,
But Further Opportunities Exist
During FY 2013, the OIG continued to make improvements regarding
documentation of workpaper reviews. Supervisory reviews were better documented
and included the supporting workpapers for the draft and final reports. In addition,
staff are responding to the PLD and Project Manager (PM) comments, and
clearance by the PLD/PM is documented in the review sheets and notes.
Many Improvements Made Since Last Quality Assurance Review
Since the last quality assurance review issued on June 2, 2010 (Report No.
10-N-0134), which covered issues regarding FYs 2008 and 2009 reports, the OIG
included all recommendations from that report in the 2012 PMH revision. A
process was also put in place to capture interim updates to the PMH occurring
between formal revisions.
During FY 2013, there were noticeable improvements regarding documentation of
workpaper reviews. Supervisory reviews were better documented, and the
comments were retained in the workpapers as either a master list or via comment
sheets. The supervisory reviews were timelier, as required by the PMH, which
requires that workpapers be reviewed monthly. Only three of 55 reports scored
using the project quality scorecard in FY 2013 had a supervision score below
4.0. The average quality scorecard score for supervision
was 4.8 in FY 2013. Only one of the seven reports scored using the CMR in
FY 2013 had a supervision score below 26.0. The average CMR
score for supervision was 28.6 in FY 2013. Overall, the quality scores for
supervision have improved since our assessment during the quality assurance
review report issued during FY 2010. The quality improvement measures instilled
in the audit and evaluation process correlate directly with higher-quality
OIG reports.
Additional Opportunities for Improvement Exist
Despite the improvements discussed above, we noted the following areas where
further improvements should be made:
•	Workpapers should not be unnecessarily lengthy.
•	Indexing should be updated at various stages.
•	Use of draft agency documents should be better managed and attributed.
•	Dates used to define the scope of work should be more standardized.
Workpaper Preparation
Although improvements have been made since our prior quality assurance
review, one area that continues to need attention is maintaining workpapers of
reasonable length. Some workpapers continue to contain the results of more than one
audit or evaluation step or sub-step. They include multiple interviews, emails,
documents and analyses. This has a negative impact on the timeliness of
independent referencing and supervisory review. Workpapers should not be so
lengthy that they impede an effective or timely review, and they should address a
specific audit or evaluation step or sub-step as identified in the audit guide.
Workpaper and audit documentation is an essential element of audit quality.
Workpapers should be clear, concise and easy to follow. Audit and evaluation
documentation must contain sufficient and appropriate evidence to support the
auditor's or evaluator's findings and recommendations in the audit or evaluation
report. When individual workpapers include multiple interviews, emails,
documents and analyses, they become very lengthy and/or overly complex.
As per GAGAS 6.82, audit documentation serves to (1) provide the principal
support for the auditors' report, (2) aid auditors in conducting and supervising the
audit, and (3) allow for the review of audit quality. Per PMH section 1.6, each
workpaper should be able to stand on its own and clearly convey the step being
addressed from the project guide. Summary workpapers contain a compilation of
information from individual audit documents.
Suggestion for Improvement 1: Reinforce to OIG staff the PMH requirement
to include as part of the workpaper preparation and review processes that each
workpaper be able to stand on its own and clearly convey the step being
addressed from the project guide. Upon request, provide training to OIG staff
and PLDs on workpaper preparation within the OIG, to include best practice
methods identified during our scoring processes.
Report Indexing
Report indexing has improved since it was reported on in the 2010 quality assurance
review. As per GAGAS and the PMH, auditors must obtain sufficient, appropriate
evidence to provide a reasonable basis for their findings and conclusions. GAGAS
states that the process of preparing and reviewing audit documentation should
allow for the review of audit quality. PMs and PLDs have directed their staffs to
more precisely index report statements to supporting documentation. Also, the
OIG plans to continue to reemphasize good indexing through training on an as-
needed basis.
However, during referencing of draft reports, indexes to supporting information
often concerned comments provided by the agency that pertained to the
discussion document. In some cases, no further audit work was conducted and the
suggested change by the agency was accepted by the team without any validation.
While the purpose of the discussion document is to facilitate discussion with the
auditee, changes by the auditee should be supported by appropriate documentary
evidence. Also, OIG conclusions or opinions are sometimes not included in the
audit workpapers but appear in the audit report with no indexing.
Insufficient indexing of summaries, finding outlines, and spreadsheets is also a
concern. In some cases, reports are indexed to summary workpapers or finding
outlines that are not cross-indexed to supporting workpapers. In other cases,
spreadsheets are not clearly cross-indexed to supporting documentation, or report
indexes do not refer to a specific location in a spreadsheet. Both issues result in
the need for additional time in referencing.
Suggestion for Improvement 2: Reinforce to OIG staff the PMH requirement
on indexing, specifically noting that: (1) OIG conclusions and opinions in the
discussion document and final reports, summaries and finding outlines must be
indexed to supporting audit workpapers that show the complete facts and
rationale for a conclusion or opinion; (2) spreadsheets must be cross-indexed to
supporting documentation; and (3) report indexes must refer to a specific
location in a spreadsheet.
Use of Draft Agency Documents
While this issue is no longer as prevalent as reported in the 2010 quality assurance
review, teams continue to use agency draft documents to support audit
conclusions without proper attribution. In some cases, teams use the draft
documents as support without further validating the information presented in the
OIG draft report to make it current. For example, one report relied on an EPA
document that had been identified as a draft for over 5 years, and the team did not
identify any updated document on hand from the agency. Audit teams should continue to
perform additional audit work as needed to determine whether the issues
identified in the agency's draft document are still valid and whether the document
was or would ever be published.
As per GAGAS 6.71(a), evidence is sufficient and appropriate when it provides a
reasonable basis for supporting the findings or conclusions within the context of
the audit objectives. The 2012 PMH was updated so that when indexing refers to
documentation marked "draft," the report text must clearly attribute the report
statements to the draft source document.
Suggestion for Improvement 3: Reinforce to OIG staff the PMH requirement
that attributed draft sources should be checked shortly before referencing and
submission of the draft report for comment to verify that the OIG report
contains the most up-to-date information.
Scope of Work
Teams continue to have problems associated with the consistent use of start and
end dates in reports when describing the scope of a project. Audit research, field
work and reporting are not distinct phases within the audit cycle and may overlap.
These phases are discussed in detail in the PMH. Per the PMH, for reporting
purposes and to better define the audit timeframes, the statement included in the
report describing the scope of work will commence with the preliminary research
kick-off meeting with the agency (or, if preliminary research is not conducted, the
entrance conference) and will end when the draft report is provided to the agency
for comment (or the discussion draft if a draft is not issued). However, teams did
not consistently use those dates.
As per GAGAS 6.09, the scope defines the subject matter that the auditors will
assess and report on, such as a particular program or aspect of a program, the
necessary documents or records, the period of time reviewed, and the locations
that will be included. The PMH was updated to inform teams of the correct
timeframe measures to be used.
Suggestion for Improvement 4: Reinforce to OIG staff the PMH requirement
that audit work is to be cited as beginning with the preliminary research kick-
off meeting or entrance conference, and ending on the date the draft report is
provided to the agency (or discussion draft, if no official draft is issued).
Other Considerations
Reinforce Importance of Effective Recommendations
Guidance in the PMH could also be improved to remind teams that effective
recommendations encourage improvements in the conduct of government
programs and operations in accordance with GAGAS 7.28. Because the OIG
evaluates and makes recommendations to the agency on the programs and
functioning of operations, we have a special responsibility to ensure that our
recommendations clearly state the actions recommended.
As per GAGAS 7.29, recommendations are effective when they are addressed to
parties that have the authority to act and when the recommended actions are
specific, practical, cost effective, and measurable. Actions on recommendations
are consistent with OIG strategic goals and provide a basis for effective followup.
Suggestion for Improvement 5: Submit an amendment to PMH section 4.6
stating that high-quality recommendations should be in accordance with GAGAS
7.28 and 7.29.
Appendix A
OIG Reports Reviewed With Project Quality Scorecards - FY 2013
Publication No. | Assignment No. | Title
13-P-0057 | OA-FY12-0333 | Status of Corrective Actions in Response to 2008 Report, "Framework for Developing Tribal Capacity Needed in Indian General Assistance Program"
13-P-0028 | OA-FY11-0024 | Improvements Needed in Estimating and Leveraging Cost Savings Across EPA
13-P-0161 | OPE-FY11-0010 | EPA Needs to Improve Air Emissions Data for the Oil and Natural Gas Production Sector
13-P-0163 | OA-FY12-0107 | EPA Is Not Recovering All Its Costs of the Lead-Based Paint Fees Program
13-P-0178 | OPE-FY11-0012 | Improvements Needed in EPA Training and Oversight for Risk Management Program Inspections
13-R-0092 | OA-FY12-0162 | American Recovery and Reinvestment Act Site Visit of Combined Sewer Overflow Detention Facility, City of Goshen, Indiana
13-P-0168 | OPE-FY12-0018 | Response to Congressional Request on EPA Enforcement
13-P-0127 | OPE-FY12-0010 | Congressionally Requested Information on EPA Utilization of Integrated Risk Information System
13-P-0176 | OPE-FY12-0012 | Results and Benefits Information Is Needed to Support Impacts of EPA's Superfund Removal Program
13-4-0153 | OA-FY12-0696 | OAM Request - Seagull Environmental
13-4-0116 | OA-FY12-0698 | Agreed-Upon Procedures Applied to Proposal Submitted Under EPA Solicitation No. SOL-HQ-12-00006 by Toeroek Associates, Inc., Lakewood, Colorado
13-4-0125 | OA-FY12-0712 | Agreed-Upon Procedures Applied to Proposal Submitted Under EPA Solicitation No. SOL-HQ-12-00005 by Advanced Environmental Management Group, Plymouth, Michigan
13-P-0209 | OPE-FY12-0008 | Opportunities for EPA-Wide Improvements Identified During Review of a Regional Time and Materials Contract
13-P-0167 | OPE-FY11-0021 | Efficiency of EPA's Rule Development Process Can Be Better Measured Through Improved Management and Information
13-P-0201 | OPE-FY12-0004 | The EPA Needs to Improve Management of Its School Environmental Health Efforts
13-P-0207 | OPE-FY12-0021 | Review of Hotline Complaint Regarding Residential Soil Contamination in Cherryvale, Kansas
13-P-0221 | OPE-FY10-0012 | Better Planning, Execution and Communication Could Have Reduced the Delays in Completing a Toxicity Assessment of the Libby, Montana, Superfund Site
13-P-0264 | OPE-FY12-0003 | EPA Oversight Addresses Thermal Variance and Cooling Water Permit Deficiencies But Needs to Address Compliance With Public Notice Requirements
13-P-0298 | OPE-FY11-0015 | Improved Information Could Better Enable EPA to Manage Electronic Waste and Enforce Regulations
13-P-0299 | OPE-FY12-0017 | Review of Hotline Complaint Concerning the Region 4 Environmental Justice Small Grants Selection Process
13-P-0317 | OPE-FY12-0013 | EPA's Handling of a Proposed Alternative Method for Measuring Oil and Grease in Wastewater Met Requirements But Controls Need to Be Strengthened
13-P-0349 | OPE-FY12-0006 | EPA Can Better Address Risks to the Security of the Nation's Drinking Water Through New Authorities, Plans, and Information
13-P-0356 | OPE-FY13-0007 | Public May Be Making Indoor Mold Cleanup Decisions Based on EPA Tool Developed Only for Research Applications
13-P-0370 | OPE-FY12-0024 | Limited Oil Spill Funding Since the Enbridge Spill Has Delayed Abandoned Oil Well Cleanups; Emergency Oil Responses Not Impacted
13-P-0364 | OPE-FY13-0017 | Quick Reaction Report: EPA Must Take Steps to Implement Requirements of Its Scientific Integrity Policy
13-P-0387 | OPE-FY12-0001 | EPA Can Better Document Resolution of Ethics and Partiality Concerns in Managing Clean Air Federal Advisory Committees
13-P-0162 | OA-FY12-0056 | EPA Facility Space Management to Optimize Occupancy and Cost
13-P-0152 | OA-FY12-0084 | EPA Could Improve Contingency Planning for Oil and Hazardous Substance Response
13-4-0154 | OA-FY12-0711 | OAM Request - SES Inc.
13-P-0200 | OA-FY11-0267 | Improvements Needed in EPA's Smartcard Program to Ensure Consistent Physical Access Procedures and Cost Reasonableness
13-P-0177 | OA-FY13-0085 | U.S. Chemical Safety and Hazard Investigation Board Complied With Reporting Requirements of the Improper Payments Elimination and Recovery Act
13-P-0175 | OA-FY13-0055 | Corrective Action Plan Needed in Order to Fully Comply With the Improper Payments Elimination and Recovery Act
13-4-0296 | OA-FY12-0497 | Labor-Charging Practices at the New Mexico Environment Department
13-R-0297 | OA-FY12-0198 | Air Quality Objectives for the Baton Rouge Ozone Nonattainment Area Not Met Under EPA Agreement 2A-96694301 Awarded to the Railroad Research Foundation
13-P-0145 | OA-FY12-0306 | New Procedures Aided Region 5 in Reducing Unliquidated Obligations
13-1-0054 | OA-FY12-0400 | Audit of EPA's Fiscal 2012 and 2011 Consolidated Financial Statements
13-P-0208 | OA-FY11-0594 | EPA Should Increase Fixed Price Contracting for Remedial Actions
13-P-0128 | OA-FY12-0492 | Audit Follow-up Process Needed for the Chemical Safety and Hazard Investigation Board
13-R-0367 | OA-FY12-0258 | ARRA Cooperative Agreement 2A-97706701 Awarded to Grace Hill Settlement House
13-P-0366 | OA-FY13-0047 | The EPA Needs to Improve Timeliness and Documentation of Workforce and Workload Management Corrective Actions
13-R-0321 | OA-FY12-0260 | Projected Emission Reductions Overstated and Buy American Requirements Not Met Under EPA Award to the Tennessee Department of Transportation
13-R-0353 | OA-FY11-A-0061 | Examination of Costs Claimed Under EPA Cooperative Agreements 2A-96104501 and 2A-96107201 Awarded Under the Recovery Act to Chelsea Collaborative Inc., Chelsea, Massachusetts
13-4-0262 | OA-FY12-0697 | Agreed-Upon Procedures Applied to Proposal Submitted Under EPA Solicitation No. SOL-HQ-12-00006 by Booz Allen Hamilton, Inc., McLean, Virginia
13-P-0308 | OA-FY13-0076 | Limitations on the EPA's Authority Under the Safe Drinking Water Act Resulted in Unaddressed Concerns at a Tribal Drinking Water Plant
13-P-0430 | OA-FY13-0293 | Implementation Plan With Cost Sharing Methodology Needed for Region 8 Senior Environmental Employee Work on Lead Risk Reduction
13-R-0413 | OA-FY10-A-0208 | American Recovery and Reinvestment Act Site Visit of Yauco - La Jurada Community Distribution System, Yauco, Puerto Rico
13-P-0341 | OA-FY13-A-0203 | Lead Remediation Association of America
13-P-0271 | OA-FY12-0480 | Improved Internal Controls Needed in the Gulf of Mexico Program Office
13-P-0337 | OA-FY12-0513 | U.S. Chemical Safety and Hazard Investigation Board Needs to Complete More Timely Investigations
13-P-0398 | OA-FY12-0494 | Improved Contract Administration Needed for the Customer Technology Solutions Contract
13-P-0363 | OA-FY13-0013 | The EPA Should Improve Chemical Fume Hood Testing Oversight to Reduce Health and Safety Risk
13-P-0373 | OA-FY13-0009 | The EPA Should Improve Monitoring of Controls in the Renewable Fuel Standard Program
13-P-0220 | OMS-FY12-0012 | Review of Hotline Complaint on EPA's Pre-Award Activities for Multiple Award Contracts at the National Computer Center
13-P-0252 | OMS-FY11-0004 | Improvements Needed to Secure IT Assets at EPA Owned Research Facilities
13-P-0359 | OMS-FY12-0002 | Controls Over EPA's Compass Financial System Need to Be Improved

Appendix B
OIG Project Quality Scorecard Results - FY 2013






Publication No. | Elapsed Days from Kickoff to OCPA | Planning | Field Work | Evidence | Supervision | Draft Report Preparation and Timeliness | Report Communication | Total Assignment Score
13-P-0057 | 124.0 | 3.0 | 4.0 | 4.0 | 5.0 | 8.0 | 8.3 | 32.3
13-P-0028 | 364.0 | 2.0 | 4.0 | 3.5 | 5.0 | 8.0 | 7.0 | 29.5
13-P-0161 | 435.0 | 3.0 | 4.0 | 3.5 | 5.0 | 8.0 | 9.0 | 32.5
13-P-0163 | 201.0 | 2.0 | 4.0 | 3.9 | 4.8 | 8.0 | 9.0 | 31.7
13-P-0178 | 421.0 | 2.0 | 4.0 | 4.0 | 5.0 | 8.0 | 9.0 | 32.0
13-R-0092 | 141.0 | 3.0 | 3.0 | 3.8 | 4.6 | 7.0 | 9.0 | 30.4
13-P-0168 | 70.0 | 3.0 | 4.0 | 4.0 | 5.0 | 8.0 | 9.0 | 33.0
13-P-0127 | 243.0 | 3.0 | 4.0 | 3.5 | 4.4 | 7.0 | 9.0 | 30.9
13-P-0176 | 181.0 | 3.0 | 4.0 | 4.0 | 4.9 | 8.0 | 9.0 | 32.9
13-4-0153 | 88.0 | 3.0 | 3.0 | 3.7 | 4.8 | 8.0 | 9.0 | 31.5
13-4-0116 | N/A | 3.0 | 4.0 | 3.8 | 5.0 | 8.0 | 9.0 | 32.8
13-4-0125 | 57.0 | 3.0 | 3.0 | 3.9 | 5.0 | 8.0 | 9.0 | 31.9
13-P-0209 | 176.0 | 3.0 | 4.0 | 3.5 | 5.0 | 8.0 | 9.0 | 32.5
13-P-0167 | 293.0 | 3.0 | 4.0 | 3.5 | 4.9 | 8.0 | 8.8 | 32.2
13-P-0201 | 281.0 | 3.0 | 3.0 | 3.5 | 5.0 | 8.0 | 9.0 | 31.5
13-P-0207 | 152.0 | 3.0 | 4.0 | 4.0 | 5.0 | 8.0 | 9.0 | 33.0
13-P-0221 | 455.0 | 3.0 | 4.0 | 3.5 | 4.6 | 8.0 | 7.8 | 30.9
13-P-0264 | 257.0 | 2.0 | 3.0 | 3.5 | 4.7 | 8.0 | 9.0 | 30.2
13-P-0298 | 328.0 | 3.0 | 4.0 | 3.5 | 5.0 | 8.0 | 9.0 | 32.5
13-P-0299 | 224.0 | 3.0 | 4.0 | 3.5 | 4.9 | 8.0 | 9.0 | 32.4
13-P-0317 | 280.0 | 3.0 | 4.0 | 3.5 | 5.0 | 8.0 | 9.0 | 32.5
13-P-0349 | 225.0 | 3.0 | 4.0 | 3.5 | 5.0 | 8.0 | 9.0 | 32.5
13-P-0356 | N/A | 3.0 | 4.0 | 4.0 | 5.0 | 8.0 | 9.0 | 33.0
13-P-0370 | 169.0 | 3.0 | 4.0 | 3.5 | 5.0 | 8.0 | 9.0 | 32.5
13-P-0364 | N/A | 3.0 | 4.0 | 3.5 | 4.8 | 8.0 | 4.7 | 28.0
13-P-0387 | N/A | 3.0 | 4.0 | 3.5 | 5.0 | 8.0 | 9.0 | 32.5
13-P-0162 | 163.0 | 3.0 | 4.0 | 3.6 | 4.5 | 8.0 | 9.0 | 32.1
13-P-0152 | 251.0 | 3.0 | 4.0 | 3.7 | 4.9 | 8.0 | 9.0 | 32.6
13-4-0154 | 79.0 | 3.0 | 3.0 | 3.8 | 5.0 | 8.0 | 9.0 | 31.8
13-P-0200 | 392.0 | 2.0 | 4.0 | 4.0 | 4.9 | 8.0 | 9.0 | 31.9
13-P-0177 | 47.0 | 3.0 | 3.5 | 3.7 | 4.9 | 7.5 | 9.0 | 31.6
13-P-0175 | 69.0 | 3.0 | 4.0 | 3.7 | 4.9 | 8.0 | 9.0 | 32.6
13-4-0296 | 178.0 | 3.0 | 4.0 | 3.8 | 4.7 | 8.0 | 9.0 | 32.5
13-R-0297 | 216.0 | 3.0 | 4.0 | 3.8 | 4.0 | 8.0 | 8.6 | 31.4
13-P-0145 | 219.0 | 3.0 | 4.0 | 3.9 | 4.9 | 8.0 | 9.0 | 32.8
13-1-0054 | 160.0 | 3.0 | 4.0 | 4.0 | 4.9 | 6.0 | 7.0 | 28.9
13-P-0208 | 395.0 | 2.0 | 4.0 | 3.5 | 4.9 | 8.0 | 9.0 | 31.4
13-P-0128 | 34.0 | 3.0 | 1.0 | 3.5 | 3.4 | 8.0 | 8.4 | 27.3
13-R-0367 | N/A | 3.0 | 4.0 | 3.9 | 4.4 | 8.0 | 9.0 | 32.3
13-P-0366 | N/A | 3.0 | 4.0 | 2.8 | 5.0 | 8.0 | 8.9 | 31.7
13-R-0321 | N/A | 3.0 | 4.0 | 4.0 | 4.7 | 8.0 | 9.0 | 32.7
13-R-0353 | 388.0 | 3.0 | 4.0 | 3.9 | 3.9 | 8.0 | 9.0 | 31.8
13-4-0262 | 132.0 | 2.0 | 3.0 | 3.5 | 5.0 | 8.0 | 9.0 | 30.5
13-P-0308 | 76.0 | 3.0 | 4.0 | 3.5 | 4.9 | 8.0 | 9.0 | 32.4
13-P-0430 | N/A | 3.0 | 4.0 | 3.5 | 5.0 | 7.0 | 9.0 | 31.5
13-R-0413 | N/A | 2.0 | 4.0 | 3.9 | 4.5 | 7.0 | 9.0 | 30.4
13-P-0341 | N/A | 3.0 | 4.0 | 3.8 | 5.0 | 8.0 | 9.0 | 32.8
13-P-0271 | 246.0 | 3.0 | 4.0 | 3.8 | 4.9 | 8.0 | 8.6 | 32.3
13-P-0337 | 252.0 | 3.0 | 4.0 | 3.5 | 4.3 | 8.0 | 7.2 | 30.0
13-P-0398 | 280.0 | 2.0 | 4.0 | 3.2 | 4.6 | 8.0 | 8.3 | 30.1
13-P-0363 | 211.0 | 3.0 | 4.0 | 3.1 | 4.9 | 8.0 | 9.0 | 32.0
13-P-0373 | 169.0 | 3.0 | 4.0 | 3.1 | 5.0 | 8.0 | 8.5 | 31.6
13-P-0220 | N/A | 3.0 | 3.0 | 2.8 | 3.6 | 8.0 | 9.0 | 29.4
13-P-0252 | 305.0 | 2.0 | 4.0 | 3.3 | 4.1 | 8.0 | 9.0 | 30.4
13-P-0359 | N/A | 2.0 | 3.0 | 3.5 | 4.2 | 8.0 | 8.9 | 29.6
Total | 9,427.0 | 154.0 | 207.5 | 199.7 | 261.3 | 433.5 | 480.0 | 1736.0
Average | 171.4 | 2.8 | 3.8 | 3.6 | 4.8 | 7.9 | 8.7 | 31.6
No. of Reports: 55

Appendix C
OIG Reports Reviewed With CMR - FY 2013
Publication No. | Assignment No. | Title
13-P-0351 | OA-FY13-0231 | Internal Control Lessons Learned for Hurricane Sandy Disaster Relief Appropriations Act Funds
13-P-0361 | OA-FY12-0606 | EPA Needs to Improve STAR Grant Oversight
13-P-0432 | OA-FY12-0570 | Controls and Oversight Needed to Improve Administration of EPA's Customer Service Lines
13-P-0433 | OA-FY13-0113 | Congressionally Requested Inquiry Into the EPA's Use of Private and Alias Email Accounts
13-P-0435 | OPE-FY13-0011 | The EPA Should Assess the Utility of the Watch List as a Management Tool
13-P-0386 | OPE-FY13-0004 | The EPA's International Program Office Needs Improved Strategic Planning
13-P-0352 | OPE-FY13-0002 | The EPA's Comments Improve the Environmental Impact Statement Process But Verification of Agreed-Upon Actions Is Needed

Appendix D
OIG CMR Results - FY 2013
Publication No. | Elapsed Days from Kickoff to OCPA | Planning | Evidence | Supervision | Reporting | Post Reporting/Data Accuracy | Compliance Review Score
13-P-0351 | N/A | 15.0 | 18.0 | 30.0 | 20.0 | 11.0 | 94.0
13-P-0361 | N/A | 10.0 | 18.5 | 29.4 | 20.0 | 15.0 | 92.9
13-P-0432 | N/A | 10.0 | 16.0 | 25.8 | 20.0 | 11.0 | 82.8
13-P-0433 | N/A | 15.0 | 17.0 | 30.0 | 20.0 | 10.0 | 92.0
13-P-0435 | N/A | 14.0 | 20.0 | 30.0 | 20.0 | 15.0 | 99.0
13-P-0386 | N/A | 15.0 | 20.0 | 26.9 | 20.0 | 13.5 | 95.4
13-P-0352 | N/A | 15.0 | 20.0 | 28.0 | 20.0 | 12.0 | 95.0
Total | N/A | 94.0 | 129.5 | 200.1 | 140.0 | 87.5 | 651.1
Average | N/A | 13.4 | 18.5 | 28.6 | 20.0 | 12.5 | 93.0
No. of Reports: 7