U.S. ENVIRONMENTAL PROTECTION AGENCY
OFFICE OF INSPECTOR GENERAL
Catalyst for Improving the Environment
Quality Assurance Report to the
Acting Inspector General
Measuring the Quality of
Office of Inspector General
Reports Issued in
Fiscal Years 2008 and 2009
Report No. 10-N-0134
June 2, 2010

-------
Report Contributors:
Carolyn J. Hicks
Kevin L. Christensen
Tina Lovingood
Abbreviations
EPA	U.S. Environmental Protection Agency
FY	Fiscal Year(s)
GAGAS	generally accepted government auditing standards
OA	Office of Audit
OCPL	Office of Congressional and Public Liaison
OCPM	Office of Congressional, Public Affairs and Management (formerly OCPL)
OIG	Office of Inspector General
OMS	Office of Mission Systems
OPE	Office of Program Evaluation
PLD	Product Line Director(s)
PM	Project Manager(s)
PMH	Project Management Handbook
SES	Senior Executive Service

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF
INSPECTOR GENERAL
June 2, 2010
MEMORANDUM
SUBJECT: Measuring the Quality of Office of Inspector General Reports
Issued in Fiscal Years 2008 and 2009
Report No. 10-N-0134
FROM: Kevin L. Christensen, Carolyn J. Hicks, and Tina Lovingood /s/
Special Assistants to the Acting Inspector General
TO:	Bill A. Roderick
Acting Inspector General
Attached is our report on measuring the quality of Office of Inspector General (OIG) reports
issued in Fiscal Years 2008 and 2009. As with last year's report, this report presents
observations and recommendations intended to further strengthen the audit and evaluation
processes. The reports scored during this review are included in Appendices A
and B. As with the Fiscal Year 2007 report, the focus was on Supervision, Timeliness, and
Evidence. We explain the specific attributes for which we reviewed OIG reports in the Scope
and Methodology section of this report.
If you have any questions about this report and its observations and recommendations, please
contact Kevin Christensen at 202-566-1007, Carolyn Hicks at 202-566-1238, or Tina Lovingood
at 202-566-2906.

-------
Measuring the Quality of Office of Inspector General Reports
Issued in Fiscal Years 2008 and 2009
10-N-0134
Table of Contents
Chapters
1	Introduction		1
Purpose		1
Measuring the Quality of OIG Reports		1
Scoring the Results		2
Scope and Methodology		3
2	Notable Improvements Made, But Further Opportunities Exist		4
Many Improvements Made Since Last Quality Assurance Review		4
Additional Opportunities for Improvement Exist		5
Work Paper Preparation		5
Report Indexing		5
Use of Draft Agency Documents 		6
Defining When Reports Should Use the Word "Official"		6
Scope of Work		7
Appendices
A Office of Inspector General Project Quality Scorecards - Fiscal Year 2008... 8
B Office of Inspector General Project Quality Scorecards - Fiscal Year 2009... 14

-------
Chapter 1
Introduction
Purpose
The purpose of this review is to report on the set of criteria the Office of Inspector General (OIG)
of the U.S. Environmental Protection Agency (EPA) uses to measure quality in reports issued by
its Office of Audit (OA), Office of Program Evaluation (OPE), Office of Mission Systems
(OMS), and Office of Congressional, Public Affairs and Management (OCPM) [formerly the
Office of Congressional and Public Liaison (OCPL)]. Measuring the quality of OIG work is
important because it provides data that can be used to identify areas in which OIG processes can
be improved. We applied our quality measurement criteria to 98 EPA OIG reports issued from
October 1, 2007, through September 30, 2009 (47 for Fiscal Year [FY] 2008 and 51 for FY 2009).
Measuring the Quality of OIG Reports
The primary goal of OIG reporting continues to be to keep EPA, the Administration, and
Congress fully informed of issues impacting EPA programs, as well as EPA's progress in taking
action to correct those issues. The Office of Management and Budget is also an important
customer because of its impact on the OIG budget. As noted in the Government Auditing
Standards (July 2007), an "... audit organization should analyze and summarize the results of its
monitoring procedures at least annually, with identification of any systemic issues needing
improvement, along with recommendations for corrective action."
In developing our criteria to measure quality, we continue to recognize that the timeliness of our
products is very important to our customers; therefore, timeliness is a key quality characteristic.
Likewise, compliance with generally accepted government auditing standards (GAGAS), as set out
in the Government Auditing Standards, is required and is thus also a key quality characteristic. With
that in mind, the OIG should strive to consistently provide products that meet specific quality
characteristics and adhere to all applicable standards and OIG policies and procedures.
Accordingly, a measuring process should provide a mechanism to evaluate individual products
against specific quality criteria. The measuring process should also present the information in a
manner that, over time, will allow the OIG to assess trends in quality so that necessary
adjustments can be made to policies, procedures, and activities. The criteria used in this project
to assess quality in OIG reports were:
•	Project cost
•	Documentary reliability of evidence
•	Timeliness in preparing draft reports
•	Readability of reports
A scoring form for these characteristics provides the organization with a
measurement of product quality and also serves as a basis for measuring a manager's
performance. The specific manner in which we calculated points is shown in our project quality
scorecard in Appendix A for FY 2008 and Appendix B for FY 2009 (in each appendix, the first
table shows the scores and the second table identifies the assignment numbers and report titles).
The OIG's scoring process was fully implemented at the beginning of FY 2007. An Inspector
General Statement issued on October 10, 2006, fully explains the scoring process. Beginning
with FY 2008, the Acting Inspector General decided that significance would no longer be graded
because it was subjective. The scorecard will be amended to remove this section for FY 2011
scorecards.
The project quality scorecard reflects the OIG's process for measuring the quality of audits,
evaluations, and other reviews. The process to measure quality is part of the OIG's overall
quality control system that serves as a basis for ensuring our results will consistently meet
customers' needs and withstand challenges. All OIG audits, program evaluations, and other
reviews are conducted in accordance with GAGAS unless noted. The Project Management
Handbook (PMH) is the OIG's guide for conducting all reviews in accordance with GAGAS and
other professional standards.
The scoring process encompasses an evaluation of activities from preliminary research to the
point that an OIG team submits a draft report to the OIG's OCPM for edit. The process includes
a measurement for report communication that encompasses the readability, completeness,
conciseness, and presentation of draft reports. Timeliness is measured against a goal of
providing the report to OCPM within 200 days; teams receive 5 bonus points if a report comes in
under 200 days, and a point is deducted for every 50 days beyond 250 days.
The maximum number of points that can be earned in each specific phase is:
Planning	3 points
Field Work	4 points
Evidence	4 points
Supervision	5 points
Draft Report Preparation and Timeliness	8 points
Report Communication	9 points
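To make the scoring arithmetic concrete, the sketch below (illustrative only, not the OIG's actual scoring tool) computes a total assignment score under one reading of the rules above: the six phase maximums sum to 33 points, a report delivered in under 200 days earns a 5-point timeliness bonus, and a point is deducted for each 50-day increment beyond 250 days. The function and variable names are ours, not the scorecard's.

import math

# Maximum points per phase, as listed above; the maximums sum to 33 points.
PHASE_MAXIMUMS = {
    "Planning": 3,
    "Field Work": 4,
    "Evidence": 4,
    "Supervision": 5,
    "Draft Report Preparation and Timeliness": 8,
    "Report Communication": 9,
}

def timeliness_adjustment(elapsed_days):
    """Return +5 under 200 days, 0 from 200 through 250 days, and -1 for each
    50-day increment beyond 250 days (one reading of the rule stated above)."""
    if elapsed_days < 200:
        return 5
    if elapsed_days <= 250:
        return 0
    return -math.ceil((elapsed_days - 250) / 50)

def total_assignment_score(phase_scores, elapsed_days):
    """Sum the phase scores, capped at each phase maximum, then apply the
    timeliness adjustment."""
    base = sum(min(phase_scores.get(phase, 0), maximum)
               for phase, maximum in PHASE_MAXIMUMS.items())
    return base + timeliness_adjustment(elapsed_days)

# A report earning the maximum in every phase and delivered in under 200 days would
# score 33 + 5 = 38 points; one delivered between 201 and 250 days could score at most 33.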
Scoring the Results
The total quality scores, as well as the timeframes and project costs for major OIG reports, are
shown in Appendices A and B. Each total quality score measures project and report quality
characteristics including Planning (Preliminary Research), Field Work, Evidence, Supervision,
and Reporting (Timeliness and Readability). The maximum number of points achievable for a
draft report issued to the Agency in under 200 days is 38 points. A draft report issued between
201 and 250 days can earn a maximum score of 33 points.
During FYs 2008 and 2009, the cost of reports increased, and scores for the supervision quality
characteristic in the OIG project management scorecard also increased overall. The average cost of
an OIG report (excluding the audit of the Agency's financial statements) increased from
$309,032 in FY 2008 to $341,539 in FY 2009.1 The average project scores for both FYs 2008
and 2009 round to 33 points. Supervisory scores increased over both fiscal years. Product Line
Directors, for example, routinely documented their approval of the project guide prior to the
entrance conference, signifying approval of the project's objectives, scope, and methodology.
Supervisors also approved their team members' work papers within 30 days of staff completion.

1 The relatively significant rise in average cost includes an assignment (09-P-0125) that was open for over 2 years
and had a relatively large cost. If this report had been excluded, the average project cost would have increased more
moderately, to $314,203.
The OIG teams used the discussion draft report process and held meetings with Agency
management and staff to discuss the reports, ensure accuracy and tone, and present proposed
recommendations. The 47 reports scored in FY 2008 contained 181 recommendations made to
the Agency, and the Agency accepted 152 of those recommendations (84 percent) as of the final
report dates. The 51 reports scored in FY 2009 contained 177 recommendations made to the
Agency, and the Agency accepted 149 of those recommendations (again, 84 percent), as of the
final report dates. The percentage of recommendations that had been accepted for the FY 2007
reports was also 84 percent.
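As a quick check of the percentages cited above (illustrative only), the acceptance rates follow directly from the counts:

fy2008_rate = 152 / 181   # about 0.840, which rounds to 84 percent
fy2009_rate = 149 / 177   # about 0.842, which also rounds to 84 percent
print(f"FY 2008: {fy2008_rate:.0%}; FY 2009: {fy2009_rate:.0%}")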
Scope and Methodology
We reviewed cost and time data stored in the Inspector General Enterprise Management System
(known as "IGEMS") for each of the OIG audit projects that were scored for quality. We then
reviewed the assignment work papers in the OIG's Auto Audit® and TeamMate automated work
paper systems and the final reports using the scoring form. During the scoring process, we also
contacted supervisors as needed on each assignment to obtain additional information. The
scoring form measured each assignment as to Planning (Preliminary Research), Field Work,
Evidence, Supervision, and Reporting (Timeliness and Readability). Beginning in FY 2008, the
"significance" portion of the scorecard was not scored because it involved more subjectivity than
intended for scoring purposes. We believe these scorecards can be applied to all OIG
assignments in accordance with GAGAS. The scorecards should allow for enough variety in
quality measurement to cover all of our work.
Our scope covered final performance audit and evaluation reports issued by OA, OPE, OMS, and
OCPM from October 1, 2007, through September 30, 2009, that were reviewed and scored by
OIG's quality assurance staff. We did not include Defense Contract Audit Agency contract audit
reports or other reports in which the work was performed by external auditors. During our
review, we took into account changes made since our prior quality assurance reviews involving
OIG measures related to FY 2007 reports (Report No. 08-A-0081, issued February 12, 2008).

-------
Chapter 2
Notable Improvements Made,
But Further Opportunities Exist
During FYs 2008 and 2009, the OIG made noticeable improvements regarding documentation of
work paper reviews. Supervisory reviews were better documented, and reviews were more
timely. Product Line Directors (PLDs) and Project Managers (PMs) demonstrated that they were
reviewing the supporting work papers for the draft and final reports. Staffs responded to the
PLD/PM comments, and clearance by the PLD/PM was documented in the review sheets and
notes. Nonetheless, we noted the following areas where improvements can be made:
•	Work papers should be a more reasonable length.
•	Indexing should be updated at various stages.
•	Use of draft Agency documents should be managed better.
•	Proper attribution should be provided in reports.
•	Dates used to define the scope of work should be more standardized.
Many Improvements Made Since Last Quality Assurance Review
Since the last quality assurance review issued on February 12, 2008 (Report No. 08-A-0081),
which covered issues regarding FY 2007 reports, the OIG added policy guidance to the PMH
regarding supervisory reviews and the retention of reviewer notes in a central location of the
work paper files. The independent referencing function has been established in the Immediate
Office with a Special Projects staff to review all OIG draft and final audit and evaluation
products. Where the independent referencer took significant exception to proposed OIG reports,
the Acting Inspector General was notified directly of the concern and the issue was resolved.
Several recommendations from the February 2008 quality assurance report have been
implemented and have helped to improve the quality of reports and work processes. These
improvements include:
•	Revision of the PMH to clarify that supervisors are responsible for reviewing the status of
work and not just work papers that staff have deemed complete. PLDs, in addition to
reviewing the work papers of the PM during field work, should also be reviewing the
work papers of other staff to determine the effectiveness of the PM's review.
•	Revision of the Report Formatting and Style Guide to ensure that staff fully understand
when to provide attribution in OIG reports, as well as revision to provide additional
clarification to ensure OIG reports do not use ambiguous terms.
•	Revision of the PMH to require that decisions by all OIG staff, including senior OIG
officials, be completely documented. Where officials do not provide such an explanation,
the PLD will advise the appropriate Assistant Inspector General or other senior OIG
official and request an explanation regarding their decision.
•	Better assurance through the project scorecard that assignment guides are reviewed and
approved by the PLD prior to field work.
•	Revision of the PMH to clarify that interview write-ups with Agency staff and officials
contain conclusions by OIG staff about the statements made and additional steps needed
to validate the statements made.
•	Revision of the PMH to clarify the aspects of the team's methodology that need to be
discussed at the entrance conference and require discussion of those aspects in the
entrance conference write-up.
During FYs 2008 and 2009, there were noticeable improvements regarding documentation of
work paper reviews. Supervisory reviews are better documented, and the comments were
retained in the work papers as either a master list or via comment sheets. The reviews were more
timely, as required by Inspector General Statement No. 2 issued October 10, 2006. The
guidance, which is incorporated into the PMH, requires that work papers be reviewed monthly
for GS-11s and above and twice monthly for GS-9s and below. Only 3 of the 47 reports scored in
FY 2008 and 2 of the 51 scored in FY 2009 had a supervision score below 4.0. The
average supervision score was 4.0 in FY 2007, 4.63 in FY 2008, and 4.77 in FY 2009.
As a result of the above actions, the quality improvement measures instilled in the audit process
correlate directly with higher-quality OIG reports. Areas such as audit supervision are scored
more consistently over time, as measured by the project scorecard. However, enhancements to the
project quality scorecard can always be made and are discussed below.
Additional Opportunities for Improvement Exist
Work Paper Preparation
Though there has been improvement, one area of work paper preparation that continues to need
attention is maintaining work papers of reasonable length. Work papers continue to contain the
results of more than one audit or evaluation step or sub-step. They include multiple interviews,
e-mails, documents, and analyses. This issue has a negative impact on the timeliness of
independent referencing. Work papers should not be so lengthy that they impede an effective or
timely review, and they should address a specific audit or evaluation step or sub-step as
identified in the audit guide.
Recommendation 1: Revise the PMH to require, as part of the work paper preparation
and review processes, that each work paper address only one audit or evaluation step or
sub-step.
Report Indexing
Report indexing has improved. As per GAGAS, auditors must obtain sufficient, appropriate
evidence to provide a reasonable basis for their findings and conclusions. GAGAS states that the
process of preparing and reviewing audit documentation should allow for the review of audit
quality. PMs and PLDs have directed their staffs to more precisely index report statements to
supporting documentation. Also, the OIG plans to continue to reemphasize good indexing
through training on an as-needed basis.
However, during quality control reviews of official draft reports, indexes to supporting
information often pertained to the discussion draft provided to the Agency rather than the formal
draft. In some cases, no further audit work was conducted and the suggested change by the
Agency was accepted by the OIG without any validation. While the purpose of
the discussion draft is to facilitate discussion with the auditee, changes by the auditee should be
supported by appropriate documentary evidence. Also, OIG conclusions or opinions are
sometimes not included in the audit work papers but materialize in the audit report with no
indexing.
Insufficient indexing of summaries, finding outlines, and spreadsheets is also a concern. In some
cases, reports are indexed to summary work papers or finding outlines that are not cross-indexed
to supporting work papers. In other cases, spreadsheets are not clearly cross-indexed to
supporting documentation, or report indexes do not refer to a specific location in a spreadsheet.
Both issues result in the need for additional time in referencing.
Recommendation 2: Amend the PMH with additional guidance on indexing,
specifically noting that: (1) OIG conclusions and opinions in the draft and final reports,
summaries, and finding outlines must be indexed to supporting audit work papers which
show the complete facts and rationale for a conclusion or opinion; (2) spreadsheets must
be cross-indexed to supporting documentation; and (3) report indexes must refer to a
specific location in a spreadsheet.
Use of Draft Agency Documents
Audit teams used Agency draft documents to support audit conclusions without proper
attribution and, in some cases, without further validation that the information presented in the
OIG draft report was current. For example, one report utilized an EPA guidance document
that had been in draft for over 5 years and did not identify the document as draft in the OIG
report. The team should have performed more audit work to determine whether the issues
identified in the Agency's draft document were still valid and whether the document was or
would ever be published. In another example, a team did not verify that the criteria used were
current before the report was submitted for referencing.
Recommendation 3: Revise the PMH to specify that reports should clearly attribute
draft sources, and that attributed draft sources should be checked shortly before
referencing and submission of the draft report for comment to verify that the OIG report
contains the most current information.
Defining When Reports Should Use the Word "Official"
While attribution in OIG reports improved in FY 2008, it continues to need improvement. Staff
did not always provide sufficient attribution in reports regarding the level of the Agency staff
making comments. This issue was reported in prior quality assurance reviews and the Acting
Inspector General provided guidance. In his view, Agency staff at the Senior Executive Service
(SES) or higher level should normally be referred to as Agency officials. However, report
statements either were not attributed to any official or used the term "Agency
official" when the employee was below the SES level. Without identifying the support for the
statement, including the title of the individual as needed, the reader is less likely to be able to
judge the credibility of the statement provided. For example, in one assignment, the entire
"Noteworthy Accomplishments" section was indexed to the statement of an Agency Office
Director without any supporting documentation. No attribution was given and no documentation
was obtained to verify the statements of the Agency official making the statement. In another
example, an EPA mass e-mail was indexed as support for a statement that appeared to be a
position obtained during an interview.
Recommendation 4: Update the PMH to provide guidance on the proper use of
indexing and the proper use of the term "official," and provide examples, if possible, of
when indexing and the use of the term "official" are inappropriate.
Scope of Work
Audit research, field work, and reporting are not distinct phases within the audit cycle and may
overlap. These phases are discussed in detail in the PMH. For reporting purposes, and to better
define the audit timeframes, the scope of work described in the report will commence with the
preliminary research kick-off meeting with the Agency (or, if preliminary research is not
conducted, the entrance conference) and will end when the draft report is provided to the
Agency for comment (or when the discussion draft is provided, if a formal draft is not issued).
Recommendation 5: Update the PMH to state that audit work is conducted from the
preliminary research kick-off meeting or entrance conference to the date the draft report (or
the discussion draft, if there is no official draft) is provided to the Agency.

-------
Report Number
08-P-0020
08-1-0032
08-2-0039
08-2-0045
08-P-0049
08-P-0055
08-P-0062
08-P-0080
08-P-0083
08-P-0084
08-2-0095
08-P-0093
08-2-0099
08-P-0116
08-P-0120
Staff
Days
1,505
3,081
102
53
597
173
46
179
564
64
88
417
62
612
330
Project
Cost
(000s)
$1,288.6
$2,575.6
$84.8
$44.1
$571.7
$136.7
$38.5
$145.5
$470.2
$52.5
$73.6
$336.9
$54.2
$478.9
$260.1
Appendix A
Office of Inspector General Project Quality Scorecards - Fiscal Year 2008
Elapsed
Days from
Kickoff to
OCPL
Planning Field Work Evidence Supervision
Draft Report
Preparation
and Timeliness
Report
Communication
Total
Assignment
Score
807
196
45
128
394
232
85
87
199
120
59
251
123
351
3
3
3
3
3
3
3
3
3
3
3
3
3
3.4
3
4
4
4
4
3.5
4
4
3.8
4
3.3
4
4
4
4
4
4
3.5
4
4
4
4
2.5
3.2
4
3.3
4.0
4.6
4.7
4.1
2.8
4.6
4.1
5.0
4.6
4.0
4.6
4.7
4.5
3.5
13
13
7
5
8
13
13
7.0
8.0
13
8
13
6.0
13.0
7.1
9.0
9.0
7.4
9.0
9.0
9.0
9.0
9.0
9.0
9.0
9.0
9.0
30.3
34.7
37.7
31.1
26.2
32.1
36.6
38
31.6
31.8
36.1
31.2
37.5
28.8

-------
Report Number
08-P-0121
08-P-0141
08-04-0156
08-P-0169
08-4-0154
08-P-0245
08-2-0226
08-P-0213
08-2-0241
08-P-0200
08-P-0199
08-P-0266
08-P-0264
08-P-0184
Staff
Days
503
414.4
450
632
287
561
84
99
119
188
347
1052
192
Project
Cost
(000s)
$419.5
$327.0
$370.0
$986.3
$140.0
$443.6
$73.5
$84.0
$64.1
$161.5
$274.1
$830.9
$165.1
$91.0
Appendix A
Office of Inspector General Project Quality Scorecards - Fiscal Year 2008
Elapsed
Days from
Kickoff to
OCPL
Planning Field Work Evidence Supervision
Draft Report
Preparation
and Timeliness
Report
Communication
Total
Assignment
Score
230
567
226
606
193
299
79
94
41
161
399
351
291
2
3
3
3
3
3
3
3
3
3
3
3
4
3.9
4
2
4
4
4
4
4
4
4
4
3.5
3.0
4.0
4
4
3
4
3
4
3
4
4
3
4.4
4.4
4.6
4.9
4.7
5.0
4.8
5.0
5.0
4.8
5.0
5.0
2.0
8.0
1
12
7
13
13
13
13
6
6
7
9.0
9.0
9.0
9.0
8.9
9.0
9.0
9.0
7.9
8.4
8.3
9.0
30.1
24.4
32.3
25.6
34.9
30.6
38
36.8
38
35.9
30.2
30.3
31

-------
Report Number
08-P-0235
08-1-0149
08-1-0194
08-2-0142
08-P-0174
08-P-0196
08-2-0204
08-P-0206
08-P-0265
08-P-0278
08-1-0277
08-4-0270
08-P-0271
08-P-0267
08-P-0186
Staff
Days
757
211
275
94
800
1094
153
268
763
662
144
43
455
155
377
Project
Cost
(000s)
$811.6
$175.8
$228.6
$43.1
$634.1
$865.1
$130.4
$231.6
$603.2
$528.2
$123.4
$35.4
$386.3
$121.5
$314.2
Appendix A
Office of Inspector General Project Quality Scorecards - Fiscal Year 2008
Elapsed
Days from
Kickoff to
OCPL
Planning Field Work Evidence Supervision
Draft Report
Preparation
and Timeliness
Report
Communication
Total
Assignment
Score
618
105
164
38
416
272
177
170
334
267
86
197
354
82
332
3
3
3
3
3
3
3
3
3
3
3
3
2.5
4
4
4
4
4
4
4
3
4
4
4
4
3.5
4
4
4
3
3.3
3.6
3.7
3.8
3.7
3.6
3.8
4
3
5.0
5.0
5.0
5.0
4.6
4.8
4.2
4.5
5.0
5.0
4.8
5.0
5.0
13
13
13
5
8
13
13
6
7
13
13
6
13
6.4
8.0
9.0
8.8
9.0
9.0
9.0
9.0
9.0
9.0
9.0
9.0
7.4
35.4
37
38
28.8
31.9
37.4
36.9
29.3
31.7
37.6
37.6
31
34.4
29.4

-------
Appendix A
Office of Inspector General Project Quality Scorecards - Fiscal Year 2008
Staff
Report Number Days
Project
Cost
(000s)
Elapsed
Days from
Kickoff to
OCPL
Planning Field Work Evidence Supervision
Draft Report
Preparation
and Timeliness
Report
Communication
Total
Assignment
Score
08-2-0309
08-P-0291
08-P-0276
AVG
27	$22.5
NUMBER OF REPORTS
344 $293.3
236 $200.3
420.58 $357.26
47
47
257
183
235.72
2.9
3.84
3.68
5.0
5.0
5.0
4.63
13
13
9.49
9.0
7.7
8.69
38
29
35.7
33.23

-------
Appendix A
Office of Inspector General Project Quality Scorecards - Fiscal Year 2008
Report Numbers Assignment Numbers Titles
08-P-0020
2005-1117
Improvements Needed in Air Toxics Emissions Data Needed to Conduct Residual Risk Assessments
08-1-0032
2007-590
Audit of EPA's Fiscal 2007 and 2006 (Restated) Consolidated Financial Statements
08-2-0039
2007-950
Village of Laurelville, Ohio-Unallowable Costs Claimed Under EPA Grant XP97579701
08-2-0045
2007-312, 2007-865
Unallowable Federal Funds Drawn on EPA Grant No. XP98247201 Awarded to the Wayne County Water and Sewer Authority,
New York
08-P-0049
2006-1287
Despite Progress, EPA Needs to Improve Oversight of Wastewater Upgrades in the Chesapeake Bay Watershed
08-P-0055
2007-573
EPA Should Continue to Improve Its National Emergency Response Planning
08-P-0062
2007-958
City of Elizabeth, New Jersey-Excess Clean Water State Revolving Funds Claimed
08-P-0080
2007-926
EPA's Office of Air and Radiation Needs to Improve Compliance with Audit Followup Process
08-P-0083
2007-539
Framework for Developing Tribal Capacity Needed in the Indian General Assistance Program
08-P-0084
2007-956
Borough of Carteret, New Jersey-Unallowable Costs Claimed Under EPA Grant XP98247001
08-2-0095
2008-128
City of Bad Axe, Michigan-Unallowable Costs Claimed Under EPA Grant XP98578301
08-P-0093
2007-442
EPA Should Further Limit Use of Cost Plus Award Fee Contracts
08-2-0099
2007-979
Followup on Information Concerning Superfund Cooperative Agreements with New York and New Jersey
08-P-0116
2007-491
EPA Can Recover More Federal Superfund Money
08-P-0120
2007-952
Summary of Recent Developments in EPA's Drinking Water Program and Areas for Additional Focus
08-P-0121
2007-641
Improvements Needed to Ensure Grant Funds for U.S. Mexico Border Water Infrastructure Program are Spent More Timely
08-P-0141
2006-1400
EPA Needs to Track Compliance with SF Cleanup Requirements
08-4-0156
2007-815
Canaan Valley Institute, Inc.
08-P-0169
2006-1433
Improved Controls Would Reduce Superfund Backlogs
08-4-0154
2007-994
Tetra Tech Charging Verification Review
08-P-0245
2007-903
Border 2012 Program Needs to Improve Program Management to Ensure Results
08-2-0226
2008-167-OA-FY08-0063
Passaic Valley Sewerage Commissioners-Unallowable Costs Claimed Under EPA Grant XP98237601
08-P-0213
2008-97-OA-FY08-0006
Oglala Sioux Single Audits-Corrective Actions Taken but Improvements Needed in Resolving Costs
08-2-0241
2008-000175
Agreed-Upon Procedures on EPA's Fiscal Year 2008 Second Quarter Financial Statements
08-P-0200
OCPL-FY07-0005
Follow-Up Review on Progress at Escambia Treating Company Superfund Site, Pensacola, Florida
08-P-0199
2007-000479
EPA Needs to Better Report Chesapeake Bay Challenges - A Summary Report
08-P-0266
2007-0873
EPA Assisting Tribal Water Systems but Needs to Improve Oversight
08-P-0264
OCPL-FY07-0006
Corrective Actions Were Generally Implemented at Stauffer Chemical Company Superfund Site, Tarpon Springs, FL

-------
Appendix A
Office of Inspector General Project Quality Scorecards - Fiscal Year 2008
Report Numbers Assignment Numbers Titles
08-P-0184
2007-990
Millions in Federal Dollars Remain for the Colonias Projects
08-P-0235
2006-1402
EPA Decisions to Delete Superfund Sites Should Undergo Quality Assurance Review
08-1-0149
2007-000848
Fiscal Year 2007 and 2006 Financial Statements for the Pesticide Registration Fund
08-1-0194
2007-000846
Fiscal Year 2007 and 2006 Financial Statements for the Pesticides Reregistration and Expedited Processing Fund
08-2-0142
OA-FY08-0064
Agreed-Upon Procedures on EPA's Fiscal Year 2008 First Quarter Financial Statements
08-P-0174
2007-0308
More Action Needed to Protect Public from Indoor Radon
08-P-0196
2007-000727
Making Better Use Of Stringfellow Superfund Special Accounts
08-2-0204
2008-0144
Village of Wellsville, Ohio - Ineligible Costs Claimed Under EPA Grant XP97582801
08-P-0206
2007-0748
Voluntary Greenhouse Gas Reduction Programs Have Limited Potential
08-P-0265
2008-0114
EPA Should Continue Efforts to Reduce Unliquidated Obligations in Brownfields Pilot Grants
08-P-0278
2007-0967
EPA Needs to Improve Strategic Planning for Priority Enforcement Areas
08-1-0277
2008-0152
National Caucus and Center on Black Aged, Inc., Incurred Cost Audit of Eight EPA Cooperative Agreements
08-4-0270
2008-0145
Final Mixed Funding Claim for Old Southington Superfund Site (United Technologies)
08-P-0271
2007-000557
EPA Personnel Access and Security System Would Benefit from Improved Project Management to Control Costs and
Timeliness of Deliverables
08-P-0267
OMS-FY08-0009
Identification Proofing, Incident Handling, and Badge Disposal Procedures Needed for EPA's Smartcard Program
08-P-0186
2007-0985
EPA Can Improve the Awarding of Noncompetitive Contracts
08-2-0309
OA-FY08-0064
Agreed-Upon Procedures on EPA's Fiscal Year 2008 Third Quarter Financial Statements
08-P-0291
2007-000900
A Region 5 Penalty Reduction Was Unjustified and Undocumented
08-P-0276
2008-0163
EPA Actions Should Lead to Improved Grants Accountability

-------
Appendix B
Office of Inspector General Project Quality Scorecards - Fiscal Year 2009
Report
Number
Staff
Days
Project Cost
Elapsed
Days
from
Kickoff
to OCPL
Planning Fieldwork Evidence Supervision
Draft Report
Preparation and
Timeliness
Report
Communication
Total Assignment
Score
09-1-0026
09-1-0107
09-1-0172
09-2-0011
09-2-0078
09-2-0161
09-2-0195
09-2-0200
09-2-0247
09-4-0112
09-4-0133
09-4-0134
09-4-0135
09-P-0029
09-P-0061
2,791
187
203
177
121
103
1,170
106
106
307
138
108
189
1,143
658
$2,174,361
$158,338
$171,671
$147,541
$103,202
$69,913
$152,290
$89,714
$89,714
$346,181
$117,152
$89,639
$158,189
$398,750
$542,142
82
126
182
148
103
637
78
41
300
137
278
154
364
295
3
3
3
3
3
3
3
3
3
3
3
2
3
4
4
4
4
4
4
4
4
3
3
4
4
4
4
4
3.8
3.8
4
3.5
4
4
4
3.3
3.6
3.5
4
3.9
5
4.7
4.6
4.7
5
4.9
5
5
4.5
4.8
4.7
4.8
4.8
13
13
13
13
13
1
13
13
7
13
7
13
5
6.1
8.8
9
9
9
9
9
9
9
9
9
9
9
35.9
35.1
37.5
37.4
37.5
38
25.4
38
38
30.5
36.1
31.3
36.3
29.8
30.4

-------
Appendix B
Office of Inspector General Project Quality Scorecards - Fiscal Year 2009
Report
Number
Staff
Days
Project Cost
Elapsed
Days
from
Kickoff
to OCPL
Planning Fieldwork Evidence Supervision
Draft Report
Preparation and
Timeliness
Report
Communication
Total Assignment
Score
09-P-0085
09-P-0086
09-P-0087
09-P-0088
09-P-0089
09-P-0092
09-P-0110
09-P-0119
09-P-0125
09-P-0127
09-P-0128
09-P-0129
09-P-0130
09-P-0131
451
260
516
810
827
75
1,205
2,135
574
186
2,271
840
$114,643
$375,316
$216,000
$435,018
$641,157
$663,986
$62,215
$947,442
$1,680,991
$508,312
$156,121
$665,405
$674,546
$131,243
184
234
334
407
322
260
158
670
422
147
236
219
3
3
3
3
2
3
3
3
2
3
3
2
4
4
4
4
4
4
4
4
1
3
4
4
4
4
3
3
3.3
4
3
3
3
3.1
3.7
3.2
5
5
4.5
4.3
5
4.9
4.8
5
5
5
4.9
5
13
8
6
5
6
7
13
0
1
11
8
8
9
7.8
7.8
8.9
9
9
9
8.2
8.2
9
9
9
37.8
38
31.8
28.3
28.2
29.3
31.9
36.8
23.2
20.2
34.1
32.6
31.2

-------
Appendix B
Office of Inspector General Project Quality Scorecards - Fiscal Year 2009
Report
Number
Staff
Days
Project Cost
Elapsed
Days
from
Kickoff
to OCPL
Planning Fieldwork Evidence Supervision
Draft Report
Preparation and
Timeliness
Report
Communication
Total Assignment
Score
09-P-0144
09-P-0147
09-P-0151
09-P-0152
09-P-0162
09-P-0197
09-P-0203
09-P-0206
09-P-0222
09-P-0223
09-P-0225
09-P-0229
09-P-0231
09-P-0232
09-P-0233
742
340
489
110
502
279
201
193
253
611
43
661
448
386
$621,682
$277,837
$397,760
$89,833
$419,023
$240,019
$212,746
$159,144
$207,350
$505,399
$36,416
$575,867
$382,325
$515,791
$123,034
276
208
243
110
342
133
134
441
233
195
30
182
249
310
285
3
3
3
1
3
3
2
3
3
2
3
3
3
4
4
4
3
4
4
2
3.5
4
3
4
4
4
3.5
3
3.1
4
3.3
4
3
2.9
3.8
3
3.8
3.4
3.5
3
5
4.8
5
4.2
5
5
4.4
2.2
5
5
4.7
5
5
13
8
13
6
13
13
2.5
6.5
13
13
13
8
5.5
8.8
9
9
9
9
5.2
7.5
8.3
9
9
9
9
7.7
36.8
31.9
38
26.5
38
33.2
21.3
27.3
37
35.8
37.1
32.5
28.2
29.4

-------
Appendix B
Office of Inspector General Project Quality Scorecards - Fiscal Year 2009
Report
Number
Staff
Days
3,455
593
204
418
419
613
15
570.04
NUMBER OF REPORTS
09-P-0235
09-P-0240
09-P-0241
09-P-0242
09-P-0243
09-P-0176
09-X-0217
AVG
Project Cost
$368,706
$489,860
$260,513
$352,137
$568,898
$355,483
$10,298
$377,477
Elapsed
Days
from
Kickoff
to OCPL
237
188
287
339
407
97
245
Planning Fieldwork Evidence Supervision
3
3
2
3
2
2
2.75
4
4
4
4
3
4
3.75
3.9
4
3.6
4
3
4
3.57
5
5
4.6
4.7
5
5
4.77
Draft Report
Preparation and
Timeliness
8
13
7
7
5
13
9.25
Report
Communication
9
8.3
9
9
9
9
8.62
Total Assignment
Score
28.3
32.9
37.3
30.2
31.7
27
37
32.69

-------
Appendix B
Office of Inspector General Project Quality Scorecards - Fiscal Year 2009
Report Numbers Assignment Numbers Titles
09-1-0107
OA-FY09-0062
FY 2008 PRIA Financial Statements
09-1-0172
OA-FY09-0061
FY 2008 FIFRA Financial Statements
09-2-0011
OA-FY08-0061
SAAP Audit- Washoe County - NV
09-2-0078
OA-FY08-0256
SAAP Grant Awarded to Rupert ID
09-2-0161
OA-FY09-0809
Agreed Upon Procedures - EPA's FY 2009 Quarterly Financial Statements
09-2-0195
IGOR-FY07-0582
AA - Worthington WV FY 2004 Desk Review
09-2-0200
OA-FY09-0809
Agreed Upon Procedures - EPA's FY 2009 Quarterly Financial Statements
09-2-0247
OA-FY09-0809
Agreed Upon Procedures - EPA's FY 2009 Quarterly Financial Statements
09-4-0112
IGOR-FY07-1001
AA - SAAP Mille Lacs Band of Chippewa Indians
09-4-0133
OA-FY09-0052
STN Environmental Contract Review
09-4-0134
IGOR-FY07-1009
Call Henry Labor Verification Review
09-4-0135
OA-FY08-0128
Tetra Tech EM Inc Base Year Labor Verification Review
09-P-0029
IGOR-FY07-0880
SF Site Sampling
09-P-0061
OPE-FY07-0002
Evaluation of Energy Star Program Effectiveness Claims
09-P-0085
OA-FY08-0276
Alaska Village Safe Water Program Followup
09-P-0086
OA-FY08-0039
IAG Unliquidated Obligations
09-P-0087
OA-FY08-0018
CIPP Followup
09-P-0088
OA-FY07-0006
PART Assessment #1
09-P-0089
IGOR-FY07-0731
EPA Climate Change Programs and Science
09-P-0092
OPE-FY08-0001
Evaluation of EPA's CAA Section 112(r) Risk Management
09-P-0110
OPE-FY08-0022
Neal's Dump
09-P-0119
IGOR-FY07-0727
Utilization of SF Special Accounts
09-P-0125
2007-296
Air Emissions at Ports
09-P-0127
IGOR-FY07-0399
Freedom of Information Act
09-P-0128
OMS-FY08-0016
Management Oversight Review of the Institutional Controls Tracking System
09-P-0129
IGOR-FY07-0445
Working Capital Fund
09-P-0130
OPE-FY08-0001
Evaluation of EPA's CAA Section 112(r) Risk Management
09-P-0131
OPE-FY09-0001
Hotline - 2008-402 - CTS Printex
09-P-0144
OPE-FY08-0004
Recovery of Removal Costs at Non-NPL Sites

-------
Appendix B
Office of Inspector General Project Quality Scorecards - Fiscal Year 2009
Report Numbers Assignment Numbers Titles
09-P-0147
OPE-FY08-0007
EPA Peer Review Panels
09-P-0151
OPE-FY08-0003
Accuracy and Reliability of Radon Testing
09-P-0152
OPE-FY09-0006
Antimicrobial Testing Program Hotline
09-P-0162
OCPL-FY08-0002
Old Mission
09-P-0176
OCPL-FY08-0003
OSWER Regional Public Liaison
09-P-0189
OMS-FY09-0003
FY 2009 EPA FISMA Audit
09-P-0197
OMS-FY09-0002
EPA's System Development Activities
09-P-0203
OA-FY08-0323
ORD FMFIA Implementation
09-P-0206
OCPL-FY08-0011
OARM Reorganization IT Issues
09-P-0222
OPE-FY08-0013
Potential Impediments to OIG Oversight
09-P-0223
OPE-FY08-0027
EPA's Efforts to Establish Water Quality Standards to Protect the Nation's Waters from Excess Nutrients
09-P-0225
OA-FY09-0894
CERCLA Credit Claim - Concord NC
09-P-0229
OA-FY08-0255
Use of Independent Government Cost Estimates
09-P-0231
OPE-FY08-0028
Evaluation of EPA's Response to Great Lakes Areas of Concern
09-P-0232
OA-FY08-0323
ORD FMFIA Implementation
09-P-0233
OA-FY08-0374
HSPD 12 Hotline on Equipment
09-P-0235
IGOR-FY07-0877
Independent Evaluation of CEMS Calibration Gases
09-P-0240
OMS-FY08-0001
Follow-up: EPA's Efforts to Remediate Identified Information Security Weaknesses
09-P-0241
OA-FY09-0762
Unliquidated Obligations on Superfund Cooperative Agreements
09-P-0242
OA-FY08-0373
Controls for Contractor Invoices
09-P-0243
OPE-FY08-0017
Jones Sanitation
09-X-0217
OA-FY09-0876
Grant Accruals for Stimulus Payments

-------