OFFICE OF INSPECTOR GENERAL
Catalyst for Improving the Environment
Quality Assurance Report to the
Deputy Inspector General
Measuring the Quality of
Office of Inspector General Reports
Issued in Fiscal Year 2007
Report No. 08-A-0081
February 12, 2008

-------
Report Contributor:	Robert K. Bronstrup
Abbreviations
EPA	U.S. Environmental Protection Agency
FY	Fiscal Year
IGEMS	Inspector General Enterprise Management System
IGOR	Inspector General Operations and Reporting System
OCPL	Office of Congressional and Public Liaison
OIG	Office of Inspector General

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF INSPECTOR GENERAL
February 12, 2008
MEMORANDUM
SUBJECT: Measuring the Quality of Office of Inspector General Reports
Issued in Fiscal Year 2007
Report No. 08-A-0081
FROM:
Robert K. Bronstrup /s/
Special Assistant to the Deputy Inspector General
TO:
Bill A. Roderick
Deputy Inspector General
This is the final report to measure the quality of Office of Inspector General (OIG) reports issued
during Fiscal Year 2007. The OIG continued to use its process to score specific quality
characteristics of major OIG reports issued between October 1, 2006, and September 30, 2007.
Also, this report, as with last year's report, makes observations and recommendations that will
enhance the audit, evaluation, and liaison processes. There are few formal recommendations in
this year's report because the new quality assurance process helps to ensure quality issues are
timely resolved. During Fiscal Year 2007, specific issues were quickly brought to your attention
and the attention of the Assistant Inspectors General, Directors, and staff, and the issues were
resolved as described in this report.
We explain the specific attributes for which we reviewed OIG reports in Appendix A, which in
addition to discussing the review's scope and methodology also includes a listing of reports
reviewed. The project quality scoring form used in this review is included as Appendix B and is
the same as the one used last year. The scoring form the Office of Congressional and Public
Liaison used to assess draft reports is included as Appendix C.
If you have any questions about the final report or its observations and recommendations, please
contact me at 312-886-7169.

-------
Measuring the Quality of Office of Inspector General Reports
Issued in Fiscal Year 2007
Table of Contents
Chapters
1	Introduction		1
Purpose		1
Improvements Resulting from FY 2006 Quality Assurance Report		1
Measuring the Quality of OIG Reports		2
2	Scoring the Results		3
Positive Trends		7
Agency Accepted High Percentage of Report Recommendations		7
3	Working Paper Enhancements		8
Working Paper Preparation 		8
Indexing Reports 		10
Documenting OIG Decisions on Reportable Issues		10
Documenting Discussions with Agency on Scope and Methodology		11
Documenting Regular Meetings with Agency on Issues		12
4	Reporting Enhancements		13
Providing Attribution to Statements and Obtaining Factual Support		13
Use of Ambiguous Terms 		14
Reports Better Describe Approach for Each Objective		14
Visual Aids Show Source of Data		14
5	Administrative Enhancements		15
Calculating Project Costs		15
Entering Performance Measurement and Results System Data 		15
Appendices
A Scope and Methodology		16
B Project Quality Scorecard		19
C Report Quality Scoresheet for Draft Submissions		21

-------
Chapter 1
Introduction
Purpose
The purpose of this annual quality assurance review is to report on the set of
criteria the Office of Inspector General (OIG) of the U.S. Environmental
Protection Agency (EPA) used to measure quality in the audit and evaluation
reports issued in Fiscal Year (FY) 2007 (October 1, 2006, through September 30,
2007). Measuring the quality of OIG work is important because it provides data
that can be used to identify areas for improving OIG processes. The quality
measurement criteria were applied to 58 major OIG reports. Reports reviewed, as
well as scope and methodology information, are in Appendix A.
Improvements Resulting from FY 2006 Quality Assurance Report
Several recommendations from last year's quality assurance report on FY 2006
reports have been implemented and have helped improve the quality of reports
and work processes. These actions included:
•	The issuance of a policy, now incorporated into the OIG Project
Management Handbook (Handbook), to ensure timely supervisory reviews
and better assurance that reviewer notes are kept in a central location of
the working papers.
•	Better assurance through the project scorecard that assignment guides are
reviewed and approved by the Director prior to fieldwork.
•	Improving the quality assurance process by requiring Project Managers and
Directors to certify and check the indexes supporting OIG reports.
•	Strengthening the independent referencing of OIG reports. The OIG
independent referencer is a GS-15 directly assigned to the Deputy
Inspector General. Where the independent referencer took significant
exception to proposed OIG reports, he directly notified the Deputy
Inspector General of the concern for resolution.
•	Reports clearly identify the source of information in tables and charts.
•	Updating the Handbook to reflect changes in generally accepted
government auditing standards presented in the January 2007 revision of
the Government Auditing Standards.
•	Improved descriptions in reports of the OIG methodology used to address
each objective.
•	Implementing a policy that will better ensure staff uniformly charge time
to direct assignments and indirect job codes. As a result, the OIG can
more accurately determine the actual costs of each project.
•	Strengthening the OIG followup process so that the final impact of our
work can be better determined.
1

-------
Additionally, the Handbook has been revised and now requires a separate
communication section to be included in each project's working papers. This
section will allow the OIG and outside reviewers to see the trail of discussions
with Agency/auditee officials about the development and reporting of issues.
This step will better ensure transparency of OIG decision making. As a result of
these actions, OIG reports are more timely, more cost effective, and of improved
quality as measured by the project scorecard.
Measuring the Quality of OIG Reports
The primary goal of OIG reporting, as stated in the FY 2006 quality assurance
report, continues to be to keep the Agency, Administration, and Congress fully
informed of issues impacting EPA programs and EPA's progress in taking action
to correct those issues. Another customer, based on its impact on our budget, is
the Office of Management and Budget.
The Government Auditing Standards (July 2007), paragraph 3.54, states: "The
audit organization should analyze and summarize the results of its monitoring
procedures at least annually, with identification of any systemic issues needing
improvement along with recommendations for corrective action." In developing
our criteria to measure quality, we continue to recognize that customers view
timeliness of our products as very important; therefore, timeliness is a high
quality characteristic. Compliance with generally accepted government auditing
standards is required and, thus, is also a high quality characteristic. With that in
mind, the OIG should strive to consistently provide products that meet specific
quality characteristics and adhere to all applicable standards and OIG policies and
procedures. Accordingly, a measuring process provides a mechanism to evaluate
individual products against specific quality criteria. This process also presents the
information in a manner that allows the OIG to assess trends in quality so that
necessary adjustments can be made to policies, procedures, and activities. The
criteria used in this project to assess quality in OIG reports were:
•	Project cost
•	Documentary reliability of evidence
•	Timeliness in preparing draft reports
•	Readability of reports, including whether the reports are clear, concise,
convincing, logical, and relevant
A scoring form enables the OIG to measure product quality and also serves as a
basis for measuring a manager's performance. The project quality scorecard in
Appendix B shows the specific manner in which points were calculated. The
report quality scoresheet the Office of Congressional and Public Liaison (OCPL)
Publications Unit used to score draft reports during FY 2007 is in Appendix C.
An Inspector General Statement was issued on October 10, 2006, that fully
explained the scoring process and all the criteria in both scoresheets. The OIG
fully implemented this scoring process in FY 2007.
2

-------
Chapter 2
Scoring the Results
The total quality scores, as well as the timeframes and project costs for major OIG
reports, are shown in Table 1. The full titles for each report are in Appendix A.
Reports that were either contracted or contained Confidential Business
Information are not included. Each total quality score is the sum of the two
scoring systems: one for project quality characteristics and the second for report
quality characteristics. Table 2 provides a more detailed description of the
scoring for project quality. Table 3 shows the number of days the OIG took from
the date OIG staff first met with the Agency/auditee to the date of the final report.
The ability to track trends using the OIG project scorecard will improve when all
products being compared have been initiated after the scorecard's
implementation. Some products in Table 1 were begun before the scorecard's
implementation. A higher score reflects a greater extent to which teams met and
documented the criteria measured by the scorecards.
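To illustrate the arithmetic, the short Python sketch below (an illustrative aid only, not part of the OIG's scoring tools) adds a project score and a weighted report score the way the Total Quality Score column in Table 1 does, using the values reported for report 2007-P-00001.

    # Illustrative sketch only: a total quality score in Table 1 is the project
    # score (Table 2) plus the weighted report score from the OCPL scoresheet.
    def total_quality_score(project_score, weighted_report_score):
        return round(project_score + weighted_report_score, 1)

    # Values reported for 2007-P-00001 in Table 1.
    print(total_quality_score(17.3, 6.1))  # 23.4
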
Table 1: Overall Scorecard
Report No. | Staff Days | Project Cost ($000s) | Elapsed Days (Kickoff to OCPL Reviewing Draft) | Elapsed Days (Kickoff to Final Report Date) [Table 3] | Total Project Score [Table 2] | Total Weighted Report Score | Total Quality Score
1st Quarter
2007-P-00001 | 1,301 | $962 | 544 | 666 | 17.3 | 6.1 | 23.4
2007-P-00002 | 253 | $205 | 22 | 77 | 26.7 | 5.5 | 32.2
2007-P-00003 | 403 | $293 | 237 | 348 | 14.5 | 5.9 | 20.4
2007-P-00004 | 546 | $420 | 448 | 544 | 18.0 | 7.2 | 25.2
2007-P-00005 | 641 | $501 | 645 | 797 | 11.0 | 6.5 | 17.5
2007-P-00006 | 690 | $530 | 589 | 747 | 12.0 | 5.9 | 17.9
2007-2-00003 | 371 | $307 | 177 | 244 | 28.3 | 8.5 | 36.8
2007-4-00027 | 524 | $229 | 284 | 402 | 21.4 | 7.5 | 28.9
2007-1-00019 | 3,421 | $2,561 | 213 | 224 | 20.9 | 7.9 | 28.8
2007-4-00019 | 103 | $75 | 102 | 194 | 22.8 | 8.4 | 31.2
2007-4-00026 | 283 | $220 | 297 | 470 | 17.9 | 7.5 | 25.4
2007-1-00001 | 259 | $216 | 492 | 582 | 17.7 | 6.8 | 24.5
2007-4-00034 | 45 | [illegible] | 226 | 247 | 21.1 | 8.7 | 29.8
2nd Quarter
2007-P-00007 | 639 | $466 | 189 | 345 | 27.5 | 4.7 | 32.2
2007-P-00009 | 173 | $361 | 205 | 295 | 23.5 | 7.8 | 31.3
2007-P-00010 | 106 | [illegible] | 117 | 180 | 30.0 | 8.0 | 38.0
2007-P-00011 | 356 | $287 | 195 | 300 | 26.0 | 7.3 | 33.3
2007-P-00012 | 488 | $402 | 156 | 240 | 21.0 | 8.6 | 29.6
3

-------
Report No. | Staff Days | Project Cost ($000s) | Elapsed Days (Kickoff to OCPL Reviewing Draft) | Elapsed Days (Kickoff to Final Report Date) [Table 3] | Total Project Score [Table 2] | Total Weighted Report Score | Total Quality Score
2007-P-00013 | 378 | $229 | 217 | 294 | 24.5 | 6.1 | 30.6
2007-P-00015 | 934 | $108 | 256 | 371 | 24.5 | 8.0 | 32.5
2007-P-00016 | 321 | $255 | 190 | 300 | 26.0 | 8.6 | 34.6
2007-P-00017 | 478 | $356 | 284 | 415 | 27.1 | 7.4 | 34.5
2007-1-00037 | 129 | $107 | 319 | 480 | 15.3 | 5.1 | 20.4
2007-1-00044 | 220 | $135 | 280 | 497 | 17.0 | 7.2 | 24.2
2007-4-00045 | 93 | $221 | 302 | 462 | 36.2 | 6.4 | 42.6
2007-4-00052 | 150 | $330 | 352 | 480 | 21.7 | 7.0 | 28.7
3rd Quarter
2007-P-00021 | 282 | $217 | 177 | 285 | 27.0 | 8.3 | 35.3
2007-P-00022 | 568 | $473 | 352 | 464 | 23.1 | 8.1 | 31.2
2007-P-00023 | 1,220 | $932 | 476 | 833 | 15.0 | 5.4 | 20.4
2007-P-00024 | 341 | $284 | 131 | 215 | 31.4 | 8.7 | 40.1
2007-P-00025 | 139 | $105 | 196 | 274 | 29.0 | 7.7 | 36.7
2007-1-00070 | 329 | $275 | 98 | 160 | 30.7 | 7.2 | 37.9
2007-1-00071 | 299 | $250 | 48 | 160 | 30.3 | 8.2 | 38.5
2007-S-00001 | 164 | $148 | 102 | 182 | 27.3 | 8.3 | 35.6
2007-P-00026 | 728 | $580 | 560 | 678 | 16.5 | 8.2 | 24.7
2007-P-00027 | 542 | $440 | 313 | 495 | 21.0 | 4.8 | 25.8
2007-4-00064 | 27 | $23 | 134 | 144 | 31.6 | 8.9 | 40.5
2007-4-00065 | 318 | $265 | 228 | 301 | 26.1 | 5.9 | 32.0
4th Quarter
2007-B-00002 | 503 | $397 | 144 | 180 | 31.8 | * | 31.8
2007-4-00068 | 247 | $206 | 184 | 287 | 30.7 | 8.8 | 39.5
2007-P-00028 | 430 | $338 | 297 | 400 | 24.1 | 7.0 | 31.1
2007-P-00029 | 313 | $246 | 273 | 399 | 21.8 | 7.3 | 29.1
2007-2-00030 | * | * | 160 | 204 | 22.8 | 8.6 | 31.4
2007-P-00030 | 762 | $637 | 430 | 585 | 25.2 | 8.4 | 33.6
2007-P-00031 | 997 | $783 | 491 | 629 | 20.7 | 8.2 | 28.9
2007-P-00032 | 234 | $185 | 161 | 239 | 27.8 | 8.4 | 36.2
2007-P-00033 | 836 | $684 | 177 | 267 | 29.1 | 6.2 | 35.3
2007-P-00034 | 468 | $375 | 699 | 812 | 3.1 | 6.3 | 9.4
2007-P-00035 | 110 | $135 | 97 | 201 | 29.7 | 8.4 | 38.1
2007-P-00036 | 240 | $189 | 246 | 405 | 25.8 | 8.4 | 34.2
2007-P-00037 | 238 | $198 | 73 | 128 | 28.4 | 8.3 | 36.7
2007-P-00038 | 62 | $55 | 124 | 127 | 27.0 | 8.2 | 35.2
2007-2-00039 | 11 | $9 | 30 | 70 | 29.0 | 8.8 | 37.8
2007-P-00039 | 720 | $545 | 428 | 476 | 22.2 | 7.8 | 30.0
2007-4-00078 | 289 | $241 | 254 | 362 | 23.7 | 8.6 | 32.3
2007-2-00040 | * | * | 162 | 212 | 24.1 | 8.6 | 32.7
2007-P-00040 | 390 | $307 | 253 | 365 | 24.3 | 6.8 | 31.1
2007-P-00041 | 172 | $136 | 188 | 294 | 27.8 | 5.4 | 33.2
Source: FY 2007 OIG Project Quality Scorecards and Report Quality Scoresheets
4

-------
Table 2: Project Quality Scorecard1
Report Number | Planning | Fieldwork | Evidence | Supervision | Draft Report Preparation and Timeliness | Significance | Total Project Score
1st Quarter
2007-P-00001 | 1.0 | 4.0 | 4.0 | 4.3 | 1.0 | 3.0 | 17.3
2007-P-00002 | 3.0 | 4.0 | 4.0 | 3.7 | 9.0 | 3.0 | 26.7
2007-P-00003 | 2.0 | 2.0 | 3.0 | 3.5 | 3.0 | 1.0 | 14.5
2007-P-00004 | 2.0 | 3.0 | 3.0 | 1.0 | 6.0 | 3.0 | 18.0
2007-P-00005 | 1.0 | 3.0 | 3.0 | 1.0 | 1.0 | 2.0 | 11.0
2007-P-00006 | 2.0 | 2.0 | 4.0 | 3.0 | -1.0 | 2.0 | 12.0
2007-2-00003 | 3.0 | 2.0 | 4.0 | 4.3 | 12.0 | 3.0 | 28.3
2007-4-00027 | 3.0 | 2.0 | 4.0 | 3.4 | 6.0 | 3.0 | 21.4
2007-1-00019 | 3.0 | 2.0 | 4.0 | 3.9 | 5.0 | 3.0 | 20.9
2007-4-00019 | 3.0 | 4.0 | 4.0 | 1.8 | 7.0 | 3.0 | 22.8
2007-4-00026 | 3.0 | 2.0 | 4.0 | 2.9 | 3.0 | 3.0 | 17.9
2007-1-00001 | 3.0 | 3.0 | 4.0 | 2.7 | 2.0 | 3.0 | 17.7
2007-4-00034 | 1.0 | 3.0 | 4.0 | 4.1 | 6.0 | 3.0 | 21.1
2nd Quarter
2007-P-00007 | 3.0 | 3.0 | 3.0 | 4.5 | 12.0 | 2.0 | 27.5
2007-P-00009 | 2.0 | 4.0 | 3.0 | 4.5 | 7.0 | 3.0 | 23.5
2007-P-00010 | 3.0 | 4.0 | 4.0 | 5.0 | 12.0 | 2.0 | 30.0
2007-P-00011 | 2.0 | 4.0 | 4.0 | 2.0 | 12.0 | 2.0 | 26.0
2007-P-00012 | 1.0 | 3.0 | 4.0 | 3.0 | 7.0 | 3.0 | 21.0
2007-P-00013 | 3.0 | 4.0 | 4.0 | 3.5 | 7.0 | 3.0 | 24.5
2007-P-00015 | 3.0 | 3.0 | 4.0 | 4.5 | 7.0 | 3.0 | 24.5
2007-P-00016 | 3.0 | 2.0 | 4.0 | 2.0 | 12.0 | 3.0 | 26.0
2007-P-00017 | 2.0 | 4.0 | 4.0 | 4.1 | 11.0 | 2.0 | 27.1
2007-1-00037 | 3.0 | 1.0 | 4.0 | 1.0 | 5.0 | 1.0 | 15.0
2007-1-00044 | 3.0 | 2.0 | 4.0 | 1.0 | 6.0 | 1.0 | 17.0
2007-4-00045 | 3.0 | 3.0 | 4.0 | 4.2 | 12.0 | 10.0 | 36.2
2007-4-00052 | 2.0 | 4.0 | 4.0 | 4.5 | 4.0 | 3.0 | 21.5
3rd Quarter
2007-P-00021 | 2.0 | 4.0 | 4.0 | 2.0 | 13.0 | 2.0 | 27.0
2007-P-00022 | 2.0 | 3.0 | 4.0 | 3.1 | 8.0 | 3.0 | 23.1
2007-P-00023 | 2.0 | 3.0 | 4.0 | 2.5 | 1.5 | 2.0 | 15.0
2007-P-00024 | 3.0 | 3.9 | 3.5 | 5.0 | 13.0 | 3.0 | 31.4
2007-P-00025 | 3.0 | 3.5 | 3.5 | 5.0 | 13.0 | 1.0 | 29.0
2007-1-00070 | 3.0 | 3.4 | 4.0 | 4.8 | 12.5 | 3.0 | 30.7
2007-1-00071 | 3.0 | 3.0 | 4.0 | 4.8 | 12.5 | 3.0 | 30.3
2007-S-00001 | 3.0 | 4.0 | 3.0 | 5.0 | 11.3 | 1.0 | 27.3
1 The specific characteristics in the project scorecard as shown in Appendix B have been combined for the purposes
of presentation in Table 2.
5

-------


Report Number | Planning | Fieldwork | Evidence | Supervision | Draft Report Preparation and Timeliness | Significance | Total Project Score
2007-P-00026 | 1.0 | 4.0 | 3.5 | 5.0 | 1.0 | 2.0 | 16.5
2007-P-00027 | 2.0 | 3.0 | 3.0 | 4.0 | 6.0 | 3.0 | 21.0
2007-4-00064 | 3.0 | 4.0 | 4.0 | 4.6 | 13.0 | 3.0 | 31.6
2007-4-00065 | 2.0 | 3.0 | 4.0 | 4.1 | 7.0 | 6.0 | 26.1
4th Quarter
2007-B-00002 | 3.0 | 4.0 | 4.0 | 4.8 | 13.0 | 3.0 | 31.8
2007-4-00068 | 3.0 | 3.5 | 4.0 | 4.5 | 13.0 | 2.7 | 30.7
2007-P-00028 | 3.0 | 4.0 | 4.0 | 4.1 | 7.0 | 2.0 | 24.1
2007-P-00029 | 2.0 | 4.0 | 4.0 | 3.8 | 7.0 | 1.0 | 21.8
2007-2-00030 | 3.0 | 4.0 | 4.0 | 3.8 | 7.0 | 1.0 | 22.8
2007-P-00030 | 3.0 | 3.5 | 3.0 | 4.7 | 8.0 | 3.0 | 25.2
2007-P-00031 | 3.0 | 3.5 | 4.0 | 4.2 | 3.0 | 3.0 | 20.7
2007-P-00032 | 3.0 | 3.0 | 4.0 | 4.8 | 13.0 | - | 27.8
2007-P-00033 | 3.0 | 4.0 | 4.0 | 3.6 | 12.5 | 2.0 | 29.1
2007-P-00034 | - | 2.0 | 3.0 | 2.1 | -4.0 | - | 3.1
2007-P-00035 | 3.0 | 4.0 | 4.0 | 3.7 | 13.0 | 2.0 | 29.7
2007-P-00036 | 3.0 | 4.0 | 4.0 | 4.8 | 8.0 | 2.0 | 25.8
2007-P-00037 | 3.0 | 4.0 | 4.0 | 4.4 | 12.0 | 1.0 | 28.4
2007-P-00038 | 2.0 | 4.0 | 4.0 | 3.0 | 13.0 | 1.0 | 27.0
2007-2-00039 | 3.0 | 4.0 | 4.0 | 4.0 | 13.0 | 1.0 | 29.0
2007-P-00039 | 3.0 | 3.8 | 4.0 | 4.4 | 4.0 | 3.0 | 22.2
2007-4-00078 | 3.0 | 2.9 | 4.0 | 3.8 | 7.0 | 3.0 | 23.7
2007-2-00040 | 3.0 | 3.0 | 4.0 | 3.1 | 7.0 | 4.0 | 24.1
2007-P-00040 | 3.0 | 4.0 | 4.0 | 4.8 | 7.0 | 1.5 | 24.3
2007-P-00041 | 3.0 | 4.0 | 4.0 | 3.8 | 11.0 | 2.0 | 27.8
Source: OIG Project Quality Scorecards
Table 3: Days From Kickoff to Final Report
No. of Days from Kickoff to Final Report Date | No. of Reports, 1st Quarter | No. of Reports, 2nd Quarter | No. of Reports, 3rd Quarter | No. of Reports, 4th Quarter | Total Reports
Less than 100 Days | 1 | 0 | 0 | 1 | 2
100-199 Days | 1 | 1 | 4 | 3 | 9
200-299 | 3 | 3 | 3 | 7 | 16
300-399 | 1 | 4 | 1 | 3 | 9
400-499 | 2 | 5 | 2 | 3 | 12
500-599 | 2 | 0 | 0 | 1 | 3
600-699 | 1 | 0 | 1 | 1 | 3
700-799 | 2 | 0 | 0 | 0 | 2
800-899 | 0 | 0 | 1 | 1 | 2
Average Days by Quarter | 426 | 358 | 349 | 332 | 363 (Avg. for year)
Source: Analysis of OIG Project Quality Scorecards
6

-------
Positive Trends
Several positive trends occurred during FY 2007. First, as Table 1 shows, the
average cost of an OIG report (excluding the audit of the Agency's financial
statements) decreased from about $333,000 to $315,000 from the 1st to the 4th
quarters. That represents a 5.4-percent decrease. Second, as Table 2 illustrates,
teams' efforts to meet the quality characteristics in the OIG project quality
scorecard improved as the year progressed. The average project score increased
from 19.2 in the 1st quarter to 25.0 in the 4th quarter, a 30-percent improvement.
Teams accomplishing specific quality characteristics more regularly contributed
to the improvement in average project scores from the 1st to the 4th quarter. For
example, a specific quality characteristic in fieldwork is a requirement for the
Director to approve the project guide that describes the project's objectives,
scope, and methodology prior to the entrance conference with the Agency or
auditee. Quality assurance reviews showed that Directors routinely documented
their approval of the project guide prior to the entrance conference. Likewise,
supervisory scores, reflecting the extent to which supervisors timely review
working papers and accept staff responses to reviewer notes, increased from 3.0 in
the 1st quarter to 4.0 in the 4th quarter.
Table 3 shows that from the 1st to the 4th quarter the number of calendar days
from kickoff date with Agency staff to final report date decreased from 426 days
to 332 days. That represents a 22-percent decrease in time to issue a final report.
These statistics are affected in part by quick reaction and early warning reports
issued by the OIG during the fiscal year. These reports show that the OIG is more
timely in providing the Agency with issues needing prompt attention.
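
The percentage changes cited above follow directly from the quarterly averages. The short Python sketch below is an illustrative check only, using the averages stated in this chapter; it is not part of the OIG's scoring tools.

    # Illustrative check of the quarter-over-quarter changes cited above.
    def percent_change(first_quarter, fourth_quarter):
        return (fourth_quarter - first_quarter) / first_quarter * 100

    print(round(percent_change(333, 315), 1))    # about -5.4 (average report cost, $000s)
    print(round(percent_change(19.2, 25.0), 1))  # about 30.2 (average project score)
    print(round(percent_change(426, 332), 1))    # about -22.1 (days, kickoff to final report)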
Agency Accepted High Percentage of Report Recommendations
Adhering to the quality assurance characteristics helps to ensure a high percentage
of OIG recommendations are accepted by the Agency. During FY 2007, OIG
made 147 recommendations in major performance reports. The OIG teams used
the discussion draft report process and draft report process, and held meetings
with Agency officials to discuss and refine proposed recommendations. The
Agency accepted 123 (83.6 percent) of the recommendations. For these 123
recommendations the OIG also concurred with the Agency's proposed actions to
implement them.
In March 2007, the OIG issued Policy Number 5, "OIG Followup Policy." The
purpose of the policy is to conduct and report the results of followup reviews to
the Agency on the status of Agency actions taken on OIG recommendations.
Also, the Agency's Deputy Administrator is now the deciding official on
disagreements between the Agency and the OIG on recommendations. These
actions should help ensure that the Agency continues to accept a high percentage
of OIG recommendations and implements them in a way that has the impact the
OIG intended.
7

-------
Chapter 3
Working Paper Enhancements
The working papers supporting OIG reports now have supervisory review notes
or comments located centrally in the working papers. The comments are
maintained more consistently and, as a result, external reviewers will be able to
consistently determine that supervisory review comments have been resolved
before the OIG report is submitted for independent referencing and a quality
assurance review.
Additionally, working papers now allow reviewers to determine the frequency of
supervisory reviews and clearance of reviewer notes, in accordance with the
guidance issued by the Deputy Inspector General. That guidance, as incorporated
into the Handbook, states:
To help ensure effective supervisory reviews, staff need to prepare
and place working papers in AutoAudit as they gather and develop
evidence.... Reviews of working papers prepared by GS-9s or
below will occur no less than twice monthly and all other working
papers will be reviewed every 30 days.
An analysis of the OIG's supervisory quality characteristics, as measured by the
project scorecard, shows a higher percentage of supervisory scores achieving 4.0.
During the 1st quarter, immediately after the guidance was issued, the average
supervisory score was about 3.0. Only 3 of 12 reports (25 percent) issued had
supervisory scores above a 4.0. During the 4th quarter, the average supervisory
score rose to 4.0. Of 20 reports issued, 11 reports (55 percent) had a supervisory
score above 4.0.
Quality assurance reviews also show that Directors and Project Managers
consistently reviewed the supporting working papers to the indexed copy of the
report and also documented their review comments. Likewise, staff responses
and the clearance by the Director or Project Manager of the review comments
were also documented. Working papers also show that the Directors and the
Project Managers then documented through a formal certification that the report
was supported by sufficient and appropriate evidence.
Quality assurance reviews of projects showed that some aspects in the following
areas still need some further attention. Details follow.
Working Paper Preparation
One area of working paper preparation needing attention is that of maintaining
working papers of reasonable length. Quality assurance reviews noted that
working papers either had more than the results of one work segment or included
8

-------
many emails, documents, and analyses. This can result in working papers of
undue length that impede timely supervisory reviews. On one assignment, the
Director noted that the working paper in AutoAudit was so long it would not
open. Working papers should capture a reasonable amount of work for a specific
work segment as defined in the audit guide. Working papers should not be of
such length that they impede an effective or timely review by the supervisor in
accordance with the guidance in the Handbook.
Additionally, some staff and supervisors interpreted the requirement to review
working papers as extending only to working papers that staff deemed completely
finished. When this occurs, the status of the work on the working papers
may not be reviewed for months. One supervisor noted that incomplete working
papers do not make sense, even though they may remain open for months. As a
result, interviews, analyses, or other evidence may not be timely reviewed by the
supervisor, and reviewer notes may not be timely prepared and addressed by staff.
Issues, including reportable issues that an experienced supervisor can help
identify during fieldwork, are then less likely to be resolved in a timely manner.
During the course of the year, one Director issued additional guidance advising
staff that:
Unless you are waiting for information, work papers should not be
left open as "in-process" for several months. If work papers are
in process for several months, the reviewer should be looking at
them to find out what the problem is. It is helpful to put the
document in edit mode when reviewing, even if the supervisor does
not make comments, so that it is recorded as part of the history.
Work papers need to be broken down into manageable sections
and summaries created. If a workpaper were printed and it were
several pages long, there should be headings or other information
that will assist the reviewer in finding particular information.
That guidance is an example of instructions that can be issued by Directors to
teams at the start of projects. Supervisors should be reviewing the status of all
work and not just working papers that staff have deemed complete. Working
papers should be kept in a state so that if one person leaves the OIG or is placed
on another assignment, another person can readily assume the task of completing
the work. During FY 2007, this issue was discussed with supervisors and staff,
and the above guidance issued by a Director was provided as an example.
Although supervisory scores increased in FY 2007, some Directors did not
selectively review staff working papers during fieldwork other than those
prepared by the Project Manager. Other Directors selectively reviewed certain
working papers of their staff during fieldwork to ensure the effectiveness of the
Program Manager's reviews. The Deputy Inspector General has agreed that
Directors should selectively review working papers during fieldwork to ensure
Project Managers effectively carry out their review of working papers. These
reviews will help ensure that all reportable issues are identified.
9

-------
Directors will also retain flexibility in the extent to which working papers are
reviewed, based on the complexity of the project and the experience of the team.
Recommendation 1: Revise the Handbook to clarify that Project Managers and
Directors are responsible for continually reviewing the status of work and not
just working papers that staff have deemed completed. Directors, in addition to
reviewing the working papers of the Project Manager during fieldwork, should
selectively review other staff working papers to ensure the effectiveness of the
Project Managers' reviews and that all reportable issues have been identified.
Indexing Reports
In some reports indexing was not precise. Project Managers and Directors should
direct staff to more precisely index report statements to supporting documentation.
Also, some report statements were supported on the indexed copy of the report by
a statement that the lack of evidence was negated because the Agency did not take
exception to the statement in the discussion draft report. The fact that the Agency
did not take exception does not mean the OIG has adequate and competent
evidence in the way of documentation, observations, analysis, etc. Agency staff
may well have assumed the OIG had sufficient evidence. This issue was discussed
with Directors and was resolved during FY 2007.
In FY 2007, quality assurance reviews noted that a Director in one OIG office sent
reports to independent referencing and then to the editor. In one report the text
was materially different between the edited version and the indexed version that
was independently referenced. The process used by the Director was modified to
better ensure that the copy submitted for independent referencing did not differ
significantly from the edited version. A Director in another office said the office
supported concurrent processing by the editor and independent referencing when it
made sense to do so. This could increase the risk that statements will be included
in the final report that had not been independently referenced. However, the
Directors said they closely monitor report changes to ensure the text does not differ
significantly between the indexed version of the report and the issued report.
Accordingly, no formal recommendations are needed.
Documenting OIG Decisions on Reportable Issues
During the course of the year, a quality assurance report2 disclosed that the report
for one project was significantly altered before being made public while the report
for another project was not issued at all, due to decisions made by senior OIG
officials. The rationale for the decisions was not fully documented in the OIG
working papers. This created an appearance of unprofessional work and
lessened the credibility of the OIG. In both instances, reports had cleared the existing
quality assurance processes. Therefore, the Handbook should be amended to
ensure decisions by senior OIG officials are fully documented in working papers.
2 EPA OIG Quality Assurance Review of Two Assignments, Report No. 08-A-0074, January 30, 2008.
10

-------
The OIG Office of Counsel initiated steps to determine what additional actions
should be taken when the OIG hires or details an Agency employee into the OIG
who may work on a project that can present a potential impairment. The former
Acting Deputy Inspector General for Planning, Audit and Evaluation, who was
involved in the decisions for both of the above projects, was a former Agency
official who had supervised some Agency staff with program responsibilities
under review by OIG staff on one of these two projects. Paragraph 3.05 of the
generally accepted government auditing standards states:
When auditors use the work of a specialist, auditors should assess
the specialists' ability to perform the work and report results
impartially as it relates to their relationship with the program or
entity under audit. If the specialist's work is impaired, auditors
should not use the work of the specialist.
The Office of Counsel was drafting a checklist to find out more about the type of
work the employee was engaged in, what major projects the person was involved
with, and who the employee's former supervisors were. This will better ensure
that potential impairments can be identified and that the OIG can determine
whether a cautionary memorandum should be issued to the employee.
Recommendation 2: Revise the Handbook to ensure that the OIG determines
the independence of consultants, specialists, former Agency employees hired by
the OIG, and any other Agency staff detailed to the OIG for an assignment.
These determinations must be documented in the working papers.
Recommendation 3: Revise the Handbook to clarify that decisions involving
an assignment's scope, methodology, and reporting of issues by all OIG staff,
including senior OIG officials, be completely documented. Where officials do
not provide such an explanation, the Director will advise the appropriate Assistant
Inspector General or other senior OIG official and request an explanation
regarding the decision. The request should be documented in the working papers.
Documenting Discussions with Agency on Scope and Methodology
Project quality scorecards for FY 2007 assignments showed that teams normally
discussed some aspect of the scope and methodology for assignments during
entrance conferences. However, some teams documented their discussions in
greater detail than others. A best practice observed is the way some teams
described the following information during the entrance conference:
•	Discussion of the project objectives
•	Methodology the team plans to use to answer the objectives
11

-------
•	Information the team anticipates it will need to collect and the sources it
plans to use, unknown sources the team may obtain from the Agency/auditee,
and points of contact
•	Discussion of any potential obstacles for collecting data of which the
Agency may be aware
When teams discuss all of the above at the entrance conference, they reduce the
risk of criticism of the team's methodology and improve customer satisfaction. As
this was noted as a best practice, no formal recommendation is made.
Documenting Regular Meetings with Agency on Issues
The OIG Handbook appropriately calls for regular meetings with Agency officials
to discuss issues under development during fieldwork. These discussions should
be part of a "Communications" section established in the assignment's working
papers. Specifically, the Handbook states:
The team should meet regularly with action officials responsible
for the program or activity to discuss issues under development.
To facilitate open exchange of information the team should provide
a one page point sheet for each issue. The point sheet can follow
the format of the finding outline. However, the point sheet does
not have to have all the elements of a finding fully developed prior
to giving it to officials.
Although these meetings were held according to the scorecards, the detail of
documentation varied, including the extent to which issues were discussed. With
the requirement that a Communications section be established in the working
papers for each assignment, teams need to consistently document these status
meetings and other types of internal briefings. Also, they should ensure point
sheets provided to the Agency for discussion during these regular meetings are
documented in the working papers. This activity will be monitored during
FY 2008 to ensure teams are following the Deputy Inspector General's guidance
for establishing a Communications section and including the proper type of
information. Accordingly, no formal recommendation is made.
12

-------
Chapter 4
Reporting Enhancements
Providing Attribution to Statements and Obtaining Factual Support
In FY 2007, OIG reports continued to use the word "official" when citing lower-
level Agency staff, although the word should be used only for higher-level staff.
When this occurs, the reader is less likely to be able to judge the credibility of the
comments. In other instances, no attribution to statements provided by Agency
staff was given in reports. Thus, the reader could infer the report statements were
derived through analysis, observation, or documentation. In some instances,
statements of program accomplishments were supported not by documentation
but only by oral statements provided by Agency staff. Some teams expressed concern that
attributing by title could impede open discussions with the auditee, especially for
subsequent reviews, since people may be concerned about statements being
attributed to them in a report.
Generally accepted government auditing standards3 note the objectivity of a report
is enhanced when it explicitly states the source of evidence and the assumptions
used in the analysis. To help resolve this issue, the Deputy Inspector General
provided guidance that Agency officials at an SES level can be referred to as
officials. Further, OCPL has proposed language to address the Deputy Inspector
General's criteria and provide additional instructions in the OIG Report
Formatting and Style Guide. The proposed language states:
When citing the source of a statement, identify the individual by title
when possible. When we cannot cite the source by title, refer to an
SES or higher level employee as an "official," and to an employee
below the SES level as a "staff member" or, if appropriate,
"management," or a more general title that conveys the employee's
knowledge of the subject under review (i.e., regional contracting
officer). This will help the reader to judge the credibility of the
statement. As noted in the Yellow Book, the objectivity of a report is
enhanced when the report explicitly states the source of the evidence
and the assumptions used in the analysis.
This proposed guidance should be helpful. Recently, the Deputy Inspector
General also said that accomplishments stated by officials or lower-graded
Agency staff should be supported by documentation or other appropriate and
sufficient evidence. As reports are reviewed in FY 2008, a check will be made to
determine that the above guidance is used uniformly in OIG reports. As a result
of actions initiated to provide guidance, no formal recommendation is needed.
3 GAO Yellow Book, Appendix I, Supplemental Guidance, paragraph A8.02(b)
13

-------
Use of Ambiguous Terms
Several OIG reports use the ambiguous words "some" or "many" when describing
condition statements instead of quantifying. Indexes supporting these statements
did not always show the quantification for these terms in the detailed section of the
finding. Also, the working papers did not reflect that Agency officials had been
informed as to what the terms meant. Unless the terms are defined for the Agency
in exit conferences, the Agency may not respond appropriately to a
recommendation and the issue may not be sufficiently resolved.
OCPL recently drafted language to address this topic in the OIG Report
Formatting and Style Guide. The proposed language states:
Avoid the use of indefinite words such as "some" or "many" when
describing conditions. Specific quantification should be provided to
support our positions. It is acceptable to use such words in an
introductory sentence that is immediately followed by the details, if
including the quantification in the introductory sentence would be
awkward. Such wording should otherwise be kept to a minimum.
OCPL's proposed action should resolve this issue, and no formal recommendation
is needed.
Reports Better Describe Approach for Each Objective
Audit results should be responsive to the audit objectives. In response to last
year's quality assurance report, OCPL revised the OIG Report Formatting and
Style Guide to provide guidance on the discussion of methodology in a report.
OCPL's guidance states: "The methodology should address our general review
approach, such as noting what types of transactions we reviewed, as well as
provide details on the analysis techniques we used (such as statistical sampling)."
The Guide states the report should "describe the review approach by objective"
when appropriate. During FY 2007, Scope and Methodology sections were
clearer as to how each objective was developed. This assists the reader in
determining that evidence obtained by the OIG was sufficient/competent and
relevant to support the finding and recommendations. The actions taken during
FY 2007 resolved this issue, and no formal recommendation is needed.
Visual Aids Show Source of Data
In response to last year's quality assurance report, OCPL revised the Report
Formatting and Style Guide to provide guidance to ensure visual aids, such as
tables and charts, showed the source of data. In FY 2007, OIG reports with
tables, charts, and other visual aids always contained the source of the
information. Because the actions taken during FY 2007 resolved this issue,
no formal recommendation is needed.
14

-------
Chapter 5
Administrative Enhancements
Calculating Project Costs
In the transmittal memorandum that accompanies formal reports, the OIG states
the cost of each assignment. At the outset of FY 2007, quality assurance reviews
noted that on two assignments the project costs were substantially understated in
the transmittal memorandum.
•	For one assignment the cost was understated by about $450,000. This
occurred because the team's Inspector General Operations and Reporting
System (IGOR) codes for the assignment were not properly established
and all costs were not captured.
•	For the other assignment, the cost was understated by about $325,000.
This occurred because staff, including the Director, had not completed all
of their IGOR timesheets, which capture the time that each person spent
on the assignment.
The Deputy Inspector General and responsible Assistant Inspector General took
action to ensure staff completed timesheets and properly calculated costs of
projects as reported in each transmittal memo. Additionally, the OIG has
implemented the Inspector General Enterprise Management System (IGEMS).
The new system should more accurately capture the time staff spend on
assignments and, in turn, project costs. During FY 2008, as the OIG develops
management reports through IGEMS, the calculation of project costs will be
monitored. No formal recommendation is needed at this time.
Entering Performance Measurement and Results System Data
The FY 2006 quality assurance report noted that not all teams had entered the
results from their reports in the OIG's Performance Measurement and Results
System. The OIG took steps to better ensure that teams entered results in this
system. For FY 2007, a test showed that for the 58 major OIG reports reviewed,
all teams had entered the report results into the Performance Measurement and
Results System. Accordingly, no recommendations are made in this quality
assurance report regarding that issue.
15

-------
Appendix A
Scope and Methodology
To perform our review, we received printouts from the OIG Office of Planning, Analysis, and
Results on OIG reports issued, and also reports of time expended on the assignments. We then
reviewed the assignment work papers in the OIG's AutoAudit working paper system and the
final reports using the Project Quality Scorecard (see Appendix B). We also contacted
supervisors as needed on each assignment to obtain additional information. The Project Quality
Scorecard measured each assignment as to evidence rating, timeliness, reviews, report phase,
preliminary research, fieldwork, and finding outlines. The OCPL Publications Unit developed a
Report Quality Scoresheet for Draft Submissions for assessing the quality of draft reports (see
Appendix C), and we reviewed those scoresheets prepared in FY 2007. We believe these
scoresheets can be applied to all OIG assignments in accordance with generally accepted
government auditing standards (i.e., that reports be well written, timely, and have impact). The primary
difference should only be the type of impact. The scorecards should allow for enough variety in
impact quality measurement to cover all of our work.
Our scope covered final performance audit and evaluation reports prepared by the OIG Office of
Audit, Office of Program Evaluation, Office of Mission Systems, and OCPL from October 1,
2006, through September 30, 2007. We did not include Single Audit Act reports, audit reports
performed by the Defense Contract Audit Agency, or other reports where the work was
performed by external auditors. The listing of reports reviewed follows.
Master List of OIG Products Reviewed for FY 2007
Report No. | Subject | Date
2007-P-00001 | EPA's Oversight of the Vehicle Inspection and Maintenance Program Needs Improvement | 10/5/2006
2007-P-00002 | EPA Needs to Plan and Complete a Toxicity Assessment for the Libby Asbestos Cleanup | 12/5/2006
2007-P-00003 | Partnership Programs May Expand EPA's Influence | 11/14/2006
2007-P-00004 | Saving the Chesapeake Bay Watershed Requires Better Coordination of Environmental and Agricultural Resources | 11/20/2006
2007-P-00005 | EPA's Management of Interim Status Permitting Needs Improvement to Ensure Continued Progress | 12/4/2006
2007-P-00006 | EPA Has Improved Five-Year Review Process for Superfund Remedies, But Further Steps Needed | 12/5/2006
2007-2-00003 | Information Concerning Superfund Cooperative Agreements with New York and New Jersey | 10/30/2006
2007-4-00027 | Examination of Financial Management Practices of the National Rural Water Association, Duncan, Oklahoma | 11/30/2006
2007-1-00019 | Audit of EPA's Fiscal 2006 and 2005 Consolidated Financial Statements | 11/15/2006
2007-4-00019 | Ecology and Environment Cost Impact Proposal-Subcontract Administration for Cost Accounting Standard 402 Noncompliance Subcontract Administrator's Labor Charging Practices | 11/2/2006
2007-4-00026 | International City/County Management Association Reported Outlays Under Seven Selected Cooperative Agreements | 11/28/2006
16

-------
Report No. | Subject | Date
2007-1-00001 | Fiscal 2005 and 2004 Financial Statements for the Pesticides Reregistration and Expedited Processing Fund | 10/10/2006
2007-4-00034 | Agreed Upon Procedures Applied To Hurricane Katrina and Rita Task Orders 13, 14, 15 and 16 Under BOA DACW56-02-6-1001 | 12/21/2006
2007-P-00007 | EPA Could Improve Processes for Managing Contractor Systems and Reporting Incidents | 1/11/2007
2007-P-00009 | EPA Relying on Existing Clean Air Act Regulations to Reduce Atmospheric Deposition to the Chesapeake Bay and its Watershed | 2/28/2007
2007-P-00010 | U.S. Chemical Safety and Hazard Investigation Board Should Track Adherence to Closed Recommendations | 3/26/2007
2007-P-00011 | Interagency Agreements to Use Other Agencies' Contracts Need Additional Oversight | 3/27/2007
2007-P-00012 | EPA's Allowing States to Use Bonds to Meet Revolving Fund Match Requirements Reduces Funds Available for Water Projects | 3/29/2007
2007-P-00013 | Performance Track Could Improve Program Design and Management to Ensure Value | 3/29/2007
2007-P-00015 | New Housing Contract for Hurricane Katrina Command Post Reduced Costs but Limited Competition | 3/29/2007
2007-P-00016 | Environmental Justice Concerns and Communication Problems Complicated Cleaning Up Ringwood Mines/Landfill Site | 4/2/2007
2007-P-00017 | EPA Needs to Strengthen Financial Database Security Oversight and Monitor Compliance | 3/29/2007
2007-1-00037 | State of New Hampshire Clean Water State Revolving Fund Program Financial Statements for the Year Ended June 30, 2005 | 2/7/2007
2007-1-00044 | State of New Hampshire Drinking Water State Revolving Fund Program Financial Statements for the Year Ended June 30, 2005 | 2/26/2007
2007-4-00045 | America's Clean Water Foundation Incurred Costs for EPA Assistance Agreements X82835301, X783142301, and X82672301 | 2/20/2007
2007-4-00052 | Ecology & Environment: CFY 2001 Incurred Costs | 3/30/2007
2007-P-00021 | EPA Can Improve Its Managing of Superfund Interagency Agreements with U.S. Army Corps of Engineers | 4/30/2007
2007-P-00022 | Promoting Tribal Success in EPA Programs | 5/3/2007
2007-P-00023 | Better Enforcement Oversight Needed for Major Facilities with Water Discharge Permits in Long-Term Significant Noncompliance | 5/14/2007
2007-P-00024 | Number of and Cost to Award and Manage EPA Earmark Grants, and the Grants' Impact on the Agency's Mission | 5/22/2007
2007-P-00025 | EPA Can Improve Its Oversight of Audit Followup | 5/24/2007
2007-1-00070 | Fiscal Year 2006 and 2005 Financial Statements for the Pesticides Reregistration and Expedited Processing Fund | 5/30/2007
2007-1-00071 | Fiscal Year 2006 and 2005 Financial Statements for the Pesticide Registration Fund | 5/30/2007
2007-S-00001 | U.S. Chemical Safety and Hazard Investigation Board Did Not Adhere to Its Merit Promotion Plan | 6/4/2007
2007-P-00026 | EPA Needs to Take More Action in Implementing Alternative Approaches to Superfund Cleanups | 6/6/2007
2007-P-00027 | Overcoming Obstacles to Measuring Compliance: Practices in Selected Federal Agencies | 6/20/2007
17

-------
Report No. | Subject | Date
2007-4-00065 | The Environmental Careers Organization Reported Outlays for Five EPA Cooperative Agreements | 6/25/2007
2007-4-00064 | Mixed Funding Claim No. 2 Submitted by Morrison & Foerster, LLP on Behalf of U.S. Borax, Incorporated for the Armor Road SF Site, North Kansas City, Missouri | 6/4/2007
2007-B-00002 | Assessment of EPA's Projected Pollutant Reductions Resulting from Enforcement Actions and Settlements | 7/24/2007
2007-4-00068 | Ozone Transport Commission Incurred Costs Under EPA Assistance Agreements XA98379901, OT83098301, XA97318101, and OT83264901 | 7/31/2007
2007-P-00028 | ENERGY STAR Program Can Strengthen Controls Protecting the Integrity of the Label | 8/1/2007
2007-P-00029 | Superfund's Board of Directors Needs to Evaluate Actions to Improve the Superfund Program | 8/1/2007
2007-2-00030 | Excess Federal Funds Drawn on EPA Grant No. XP98838901 Awarded to the City of Huron, South Dakota | 8/1/2007
2007-P-00030 | Improved Management Practices Needed to Increase Use of Exchange Network | 8/20/2007
2007-P-00031 | Development Growth Outpacing Progress in Watershed Efforts to Restore the Chesapeake Bay | 9/10/2007
2007-P-00032 | Federal Facilities in Chesapeake Bay Watershed Generally Comply with Major Clean Water Act Permits | 9/5/2007
2007-P-00033 | Using the Program Assessment Rating Tool as a Management Control Process | 9/12/2007
2007-P-00034 | Complete Assessment Needed to Ensure Rural Texas Community Has Safe Drinking Water | 9/11/2007
2007-P-00035 | EPA Needs to Strengthen Its Privacy Program Management Controls | 9/17/2007
2007-P-00036 | Total Maximum Daily Load Program Needs Better Data and Measures to Demonstrate Environmental Results | 9/19/2007
2007-P-00037 | Progress Made in Improving Use of Federal Supply Schedule Orders, but More Action Needed | 9/20/2007
2007-P-00038 | Decision Needed on Regulating the Cooling Lagoons at the North Anna Power Station | 9/20/2007
2007-2-00039 | Ineligible Federal Funds Drawn on EPA Grant No. XP98284701 Awarded to the City of Middletown, New York | 9/25/2007
2007-P-00039 | Limited Investigation Led to Missed Contamination at Ringwood Superfund Site | 9/25/2007
2007-4-00078 | Cheyenne River Sioux Tribe Outlays Reported Under Five EPA Assistance Agreements | 9/24/2007
2007-2-00040 | Cost and Lobbying Disclosure Issues Under EPA Grant Numbers X98981901 and XP97914901 Awarded to the City of Fallon, Nevada | 9/26/2007
2007-P-00040 | Strategic Agricultural Initiative Needs Revisions to Demonstrate Results | 9/26/2007
2007-P-00041 | Voluntary Programs Could Benefit from Internal Policy Controls and a Systematic Management Approach | 9/25/2007
18

-------
Appendix B
Project Quality Scorecard
The Project Quality Scorecard objectively evaluates the work leading up to the submission of
draft reports to OCPL for review. The scorecard is presented on the following page. Once
received by OCPL, draft reports received in FY 2007 were scored using the OCPL Report
Quality Scoresheet for Draft Submissions (see Appendix C for additional details).
As stated in the current edition of the Government Auditing Standards, evidence may be
categorized as physical, documentary, testimonial, and analytical. The scoring system reflects
the strength of each type of evidence. The following comments are provided to help the reader
better understand how the evidence elements in the Project Quality Scorecard are measured:
•	Physical evidence is obtained by auditors' direct inspection or observation of people,
property, or events. Such evidence may be documented in memoranda, photographs,
drawings, charts, maps, or physical samples.
•	Documentary evidence consists of created information such as letters, contracts,
accounting records, invoices, and management information on performance.
•	Testimonial evidence is obtained through inquiries, interviews, or questionnaires.
•	Analytical evidence includes computations, comparisons, separation of information into
components, and rational arguments.
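
As a rough illustration of how two elements of the scorecard on the following page operate, the Python sketch below averages the evidence points (documentary 4, analytical 3, observation 3, testimonial 1) across multiple conditions, as the scorecard's note directs, and applies the report-phase deduction of one point for each 50 days exceeding 200. The sketch is illustrative only; the example condition list and day count are hypothetical.

    # Illustrative sketch of two scorecard elements; the inputs are hypothetical.
    EVIDENCE_POINTS = {"documentary": 4, "analytical": 3, "observation": 3, "testimonial": 1}

    def evidence_rating(condition_evidence_types):
        # Average the evidence score across all conditions/main facts.
        return sum(EVIDENCE_POINTS[e] for e in condition_evidence_types) / len(condition_evidence_types)

    def report_phase_deduction(days_kickoff_to_draft):
        # One point deducted for each 50 days exceeding 200.
        return max(0, (days_kickoff_to_draft - 200) // 50)

    print(evidence_rating(["documentary", "documentary", "testimonial"]))  # 3.0
    print(report_phase_deduction(320))                                     # 2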
19

-------
Project Quality Scorecard

Background Information
Report Title:
Report #:
Assignment #:
Total IGOR Days:
Total Hours:
Project Cost:
Date of Kickoff:
Date of Entrance Conference:
Date of Draft Report Sent to OCPL for Review:
Date of Draft Report:
Date of Final Report:

Evidence Rating
Evidence supporting the condition/main fact. Note: If there are multiple
conditions/main facts in an audit or evaluation, the score will be determined by
averaging the scores for each condition or main fact.
	Documentary evidence (4 points)
	Analytical (3 points)
	Observation (3 points)
	Testimonial (1 point)

Report Phase
Number of days from kickoff to date draft report sent to OCPL for review
	Subtract: One point for each 50 days exceeding 200

Preliminary Research Guide
Preliminary research guide completed prior to kickoff meeting: Add 1 point

Fieldwork Guide
Fieldwork guide completed prior to entrance conference: Add 1 point

Finding Outlines
Finding outlines completed prior to Message Agreement Meeting: Add 1 point

Total Quality Score

20

-------
Appendix C
Report Quality Scoresheet for Draft Submissions
The then Acting Inspector General directed OCPL in FY 2006 to develop a system to evaluate
the quality of incoming draft reports. This was to include assessing the readability of reports.
Given these parameters, the OCPL Publications Unit created the Report Quality Scoresheet for
Draft Submissions based on existing report requirements and guidance included in the Project
Management Handbook, the Report Formatting and Style Guide, and writing principles taught in
Write to the Point. This Scoresheet was used to score reports in FY 2007.
The Publications Unit assigned point values to the criteria so the total points would equal 90, to
be more easily incorporated into the overall scoring system. There is no direct correlation
between the number of requirements and the number of points possible, so the scoring is
subjective. While some of the elements of the Scoresheet can be objectively evaluated, objective
criteria and tools cannot address all the important elements of reports, such as organization,
structure, clarity, and the ability of the report to communicate the message. Therefore, the
Publications Unit included subjective measures in the Scoresheet to address whether the report
elements are clear, concise, convincing, logical, and relevant, and provide the proper perspective.
The Publications Unit assigned 30 points of the 90 points possible to meet the then Acting
Inspector General's direction to emphasize a readability index. Readability indices are tools that
help determine how readable documents are. The Publications Unit chose the Flesch-Kincaid
Index, similar to the Fog Index, for readability scoring. The formula considers the average
number of words per sentence and the average number of syllables per word. While a good
readability score does not ensure that a document is well written, it is an indicator of the
difficulty a reader will have understanding the message.
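
For reference, the standard Flesch-Kincaid grade-level formula combines those two averages as shown in the Python sketch below. The sketch is illustrative only: the syllable counter is a crude heuristic and the sample sentence is hypothetical, so results will differ slightly from those produced by word processing software.

    import re

    def count_syllables(word):
        # Crude heuristic: count groups of consecutive vowels.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text):
        sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        # Standard Flesch-Kincaid grade-level formula.
        return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59

    sample = "The Agency accepted most of the recommendations. The OIG concurred with the planned actions."
    print(round(flesch_kincaid_grade(sample), 1))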
The Report Quality Scoresheet for Draft Submissions is shown starting on the next page.
21

-------
Report Quality Scoresheet for Draft Submissions
Report Title:
Assignment Number:
Product Line Director:
Project Manager:
Date Received by OCPL:
OCPL Reviewer:
Date Review Completed:
Total Score: XX out of 90
Preliminary Information

Report Cover (2 points possible)
-	Is the cover in the proper format?
-	Is a position taken in the title?
-	Is the assignment number included on the draft?

Inside Cover (1 point possible)
-	Are all abbreviations in the report included in the list?
-	If there is a photo on the cover, is a caption included, with source?

At a Glance (5 points possible)
-	Is it in the proper format and confined to one page?
-	Is the purpose of the report in the "Why...." section?
-	Is there a "Background" section?
-	Is a "snapshot" of findings presented in the "What We Found" section?
-	Are all the objectives addressed in the "What We Found" section?
-	Are recommendations summarized in "What We Recommend" section?

Transmittal Memo (1 point possible)
-	Is it in the proper format?
-	Is the template language used?
-	Are phone and email contacts listed?

Table of Contents (1 point possible)
-	Are the appropriate entries included, in the proper format?

Subtotal: 10 points possible

Remarks:
22

-------
Introductory Information
(usually Chapter 1)

Purpose (3 points possible)
-	Is the purpose stated?

Background (3 points possible)
-	Is detail on what was reviewed provided?
-	Are data provided for perspective (dates, dollars, quantities)?
-	Are the responsible offices noted?

Scope and Methodology (including appendix information) (4 points possible)
-	Is the extent of the work performed to accomplish objectives noted?
-	Is the approach for each objective described?
-	Are the universe and what was reviewed noted?
-	Are the organizations visited and their locations noted?
-	Is the period for when the review began and ended noted?
-	Is the period of transactions covered noted?
-	Are evidence gathering and analysis techniques described?
-	Is review for compliance described, if appropriate?
-	Is a sample design noted?
-	Is the quality of data discussed?
-	Is a Government Auditing Standards statement included?

Prior Coverage (can be part of "Scope and Methodology") (1 point possible)
-	Are the name, number, and date for prior audits provided?
-	If no prior coverage occurred, is that acknowledged?

Internal Control (can be part of "Scope and Methodology") (1 point possible)
-	Is the scope of management control reviews noted?
-	Are applicable management controls identified?
-	Is what was found regarding internal controls noted?
-	If internal controls were not reviewed, is that explained?

Subtotal: 12 points possible

Remarks:
23

-------
Rest of Report

Chapters/Findings (2 points possible)
-	Do chapter and section headings take a position?
-	Is each finding organized as required?

"Charge" Paragraphs (8 points possible)
-	Do they include condition, criteria, cause, and effect?
-	Is the condition presented in the first sentence?
-	Are all the objectives answered?

Condition (3 points possible)
-	Is what was right, wrong, or needing improvement discussed?

Criteria (3 points possible)
-	Are the criteria by which the condition was judged noted?

Cause (3 points possible)
-	Is the underlying reason for the condition identified?

Effect (3 points possible)
-	Is the ultimate effect on public health and the environment noted?
-	Are quantities/potential cost benefits noted, when applicable?

Recommendations (2 points possible)
-	Are they action-oriented (avoiding weak words)?

Status of Recommendations and Potential Monetary Benefits (2 points possible)
-	Is the table provided?
-	Are all elements presented accurately?

Appendices (2 points possible)
-	Are they necessary?
-	Are they clearly presented?
-	Are they referenced in the report?

Subtotal: 28 points possible

Remarks:
24

-------
Overall Formatting, Style, and Readability

Is the Flesch-Kincaid Index 14.0 or lower? (30 points possible)

Does the report follow OIG writing guidance for elements such as active
voice, subject/verb agreement, capitalization, etc.? (5 points possible)

Are the chapters and/or sections properly formatted? (3 points possible)

Are tables/charts/photos properly numbered, labeled, and formatted, and
do they include the source of the information? (2 points possible)

Subtotal: 40 points possible

Remarks:

Total Score
Sections | Points Possible | Points Allowed
Preliminary Information | 10 |
Introductory Information | 12 |
Rest of Report | 28 |
Overall Formatting, Style, and Readability | 40 |
Total | 90 |

25

-------