OFFICE OF INSPECTOR GENERAL
Catalyst for Improving the Environment
Report to the Acting Inspector General
Special Project: Measuring the
Quality of OIG Reports
Report No. 2006-M-00015
September 19, 2006

-------
Report Contributors:
Robert K. Bronstrup
Patrick Gilbride
Office of Congressional and Public Liaison
Abbreviations
AIG      Assistant Inspector General
DCAA     Defense Contract Audit Agency
EPA      U.S. Environmental Protection Agency
GAGAS    Generally Accepted Government Auditing Standards
GAO      Government Accountability Office
GAS      Government Auditing Standards (2003 Revision)
IGEL     Inspector General E-Learning
IGOR     Inspector General Operations and Reports System
OA       Office of Audit
OCPL     Office of Congressional and Public Liaison
OIG      Office of Inspector General
OMB      Office of Management and Budget
OPE      Office of Program Evaluation
QA       Quality Assurance

-------
UNITED STATES ENVIRONMENTAL PROTECTION AGENCY
WASHINGTON, D.C. 20460
OFFICE OF
INSPECTOR GENERAL
September 19, 2006
MEMORANDUM
SUBJECT: Special Project: Measuring the Quality of OIG Reports
Report No. 2006-M-00015
FROM: Robert K. Bronstrup /s/
Director, Central Resource Center
Patrick Gilbride /s/
Director, Western Resource Center
TO:	Bill Roderick
Acting Inspector General
Attached is the final report of our special project on Measuring the Quality of Office of Inspector
General (OIG) reports. Specifically, we tested a process to score specific quality characteristics
of 26 OIG reports issued between October 1, 2005, and March 31, 2006. Also, we made
observations and recommendations to you that we believe will strengthen the audit, evaluation,
and liaison processes. The scoring form we used in our review is included as Appendix A. The
scoring form the Office of Congressional and Public Liaison used to assess draft reports is
included as Appendix B. We explain the specific attributes for which we reviewed OIG reports
in the Scope and Methodology section, which is included as Appendix C.
We received comments from the Assistant Inspectors General (AIGs) and provided them to you.
We also provided to you and to the AIGs a summary of all of the comments received on the
report and the scorecards. We used their comments in preparing the final report and
recommendations and also in revising individual scorecards. If you have any questions about the
final report or our observations and recommendations, please contact Robert Bronstrup at
312-886-7169 or Patrick Gilbride at 303-312-6969.
cc: Acting Deputy Inspector General

-------
Special Project: Measuring the Quality of OIG Reports
Table of Contents
Chapters
1	Introduction		1
Purpose		1
Summary of Results		1
Measuring the Quality of OIG Reports		1
Scoring the Results		2
2	Workpaper Enhancements		6
Documenting Supervisory Review of Workpapers		6
Cross-Referencing Assignment Guide Work Steps
to Supporting Workpapers		8
Enhancing the Independent Referencing Process		9
Approving Preliminary Research and Assignment Guides		11
3	Reporting Enhancements		12
Defining When Reports Should Use the Word "Official"		12
Ensuring Visual Aids Show Source of Data		12
Describing the Approach for Each Objective in
Scope and Methodology		13
4	Administrative Enhancements		15
Ensuring Staff Charge Time in a Uniform Manner		15
Entering Performance Measurement and Results System
(PMRS) Data		17
Appendices
A Project Quality Scorecard		18
B Report Quality Scoresheet for Draft Submissions		21
C Scope and Methodology		26

-------
Chapter 1
Introduction
Purpose
The purpose of this pilot project was to apply a set of experimental criteria to measure quality in
Office of Inspector General (OIG) reports issued by the Office of Audit (OA), Office of Program
Evaluation (OPE), and the Office of Congressional and Public Liaison (OCPL). Measuring the
quality of OIG work is important because it provides data that can be used to identify areas for
improving OIG processes. We developed quality measurement criteria and applied them to 26 EPA
OIG reports issued between October 1, 2005, and March 31, 2006.
We did not include single audit reports, Defense Contract Audit Agency (DCAA) contract audit
reports, or the Audit of EPA's Fiscal 2005 Financial Statements. Should single audits and the
audit of EPA's Financial Statements remain with the OIG, the scoring system will also include
those audits.
Summary of Results
To improve the quality of reports and work processes, the OIG should:
•	Issue an interim policy to ensure supervisory reviewer notes are kept in a central location of
the workpapers.
•	Enhance workpapers to better ensure (1) assignment guides are reviewed and approved,
(2) assignment guides are fully cross-referenced to the workpapers, and (3) the work of the
independent referencer is fully documented in the workpapers.
•	Improve reports by ensuring they (1) identify the specific titles of Agency employees or
others who are cited in OIG reports, and use the term "official" for only SES-level
employees, (2) show the source of information for tables and charts, and (3) specifically
describe the methodology for addressing each objective.
•	Develop a policy that will better ensure staff charge time to their direct assignments and to
indirect job codes in a more uniform manner in order to accurately determine the actual cost
of each assignment.
•	Strengthen its followup process so that the final impact of our work can be better
determined.
Measuring the Quality of OIG Reports
The primary goal of OIG reporting is to keep the Administration and Congress fully informed of
issues impacting EPA programs as well as EPA's progress in taking action to correct those
issues. Other customers, based on their impact on our budget, are the Government
Accountability Office (GAO) and the Office of Management and Budget (OMB). In developing
our criteria to measure quality, we know that these customers view timeliness of our products as
1

-------
very important; therefore, timeliness is a key quality characteristic. Likewise, compliance with
the Generally Accepted Government Auditing Standards (GAGAS), found in Government
Auditing Standards (GAS), is required, and thus is also a key quality characteristic. Further,
potential cost savings and improvements to policy or the environment are other important quality
characteristics for an organization that is a "catalyst for change."
With that in mind, the OIG should strive to consistently provide products that meet specific
quality characteristics and adhere to all applicable standards and OIG policies and procedures.
Accordingly, a measuring process should provide a mechanism to evaluate individual products
against specific quality criteria. The measuring process should also present the information in a
manner that, over time, will allow the OIG to assess trends in quality so that necessary
adjustments can be made to policies, procedures, and activities. The criteria used in this project
to assess quality in OIG reports were:
•	Project cost.
•	Documentary reliability of evidence.
•	Supervision.
•	Timeliness in preparing draft reports.
•	Readability of reports, including whether the reports are clear, concise, convincing,
logical, and relevant.
A scoring form to measure and score these characteristics provides the organization with a
measurement of product quality and also serves as a basis for measuring a manager's
performance. The specific manner in which we calculated points is shown in our project quality
scorecard in Appendix A. The report quality scoresheet the OCPL Publications Unit uses to
score draft reports is in Appendix B.
Scoring the Results
The total quality scores, as well as the timeframes and project costs for the 26 OIG reports, are
shown in Table 1. Each total quality score is the sum of the two scoring systems: one for project
quality characteristics (Table 2) and the second for report quality characteristics (Table 3). We
did not score two of the project quality characteristics. As noted in our observations, supervisory
review notes are not maintained consistently in the workpapers. As a result, we did not score
Supervision for the reports issued. Also, we did not score Significance, because the full impact
for certain reports could not be determined since the reports had just been issued. An Inspector
General Statement will be issued by October 1, 2006, that fully explains the scoring process and
all the criteria in both scoresheets.
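
To illustrate how the two scoring systems combine, the short sketch below (Python, for illustration only; the scoring itself was done on the paper forms in Appendices A and B) reproduces the Table 1 arithmetic for one report, using the figures reported for 2006-P-00001 in Tables 2 and 3.

```python
# Illustrative sketch: total quality score = project score + weighted report score.
project_score = 6.0                          # Total Project Score for 2006-P-00001 (Table 2)
report_score = 50                            # Report Score on the 100-point scoresheet (Table 3)
weighted_report_score = report_score / 10    # Table 3 divides the report score by 10
total_quality_score = project_score + weighted_report_score
print(total_quality_score)                   # 11.0, matching Table 1
```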
In addition to scoring reports using the quality criteria, we are also providing observations on
specific aspects of OIG work activities and processes that can be improved. Our observations
fall into three categories: (1) workpaper enhancements, (2) reporting enhancements, and (3)
administrative enhancements.
2

-------
Table 1. Total Quality Scores

Report No.     | Staff Days | Project Cost ($000s) | Elapsed Days (Kickoff to OCPL Reviewing Draft) | Elapsed Days (Kickoff to Final Report Date) | Total Project Score (Tbl. 2) | Total Weighted Report Score (Tbl. 3) | Total Quality Score
2006-P-00001   | 387        | 302        | 181 | 237 | 6.0 | 5.0 | 11.0
2006-P-00002   | 832        | 694        | 168 | 214 | 6.0 | 4.7 | 10.7
2006-P-00003   | 508        | 397        | 533 | 651 | 1.0 | 4.5 | 5.5
2006-P-00004   | 96         | 80         | 143 | 232 | 6.0 | 5.5 | 11.5
2006-P-00006   | 567        | 443        | 374 | 471 | 3.0 | 4.6 | 7.6
2006-P-00007   | 453        | 367        | 410 | 536 | 4.0 | 7.9 | 11.9
2006-P-00008   | 209        | 169        | 361 | 422 | 5.0 | 8.4 | 13.4
2006-P-00009   | 558        | 436        | 413 | 544 | 3.0 | 5.2 | 8.2
2006-P-00010   | See P-2    | See P-2    | 260 | 320 | 5.0 | 5.2 | 10.2
2006-P-00011   | 428        | 334        | 98  | 125 | 3.0 | 7.0 | 10.0
2006-P-00012   | 103        | 86         | 120 | 161 | 7.0 | 7.9 | 14.9
2006-P-00013   | 1,126      | 879        | 279 | 511 | 4.0 | 4.3 | 8.3
2006-P-00014   | See P-11   | See P-11   | 87  | 146 | 3.0 | 8.4 | 11.4
2006-P-00015   | 663        | 553        | 377 | 495 | 5.0 | 7.2 | 12.2
2006-P-00016   | 896        | 700        | 400 | 546 | 4.0 | 4.9 | 8.9
2006-P-00017   | 631        | 493        | 319 | 440 | 3.0 | 5.5 | 8.5
2006-P-00018   | 240        | 187        | 90  | 145 | 6.0 | 7.3 | 13.3
2006-P-00019   | See P-2    | See P-2    | 320 | 376 | 5.0 | 5.0 | 10.0
2006-P-00020   | See P-2    | See P-2    | 329 | 377 | 5.0 | 4.4 | 9.4
2006-P-00021   | See P-2    | See P-2    | 320 | 378 | 5.0 | 6.1 | 11.1
2006-M-000004  | 284        | 222        | 199 | N/A | N/A | 9.2 | 9.2
2006-01-00018  | 231        | 193        | 512 | 589 | 0.0 | 8.3 | 8.3
2006-01-00021  | 228        | 190        | 199 | 213 | 5.0 | 9.7 | 14.7
2006-01-00024  | See 00018  | See 00018  | 542 | 644 | 0.0 | 9.4 | 9.4
2006-4-00026   | 49         | 41         | 93  | 105 | 7.0 | 8.6 | 15.6
2006-4-00027   | 38         | 32         | 61  | 67  | 7.0 | 9.6 | 16.6
3

-------
Table 2. Project Quality Scorecard

Report Number  | Evidence Rating | Report Timeliness Deduction | Prelim. Res. Guide | Fieldwork Guide | Finding Outlines | Total Project Score
2006-P-00001   | 4   | 0   | N/A* | 1 | 1 | 6.0
2006-P-00002   | 4   | 0   | N/A  | 1 | 1 | 6.0
2006-P-00003   | 4   | -6  | 1    | 1 | 1 | 1.0
2006-P-00004   | 4   | 0   | 0    | 1 | 1 | 6.0
2006-P-00006   | 4   | -3  | 1    | 1 |   | 3.0
2006-P-00007   | 3   | -1  | 0    | 1 | 1 | 4.0
2006-P-00008   | 4   | -2  | 1    | 1 | 1 | 5.0
2006-P-00009   | 3   | -3  | 1    | 1 | 1 | 3.0
2006-P-00010   | 4   | -1  | N/A  | 1 | 1 | 5.0
2006-P-00011   | 3   | 0   | 0    |   |   | 3.0
2006-P-00012   | 4   | 0   | 1    | 1 | 1 | 7.0
2006-P-00013   | 4   | -1  | N/A  |   | 1 | 4.0
2006-P-00014   | 3   | 0   | 0    |   |   | 3.0
2006-P-00015   | 4   | -1  | 1    | 1 |   | 5.0
2006-P-00016   | 4   | -3  | 1    | 1 | 1 | 4.0
2006-P-00017   | 4   | -2  | 0    |   | 1 | 3.0
2006-P-00018   | 4   | 0   | N/A  | 1 | 1 | 6.0
2006-P-00019   | 4   | -1  | N/A  | 1 | 1 | 5.0
2006-P-00020   | 4   | -1  | N/A  | 1 | 1 | 5.0
2006-P-00021   | 4   | -1  | N/A  | 1 | 1 | 5.0
2006-M-000004  | N/A | N/A | 0    |   | 0 | N/A
2006-01-00018  | 4   | -5  | 0    | 1 | 0 | 0.0
2006-01-00021  | 4   | 0   | 0    | 1 | 0 | 5.0
2006-01-00024  | 4   | -5  | 0    | 1 | 0 | 0.0
2006-4-00026   | 4   | 0   | 1    | 1 | 1 | 7.0
2006-4-00027   | 4   | 0   | 1    | 1 | 1 | 7.0

Note: In certain assignments a preliminary research guide was not necessary and N/A is shown.

-------
Table 3. Report Quality Scorecard

Report No.     | Readability Index Grade Level | Readability Score | Complete, Concise, Clear | Report Score | Total Weighted Report Score (Report Score Divided by 10)
2006-P-00001   | 16.9 | 1  | 49 | 50 | 5.0
2006-P-00002   | 23.4 | 0  | 47 | 47 | 4.7
2006-P-00003   | 16.4 | 6  | 39 | 45 | 4.5
2006-P-00004   | 16.4 | 6  | 49 | 55 | 5.5
2006-P-00006   | 16.9 | 1  | 45 | 46 | 4.6
2006-P-00007   | 10.2 | 30 | 49 | 79 | 7.9
2006-P-00008   | 10.7 | 30 | 54 | 84 | 8.4
2006-P-00009   | 15.4 | 16 | 36 | 52 | 5.2
2006-P-00010   | 16.8 | 2  | 50 | 52 | 5.2
2006-P-00011   | 15.3 | 17 | 53 | 70 | 7.0
2006-P-00012   | 13.6 | 30 | 49 | 79 | 7.9
2006-P-00013   | 16.6 | 4  | 39 | 43 | 4.3
2006-P-00014   | 14.5 | 25 | 59 | 84 | 8.4
2006-P-00015   | 14.9 | 21 | 51 | 72 | 7.2
2006-P-00016   | 17.4 | 0  | 49 | 49 | 4.9
2006-P-00017   | 16.7 | 3  | 52 | 55 | 5.5
2006-P-00018   | 14.9 | 21 | 52 | 73 | 7.3
2006-P-00019   | 18.7 | 0  | 50 | 50 | 5.0
2006-P-00020   | 18   | 0  | 44 | 44 | 4.4
2006-P-00021   | 15.7 | 13 | 48 | 61 | 6.1
2006-M-000004  | 14.3 | 27 | 65 | 92 | 9.2
2006-01-00018  | 14.8 | 22 | 61 | 83 | 8.3
2006-01-00024  | 13.5 | 30 | 64 | 94 | 9.4
2006-01-00021  | 12.9 | 30 | 67 | 97 | 9.7
2006-4-00026   | 14.8 | 22 | 64 | 86 | 8.6
2006-4-00027   | 14   | 30 | 66 | 96 | 9.6
2006-S-00001*  | 18.7 | 0  | 67 | 67 | 6.7
*Note that 2006-S-00001, Fiscal Year 2005 Status of EPA's Computer Security Program, is a synopsis of the results of the other
OIG FY 2005 information security audits. We did not score this report on the Project Scorecard and it is not included in Table 1 or
Table 2.
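
The report does not state how the Readability Score column in Table 3 is derived from the Flesch-Kincaid grade level; the scoresheet in Appendix B simply allots 30 points to an index below 14.0. The rows above are, however, consistent with prorating those points at 10 points for each grade level below 17.0, capped between 0 and 30. The sketch below encodes that assumed rule only to spot-check it against a few rows; it is an inference from the table, not documented OIG policy.

```python
# Assumed rule (inferred from Table 3, not stated in the report): 10 readability points
# for each Flesch-Kincaid grade level below 17.0, clipped to the 0-30 range.
def readability_points(grade_level: float) -> int:
    return max(0, min(30, round((17.0 - grade_level) * 10)))

# Spot-check against a few (grade level, reported readability score) pairs from Table 3.
for grade, reported in [(16.9, 1), (15.4, 16), (14.5, 25), (18.7, 0), (10.2, 30)]:
    assert readability_points(grade) == reported
```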

-------
Chapter 2
Workpaper Enhancements
Documenting Supervisory
Review of Workpapers
The workpapers for the 26 OIG reports had
some supervisory review notes or comments
located throughout the workpapers.
However, the comments were not
maintained consistently and they were not
located in one section of the workpapers.
As a result, we could not score the reports
for "Supervision" using the scoring criteria.
We could not determine the frequency of
supervisory reviews in accordance with
proposed guidance provided to us at the beginning of our review by the Acting Inspector
General. The Acting Inspector General stated that supervisory reviews of workpapers prepared
by staff at or below the GS-9 level should occur at least twice a month and all other workpapers
should be reviewed at least once a month. Further, unless supervisory review comments are
maintained in one location of the workpapers, an independent referencer or peer review team
will have difficulty determining whether all supervisory review comments are resolved before
beginning independent referencing of the report.
The current OIG Project Management Handbook (Handbook) provides the following guidance
on supervisory review of workpapers:
All working papers must be reviewed by a member of the team who did not prepare the
working paper. Project or Assignment Managers' review of working papers will be
conducted to the extent necessary for the manager to ensure himself or herself that
working papers comply with standards. Working papers prepared by the Project or
Assignment Manager should be reviewed by an experienced team member or respective
Product Line Director. Evidence of working paper review must be recorded in the
working papers. (January 14, 2005, edition, p. 23)
The Handbook, however, does not require that supervisory review comments be kept in a central
location of the workpapers, nor does it state how frequently reviews should occur.
When supervisors prepare review comments, AutoAudit® does not capture those comments and
maintain them in a central location. AutoAudit® could be enhanced to more easily capture
supervisory review comments and to maintain them in one location of the workpapers. That step
would help independent referencers and peer reviewers determine whether all supervisory review
[Sidebar: Evidence (GAS 2003 Revision)]
§7.48 Sufficient, competent, and relevant evidence is to be obtained to provide a reasonable basis for the auditors' findings and conclusions.
§7.68 Audit documentation serves to (1) provide the principal support for the auditors' report, (2) aid auditors in conducting and supervising the audit, and (3) allow for the review of audit quality. Audit documentation should be appropriately detailed to provide a clear understanding of its purpose and source and the conclusions the auditors reached, and it should be appropriately organized to provide a clear link to the findings, conclusions, and recommendations contained in the audit report.
6

-------
comments were resolved. It would also allow a determination of the frequency of reviews
according to the guidance given by the Acting Inspector General. If workpaper reviews result in
no comments, the reviewer could state that the workpapers were complete and that no additional
work was needed.
To help strengthen the review of OIG workpapers, the OIG should:
Recommendation 1: Issue an interim policy to clarify how to record and maintain
reviewer comments in a central location of the workpaper file for each assignment. The
reviewer comments must contain the dates of review, who performed the review,
disposition, and clearance of the response by the reviewer.
OIG Actions Taken and Planned: Inspector General Statement No. 1 was issued July
27, 2006, and provides direction in response to this recommendation.
All of the requirements of Inspector General Statement No. 1 should be incorporated into
the next revision of the Project Management Handbook.
Recommendation 2: Require that all workpapers be reviewed by the Project Manager
and that the Assignment Manager review all workpapers prepared by the Project Manager.
If the reviewer has no comments, the supervisor should add a short description such as "I
have reviewed the working papers and found them to be satisfactory."
OIG Actions Taken and Planned: Inspector General Statement No. 1 incorporates this
recommendation.
This recommendation does not prevent peer review of workpapers, which is a means by
which team members can stay abreast of ongoing work. However, peer review of
workpapers does not constitute supervisory review.
Recommendation 3: Require that the assignment manager and the product line director
review the workpapers that support the report.
OIG Actions Taken and Planned: The Handbook should be revised to incorporate this
recommendation. Until it is revised, each office should issue instructions to staff to ensure
this recommendation is timely implemented.
Product Line Directors' and Assignment Managers' review of the specific workpapers
supporting the report does not duplicate the responsibility of the independent referencer
whose responsibility is part of the OIG's quality assurance process. The Directors and
Assignment Managers have the specific responsibility as managers on the assignment to
perform reviews of the work performed and reviews of audit documentation supporting
the report. Implementing this recommendation will help to ensure that adequate evidence
supports each of the findings and recommendations in OIG reports.
7

-------
Recommendation 4: Require that all workpapers prepared by staff at the GS-9 or below
grade levels be reviewed no less than twice a month and all other workpapers should be
reviewed at least once a month.
OIG Actions Taken and Planned: This recommendation has been incorporated into
Inspector General Statement No. 1 and will be incorporated into the next revision of the
Handbook.
Recommendation 5: Require all reviewer comments to be resolved before the report is
submitted to the independent referencer.
OIG Actions Taken and Planned: This recommendation has been incorporated into
Inspector General Statement No. 1 and will be incorporated into the next revision to the
Handbook.
Cross-Referencing Assignment Guide Work
Steps to Supporting Workpapers
Our evaluation of the assignment guides for the 26 reports reviewed showed 6 assignments in which not all
of the work steps were indexed back to the workpapers. The work steps may have been deleted as
unnecessary, with staff forgetting to note the rationale in the assignment guide; or the
work steps may have been performed but the appropriate workpapers were not cross-referenced in
the assignment guide. In one instance, several work steps for a recent Katrina review were not
cross-referenced to workpapers in the assignment guide. When we contacted the Director, the
Director explained that the team decided the work was easier to track by EPA region rather than
by objective, which was how the assignment guide was initially set up. As a result, workpapers
were maintained by region and not linked to the objectives and work steps as stated in the
assignment guide. After we notified the Director, the team completed the assignment guide.
Not having all work steps cross-referenced to the workpapers, without some explanation, raises
concern as to whether the work necessary to complete the assignment was performed. The
Handbook states "Any steps omitted from the guide should be approved by the Assignment
Manager." Under "Field Work Conducted" is the statement, "As field work progresses, the team
continually updates finding outlines and maintains the Quality Assurance (QA) Checklist as
various field work activities are completed."
8

-------
Recommendation 6: Amend the Handbook to include the following:
a)	Insert under "Field Work Conducted" language that requires the team to
continually update each assignment guide section with appropriate cross references
to the workpapers as work progresses;
b)	Insert "Have the steps in the assignment guide been fully indexed to the supporting
workpapers or otherwise noted as to why the step has not been completed" into the
QA Checklist;
c)	Insert, as part of the independent referencer's responsibilities, language into
Appendix 7 "Independent Referencing Guidance/Certification Memo" regarding
the need to determine if the guide has been fully indexed to supporting workpapers
(or reasons why the team did not complete steps) prior to undertaking referencing.
In those instances where the assignment guide is incomplete, the referencer will
notify the assignment manager and require completion before referencing begins.
OIG Actions Taken and Planned: The next revision to the Handbook will incorporate
this recommendation. Until the revision is issued each office should issue instructions to
ensure staff implement this recommendation.
Enhancing the Independent Referencing Process
The following areas in the OIG's independent referencing activity could be enhanced:
Grade Level and Independence of Staff Performing Independent Referencing
As noted in the Handbook, independent referencing should be assigned to experienced staff who
have knowledge of the Government Auditing Standards and OIG policies. Specifically, the
Handbook states:
Product Line Directors assign experienced staff to reference draft products. Product Line
Directors should select auditors/program analysts (usually a GS-12 or higher) for the
referencing assignment. The selected auditor or program analyst must possess a high
degree of independence, objectivity, experience, and knowledge of the Government
Auditing Standards and OIG reporting policies and standards.
However, under current OIG promotion guidelines, staff who are hired as GS-9s could be
promoted to GS-12s and then selected to independently reference reports with as little as 2 years'
experience. Also, staff from OA, OPE, and OCPL are assigned to independently reference
reports within their own product lines. Independent referencing should be performed by
individuals with several years of experience who are at a higher grade level and who are
independent of the product line. This will provide additional assurance that the referencers will
have the experience, knowledge, and independence necessary to carry out the independent
referencing. We noted that the Director of Assistance Agreements requires that a GS-14 perform
all of the independent referencing in that product line.
9

-------
Recommendation 7: The OIG should establish a centralized group of experienced
independent referencers separate from OA, OPE, OMS, and OCPL.
OIG Actions Taken and Planned: The OIG is establishing an Office of Quality
Assurance and Inspections which will have the responsibility for implementing this
recommendation.
Documenting the Independent Referencer's Work More Consistently
For the reports we reviewed, the independent referencers documented their comments, and
included their comment sheets in the workpapers. However, we noted that the independent
referencers do not always indicate review and acceptance of the supporting materials as required
by the Handbook. Specifically, Appendix 7, "Independent Referencing Guidance/Certification
Memo" in the Handbook states: "Use a colored pencil for placing tick marks on the document to
indicate verification and satisfaction with the supporting material. For example, place a tick
mark over each figure, date, citation to legal or other reference material, and proper name."
Further, for 11 of 26 reports reviewed, a copy of the indexed version of the report with the
independent referencer's tickmarks could not be found in the workpapers. As a result, there is
uncertainty as to whether each line of text of the indexed copy of the report was properly
referenced. We did not always see an affirmative statement by the independent referencer that
he or she believed the opinions and conclusions in the report were reasonable and consistent with
the facts presented and that the recommendations logically followed from the facts and
conclusions, as required by Appendix 7.
The independent referencer may have printed out a hard copy of the report that was indexed and
complied with the requirement in the referencing guidance, but he/she did not scan in or
otherwise ensure that the indexed version of the report with the referencer's tick marks was
placed in the workpapers. The hard copy of the indexed report the independent referencer used
may still be with the referencer or with the Assignment Manager.
Recommendation 8: Require the indexed copy of the report with the tick-marks of the
independent referencer be kept in AutoAudit®. The independent referencer should include
a statement in his/her comments that the opinions and conclusions in the report are
reasonable and consistent with the facts presented and that recommendations logically
follow from these facts and conclusions.
OIG Actions Taken and Planned: The OIG will incorporate this recommendation into
the next revision of the Handbook. Until the Handbook is revised, each office should
issue instructions to staff to ensure this recommendation is followed. In implementing this
recommendation, teams are not required to scan in the referencer's copy with colored
pencil tick marks. Teams can use a version of the report with electronic tickmarks.
During our review, we noted that teams had developed an electronic method for creating
and capturing tick-marks within AutoAudit® and this is acceptable.
10

-------
Approving Preliminary Research and Assignment Guides
With respect to the Assignment Guide for carrying out the fieldwork, the Handbook states:
In most cases, project guide changes may be approved by the Project Manager. Any
steps omitted from the guide should be approved by the Assignment Manager. The
guide is to be signed by the Product Line Director (or the Project or Assignment Manager
if delegated that authority by the Director). Significant changes to the guide must be
justified and approved in writing by the Director, in consultation with the applicable
Assistant Inspector General.
This paragraph in the Handbook is not clear as to what constitutes significant changes, stating only
that most changes can be approved by the Project Manager (the GS-13 level). Also, the Handbook
does not state when the guide can be signed by the Product Line Director or delegated to the
Assignment Manager. Since OIG reviews can take hundreds of staff days, the assignment guide
is an important document to guide fieldwork. The Director should be responsible for signing the
assignment guide and the circumstances under which the project guide can be changed by the
Assignment/Project Manager should be clarified.
As a best practice, we observed in one assignment that the Product Line Director signed and
dated the front page of the Assignment Guide and then scanned it back into AutoAudit®. This
step showed the guide was in place before the entrance conference, as required by the Handbook.
Recommendation 9: Clarify the Handbook regarding what constitutes significant changes
and when approvals and subsequent changes to the assignment guide should be made by the
Assignment Manager and Product Line Director.
OIG Actions Taken and Planned: The Handbook will be revised to clarify
responsibilities, and indicate that significant changes include dropping an objective or
deciding to implement a vastly different approach to accomplishing the objective.
Until the Handbook is revised, we recommend that each office issue instructions to staff to
ensure this recommendation is timely implemented.
11

-------
Chapter 3
Reporting Enhancements
Defining When Reports Should Use the Word "Official"
OIG reports do not always identify the title of the individual providing comments. Instead,
reports use the word "official" even for lower-level Agency personnel. When "official" is used
frequently in a report, the reader has difficulty judging the credibility of the comments.
A more reasonable approach is to use the titles of the Agency employees who are providing
comments in OIG reports. As stated in a U.S. Government Accountability Office Report Style
Manual:
In the body of the report we normally identify the official by title making the comments
so that the reader of the report will be in a position to judge the credibility of the
comments.
The Handbook and the OIG Report Formatting and Style Guide do not specifically address this
issue.
Recommendation 10: The Handbook and Report Formatting and Style Guide should be
revised to reflect these following concepts and managers should ensure the guidance is
followed when preparing reports:
a.	Use the word "official" to represent SES or higher level employees when the specific
title of the individual providing comments cannot be used in OIG reports.
b.	For employees below the SES level, when reports cannot refer to their title, the
employee should be referred to as a staff member of a specific office or division.
OIG Actions Taken and Planned: The OIG will incorporate this recommendation into
the next revision of the Handbook and the Report Formatting and Style Guide. Until it is
revised, each office should issue instructions to staff to ensure the recommendation is
timely implemented.
Ensuring Visual Aids Show Source of Data
We noted examples in OIG reports of tables, charts, and other visual aids that do not contain the
source of the information. As a result, the reader has difficulty assessing the source of
information provided in the visual aid. As a best practice, when visual aids such as tables and
charts are used in reports, the source should be named either in the text or in a credit line,
in small type, just below the illustrations. For example, if the table or chart is OIG-constructed,
it should be identified as such and an explanation provided as to where the data originated. The
12

-------
OIG Report Formatting and Style Guide and Report Quality Scoresheet for Draft Submissions do
not specifically address this issue.
Recommendation 11: OCPL should revise the Report Formatting and Style Guide and
Report Quality Scoresheet for Draft Submissions to require that the source of information
for all tables, charts, graphs, or other visual aids be clearly stated either in the report or in
the visual aid. Editors should check to ensure the source is provided during the editing
process.
OIG Actions Taken and Planned: OCPL will make changes to the Report Formatting
and Style Guide and to the Report Quality Scoresheet for Draft Submissions incorporating
this recommendation. Each office should also issue instructions to their staff to help
ensure this recommendation is timely implemented.
Describing the Approach for Each Objective in
Scope and Methodology
Audit results should be responsive to the audit objectives. Accordingly, the report should
describe in the Scope and Methodology section how each objective was addressed. We noted
that one report was very clear in its Scope and Methodology section as to how each
objective was addressed. For each objective, a paragraph began with the phrase "in order to
determine how OECA (objective 1 stated) we (then the report provides a description of the
comparison, analysis, or interviews made)." This approach assists the reader in determining that
the evidence obtained by the OIG was sufficient, competent, and relevant to support the findings and
recommendations. In other reports, how each objective was addressed was not
clearly described.
Currently, the OIG's Report Formatting and Style Guide discusses methodology and states "the
methodology should address our general review approach, such as noting what types of
transactions we reviewed, as well as provide details on the analysis techniques we used (such as
statistical sampling)." The Guide should be revised to state that the review
approach should be discussed by objective where feasible, to assist the reader in judging the approach
and whether the approach results in sufficient, competent, and relevant evidence to support
the findings.
13

-------
Recommendation 12: Amend the OIG's Report Formatting and Style Guide to instruct
staff to describe how each objective was addressed in the report's Scope and Methodology
section. Editors should also ensure the report clearly describes how each objective was
addressed. To avoid redundancy, reports should only list once those steps that address all
objectives.
OIG Actions Taken and Planned: OCPL will revise the Report Formatting and Style
Guide to state that the report should "describe the review approach by objective." To
avoid redundancy, the guide will direct writers to only list once steps that address all
objectives.
Each office should issue instructions to staff to help ensure timely implementation of the
recommendation until the OCPL Guide is revised.
14

-------
Chapter 4
Administrative Enhancements
Ensuring Staff Charge Time in a Uniform Manner
Over the past several years, one of the OIG's goals was to have professional staff charge 1,600
of the 2,087 work hours per year to direct time (specific assignments), or about 77 percent. The
remaining hours (about 500) were to be used for indirect time such as audit planning, training,
and sick leave.
In a March 1, 2006 email, OPE staff were directed to use newly created IGOR codes for
planning and other indirect charges as follows:
•	Planning: "to be used for project work before initiation of preliminary research. Your
Assignment Manager and/or Product Line Director will notify you when this code is to be
used."
•	Training: "to be used when you are in training unless that training is specific to a project
you are working on. So, for example, if you are doing the Data Mining module in IGEL,
you should use this code, but if you are attending a conference on small drinking water
systems for a job on small drinking water systems, you should charge your time in IGOR
against the project code."
•	Management: "to be used for management activities such as creating an Individual
Development Plan, staff development, or other kinds of management activities."
•	Supervision: "is only for the Product Line Directors and Assignment Managers. Use this
code for activities such as writing PERFORMS or giving performance feedback."
The former Acting AIG for OPE stated that, previously, the OIG had a lot of codes, some codes
were duplicative, and field staff had different codes or could create codes. She noted that there is
no OIG policy on IGOR codes and how staff should charge their time (direct or indirect). She
added that there is still a presumption that offices continue to develop their own approaches.
As a result, in March 2006, OPE managers established the four codes described above to capture
indirect time as well as direct time associated with assignments before a specific IGOR assignment
code is established. With respect to the planning code, the former Acting Director of OPE stated:
...the "planning" code captures direct time associated with assignment work because
it includes planning and research associated with assignments where we haven't set
up an IGOR code. We don't set up "direct" IGOR codes for assignments until
notification memos go out. In my area, all staff time prior to getting that notification
memo sent out, that involves planning and research for the new assignment is charged to
planning. So not all planning charges are indirect.
15

-------
In one Resource Center, a review of 6 pay periods (Pay Periods 12-17) showed a fairly high
percentage of time charged by 6 of 15 OPE employees to the planning and management indirect
codes:
Table 4. Time Charged by Six OPE Employees to the Planning and Management Indirect Codes

Employee   | Hours Charged | Percent of Total Time (480 hours)
Employee 1 | 99            | 20%
Employee 2 | 183           | 38%
Employee 3 | 86            | 18% (GS-14 Assignment Manager)
Employee 4 | 181           | 38%
Employee 5 | 217           | 45%
Employee 6 | 128           | 27%
In certain instances, staff may be performing general research and using the planning code is
completely appropriate. However, as noted above, direct planning time associated with specific
assignments that is charged to the planning code will not be charged to the IGOR code
established when the notification letter goes out. Therefore the actual cost of the assignment will
not be captured unless both codes are combined. Combining the two codes will also allow the
OIG to determine whether staff met the goal of 1,600 hours on assignments. Finally, the term
"other management activities" under management is ambiguous and needs to be better defined.
We found that OA did not have the exact same definitions for IGOR codes as OPE. We noted
that one OA employee in the same Resource Center for the same recent 6 pay periods charged 52
percent (250 hours) of indirect time to an IGOR code titled "administrative activities." The
auditor likewise explained that the time included preliminary research activities on two or three
assignments, online training, and other general research on potential audit issues.
Recommendation 13: OIG should issue a formal policy to (a) standardize the use and
definition of certain administrative/indirect time codes to better assess efficiency and the
true cost of operations; (b) formalize any time goals regarding time charging by staff; and
(c) ensure managers review time charges by staff for accuracy.
OIG Actions Taken and Planned: Various offices have worked on establishing a
uniform set of codes for capturing time and on ensuring managers have access to the time charges
of their staff.
The Acting Inspector General has asked the Office of Quality Assurance and Inspections
to develop a time policy in coordination with all OIG offices. That policy and any
associated goals for staff on charging time should be issued so that offices can implement
the policy beginning with FY 07. As noted in the tables, certain assignments can generate
more than one report. The policy will also address when teams should establish separate
job codes for each report, where appropriate, to enhance the OIG's ability to capture
relevant costs associated with specific reports.
16

-------
Entering Performance Measurement and Results System
(PMRS) Data
During our review, we noted that data were accurately entered, with four exceptions. For one
report, cost efficiencies of about $800 million were claimed and entered into PMRS. The final
report showed about $500 million in cost efficiencies. On two other assignments, involving
State Revolving Fund audits, the entries had not been made into PMRS. When contacted, the
audit teams for these three assignments made proper entries into PMRS. Finally, for one report,
PMRS showed the results for another report, and we have contacted the Director.
For 11 of the 26 assignments, the QA Checklist was not completed for the Post Reporting
section, which asks whether results were entered into PMRS. Each Director should check and
ensure the QA Checklist is completed for Post Reporting. This step will help ensure accurate
entries are made in PMRS for all assignments and that the full impact of OIG work is captured.
Recommendation 14: Require each Director to review the QA Checklist at the end of
each assignment to ensure the QA checklist has been fully completed, including the
section for Post Reporting.
OIG Actions Taken and Planned: The next revision to the Project Management
Handbook will incorporate this recommendation. Until it is revised, each office should
issue instructions to their staff to ensure the timely implementation of this
recommendation.
17

-------
Appendix A
Project Quality Scorecard
The project quality scorecard objectively evaluates the work activities that lead to the draft
reports submitted to OCPL for review. Once received by OCPL, the draft report is
scored using the OCPL Report Quality Scoresheet for Draft Submissions. Additional
information on that scoresheet is provided in Appendix B.
The following comments are provided to help the reader better understand how the elements in
the Project Quality Scorecard are measured:
Evidence
As stated in Section 7.50 of Government Auditing Standards, evidence may be categorized as
physical, documentary, testimonial, and analytical. The scoring system reflects the strength of
each type of evidence.
Physical evidence is obtained by auditors' direct inspection or observation of people, property, or
events. Such evidence may be documented in memoranda, photographs, drawings, charts, maps,
or physical samples.
Documentary evidence consists of created information such as letters, contracts, accounting
records, invoices, and management information on performance.
Testimonial evidence is obtained through inquiries, interviews, or questionnaires.
Analytical evidence includes computations, comparisons, separation of information into
components, and rational arguments.
18

-------
Project Quality Scorecard

Background Information
Report Title:
Report #:                                Date of Kickoff:
Assignment #:                            Date of Entrance Conference:
Total IGOR Days:                         Date of Draft Report Sent to OCPL for Review:
Total Hours:                             Date of Draft Report:
Project Cost:                            Date of Final Report:

Significance Rating
•	Monetary benefits (each $1 million = 1 point)
•	Recommendations to change EPA policy or regulation (each 1 = 1 point)
•	Recommendations to implement new EPA policy or regulation (each 1 = 1 point)
•	Specifically answer a customer request (each 1 = 1 point)
•	Recommendation to Congress (each 1 = 1 point)

Evidence Rating
Evidence supporting the condition/main fact. Note: If there are multiple conditions/main facts in an audit or evaluation, the score will be determined by averaging the scores for each condition or main fact.
•	Documentary evidence (4 points)
•	Analytical (3 points)
•	Observation (3 points)
•	Testimonial (1 point)

Supervision Rating and Reviews
Note: The rating assigned to review comments/disposition and to the number of supervisory reviews are added for a net supervision score as follows:
Reviewer notes (compute score using the following steps):
  a. Number of reviewer notes
  b. Number with comment, response, and acceptance by reviewer (b/a)
  c. Percentage x 100 divided by 4 = Supervision points for Reviewer Notes Score
Supervisory reviews (compute score using the following steps):
  A. Identify number and grades of staff working on the audit/evaluation (name and grade level for each team member)
  B. Compute number of months of field work (date of kickoff to date of message agreement meeting)
  C. Number of supervisory reviews that should occur (GS 5-9: 2 reviews a month; GS 11-13: 1 review a month), and number that occurred
  D. Percentage of required reviews accomplished for all grade levels
  E. Percentage of required reviews accomplished x 100 divided by 4 = Points for Supervisory Reviews
  F. Sum of Reviewer Notes and Supervisory Reviews = Supervision rating points (N/A*)

Report Phase
Number of days from kickoff to date draft report sent to OCPL for review
Subtract: One point for each 50 days exceeding 200

Preliminary Research Guide
Preliminary research guide completed prior to kickoff meeting: Add 1 point

Fieldwork Guide
Fieldwork guide completed prior to entrance conference: Add 1 point

Finding Outlines
Finding outlines completed prior to Message Agreement Meeting: Add 1 point

Total Quality Score
Note: The score for supervision will be given a "weighted" score in future assignments when
supervision can be measured.
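
For readers who want the scorecard arithmetic in one place, the sketch below restates the form's stated rules in Python; it is illustrative only, uses the point values shown on the form above, and omits the Significance and Supervision elements, which were not scored in this project.

```python
def timeliness_deduction(days_kickoff_to_ocpl_draft: int) -> int:
    """Form rule: subtract one point for each 50 days exceeding 200."""
    return max(0, (days_kickoff_to_ocpl_draft - 200) // 50)

def project_quality_score(evidence_points: int,
                          days_kickoff_to_ocpl_draft: int,
                          prelim_guide_before_kickoff: bool,
                          fieldwork_guide_before_entrance: bool,
                          finding_outlines_before_message_agreement: bool) -> float:
    # Evidence rating: 4 points documentary, 3 analytical or observation, 1 testimonial
    # (averaged across conditions when there is more than one).
    score = float(evidence_points)
    score -= timeliness_deduction(days_kickoff_to_ocpl_draft)
    score += 1 if prelim_guide_before_kickoff else 0
    score += 1 if fieldwork_guide_before_entrance else 0
    score += 1 if finding_outlines_before_message_agreement else 0
    return score

# Example: documentary evidence, 181 days to OCPL, and all guides/outlines in place on time.
print(project_quality_score(4, 181, True, True, True))   # 7.0
```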
20

-------
Appendix B
Report Quality Scoresheet for Draft Submissions
The Acting Inspector General directed OCPL to develop a system to evaluate the quality of
incoming draft reports, provided OCPL with a Navy Audit report scoresheet, and directed
OCPL to include readability in the scoring. Given these parameters, the OCPL Publications Unit
created the Report Quality Score Sheet for Draft Submissions based on existing report
requirements and guidance included in the Project Management Handbook, the Report
Formatting and Style Guide, and writing principles taught in Write to the Point©. Once we
implement this scoring process, the Publications Unit will score incoming drafts during the
editing process.
While some of the elements of the Scoresheet can be objectively evaluated, objective criteria and
tools cannot address all the important elements of reports, such as organization, structure, clarity,
and the ability of the report to communicate the message. Therefore, the Publications Unit
included subjective measures in the Scoresheet to address whether the report elements are clear,
concise, convincing, logical, and relevant, and provide the proper perspective.
The Publications Unit assigned point values to the criteria so the total points would equal 100, to
be more easily incorporated into the overall scoring system. There is no direct correlation
between the number of requirements and the number of points possible, so the scoring is
subjective.
The Publications Unit assigned 30 points of the 100 points possible to meet the Acting Inspector
General's direction to emphasize the readability index. Readability indices are tools that help
determine how readable documents are. The Publications Unit chose the Flesch-Kincaid Index,
similar to the Fog Index, for three reasons: the index uses a fairly simple formula; the Federal
Government frequently uses the index; and it was developed based on adult training manuals, not
school textbooks. The formula considers the average number of words per sentence and the
average number of syllables per word.
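
For reference, the sketch below shows the published Flesch-Kincaid Grade Level formula that this paragraph describes; it is the general formula, not an excerpt from OIG guidance or the Publications Unit's tooling.

```python
def flesch_kincaid_grade_level(total_words: int, total_sentences: int, total_syllables: int) -> float:
    """Published Flesch-Kincaid Grade Level formula; higher values indicate harder-to-read text."""
    words_per_sentence = total_words / total_sentences
    syllables_per_word = total_syllables / total_words
    return 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59

# Example: 20 words per sentence and 1.7 syllables per word yield roughly grade level 12.3.
print(round(flesch_kincaid_grade_level(2000, 100, 3400), 1))
```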
The Publications Unit selects a portion of text from the At a Glance, the Introduction, and a
Finding Chapter. In cases where a report may not have all these sections, the Publications Unit
improvises. If the team writes consistently throughout a report, following the existing report
requirements, guidance, and templates, there should be no concern about bias in the text selected.
While a good readability score does not ensure that a document is well written, it is an indicator
of the difficulty a reader will have understanding our message. Regardless of how complex an
assignment is, we need to explain our message in a manner that the uninformed reader will easily
understand. However, we took the technical nature of our reports into consideration by
proposing an educational grade level of 14 as our goal, which is equivalent to the New York
Times, as opposed to an educational grade level of 10, which is equivalent to Time or Newsweek
magazines. To improve readability and achieve a good score, writing teams need only write
shorter sentences, mix shorter sentences with longer ones, avoid words with several syllables,
and use plain language.
21

-------
Report Quality Scoresheet for Draft Submissions

Report Title:
Assignment Number:
Product Line Director:
Assignment/Project Manager:
Reviewer:
Review Date:
Total Score: 100

(For each element below, the points possible are shown in parentheses; the reviewer records the points earned.)

Preliminary Information

Report Cover (2 points possible)
-	Is the cover in the proper format?
-	Is the report title sufficiently descriptive yet concise?
-	Is a position taken in the title?
-	Is the assignment number included on the draft?

Inside Cover (1 point possible)
-	Are all abbreviations in the report included in the list?
-	If there is a photo on the cover, is a caption included, with source?

At a Glance (5 points possible)
-	Is it in the proper format and confined to one page?
-	Is the purpose of the report in the "Why...." section?
-	Is the necessary perspective presented in the "Background" section?
-	Is a "snapshot" of findings presented in the "What We Found" section?
-	Are all the objectives addressed in the "What We Found" section?
-	Are recommendations summarized in "What We Recommend" section?

Transmittal Memo (1 point possible)
-	Is it in the proper format?
-	Is the template language used?
-	Are phone and e-mail contacts listed?

Table of Contents (1 point possible)
-	Are the appropriate entries included, in the proper format?

Subtotal: 10 points possible
Remarks:
22

-------
Introductory Information
(usually Chapter 1)

Purpose (3 points possible)
-	Are the objectives clearly and concisely presented?

Background (3 points possible)
-	Is sufficient yet concise detail on what was reviewed provided?
-	Are data provided for perspective (dates, dollars, quantities)?
-	Are the responsible offices noted?

Scope and Methodology (including appendix information) (4 points possible)
-	Is the extent of the work performed to accomplish objectives noted?
-	Are the universe and what was reviewed noted?
-	Are the organizations visited and their locations noted?
-	Is the period for when the review began and ended noted?
-	Is the period of transactions covered noted?
-	Are evidence gathering and analysis techniques described?
-	Is review for compliance described, if appropriate?
-	Is a sample design noted?
-	Is the quality of data discussed?
-	Is a Government Auditing Standards statement included?

Prior Coverage (can be part of "Scope and Methodology") (1 point possible)
-	Are the name, number, and date for prior audits provided?
-	If no prior coverage occurred, is that acknowledged?

Internal Control (can be part of "Scope and Methodology") (1 point possible)
-	Is the scope of management control reviews noted?
-	Are applicable management controls identified?
-	Is what was found regarding internal controls noted?
-	If internal controls were not reviewed, is that explained?

Subtotal: 12 points possible
Remarks:
23

-------
Rest of Report

Chapters/Findings (8 points possible)
-	Do chapter and section headings take a position and make sense?
-	Is each finding organized clearly and logically?
-	Are results and conclusions logical and concise?

"Charge" Paragraphs (8 points possible)
-	Is the charge paragraph for each chapter a reasonable length?
-	Do they include condition, criteria, cause, and effect?
-	Is the condition presented in the first sentence?
-	Are all the objectives answered?
-	Are the main points clear and concise?

Condition (3 points possible)
-	Is what was right, wrong, or needing improvement adequately discussed?

Criteria (3 points possible)
-	Are the criteria by which the condition was judged noted?

Cause (3 points possible)
-	Is the underlying reason for the condition identified?

Effect (3 points possible)
-	Is the ultimate effect on public health and the environment noted?
-	Are quantities/potential cost benefits noted, when applicable?

Recommendations (6 points possible)
-	Are they action-oriented (avoiding weak words)?
-	Do they address the underlying causes and weaknesses?
-	Do they flow logically from the findings?

Status of Recommendations and Potential Monetary Benefits (2 points possible)
-	If there are any recommendations, is the table provided?
-	Are all elements presented accurately?

Appendices (2 points possible)
-	Are they necessary?
-	Are they clearly presented?
-	Are they referenced in the report?

Subtotal: 38 points possible
Remarks:
24

-------
Overall Formatting, Style, and Readability

-	Is the Flesch-Kincaid Index lower than 14.0? (30 points possible)
-	Does the report follow grammar rules and OIG writing guidance for elements such as active voice, subject/verb agreement, capitalization, etc.? (5 points possible)
-	Are the chapters and/or sections properly formatted? (3 points possible)
-	Are tables/charts/photos properly numbered, labeled, and formatted? (2 points possible)

Subtotal: 40 points possible
Remarks:

Total Score

Sections                                    | Points Possible | Points Allowed
Preliminary Information                     | 10              |
Introductory Information                    | 12              |
Rest of Report                              | 38              |
Overall Formatting, Style, and Readability  | 40              |
Total                                       | 100             |

25

-------
Appendix C
Scope and Methodology
To perform our review, we obtained from the Office of Planning, Analysis, and Results printouts
of OIG reports issued, as well as reports of time expended on the assignments. We then reviewed
the assignment workpapers in the OIG's AutoAudit® workpaper system and the final reports
using the Scoring Form attached as Appendix A. We also contacted supervisors as needed on
each assignment to obtain additional information. The Scoring Form measured each assignment
as to Significance, Evidence Rating, Supervision Rating and Reviews, Report Phase, Preliminary
Research, Fieldwork, and Finding Outlines. The OCPL Publication Unit developed a Report
Quality Scoresheet for Draft Submissions for assessing the quality of draft reports. We believe
these scorecards can be applied to all OIG assignments in accordance with GAGAS
(be well written, timely, and have impact). The primary difference should be only in the type of
impact. The scorecards should allow for enough variety in impact quality measurement to cover
all of our work.
Accordingly, our scope covered final reports issued by OA, OPE, and OCPL from October 1,
2005, through March 31, 2006. We did not include single audit reports, DCAA contract audit
reports, or other reports where the work was performed by external auditors. This project did not
include the Audit of EPA's Fiscal 2005 Financial Statements. We did not attempt to re-verify
the evidence supporting the report from an independent referencer perspective.
We did not score the "Significance" of the assignment unless the report clearly demonstrated that
the Agency had either fully implemented the recommendation or responded to a customer
request. Because of the manner in which workpaper review notes were maintained in the
workpapers, we were unable to score the Supervision Rating and frequency of supervisory
reviews.
Master List of OIG Products Reviewed for this Project
1.	Rulemaking on Solvent Contaminated Industrial Wipes (OPE), Report No.
2006-P-00001, October 4, 2005.
2.	EPA Could Improve Its Information Security by Strengthening Verification and
Validation Processes (OA), Report No. 2006-P-00002, October 17, 2005.
3.	Changes Needed to Improve Public Confidence in EPA's Implementation of the Food
Quality Protection Act (OPE), Report No. 2006-P-00003, October 19, 2005.
4.	Ecology and Environment, Inc., Needs to Improve Information
Technology General Controls (OA), Report No. 2006-P-00004, November 22, 2005.
Report No. 2006-P-00005 - We did not score this report because the work was performed by
outside auditors (KPMG).
26

-------
5.	EPA Performance Measures Do Not Effectively Track Compliance Outcomes (OPE),
Report No. 2006-P-00006, December 15, 2005.
6.	More Information Is Needed On Toxaphene Degradation Products (OCPL), Report
No. 2006-P-00007, December 16, 2005.
7.	Review of Complaint on the University of Nevada, Reno, Regional Environmental
Monitoring and Assessment Program Cooperative Agreement CR 826293-01 (OCPL),
Report No. 2006-P-00008, December 28, 2005.
8.	Opportunities to Improve Data Quality and Children's Health through the Food
Quality Protection Act (OPE), Report No. 2006-P-00009, January 10, 2006.
9.	Information Security Series: Security Practices - Integrated Contract Management
System (OA), Report No. 2006-P-00010, January 31, 2006.
10.	EPA's and Mississippi's Efforts to Assess and Restore Public Drinking Water Supplies
After Hurricane Katrina (OPE), Report No. 2006-P-00011, February 14, 2006.
11.	Office of Underground Storage Tanks Has Improved Contract Administration, But
Further Action is Needed (OA), Report No. 2006-P-00012, February 28, 2006.
12.	EPA Can Better Manage Superfund Resources (OPE), Report No. 2006-P-00013,
February 28, 2006.
13.	EPA's and Louisiana's Efforts to Assess and Restore Public Drinking Water Systems
after Hurricane Katrina (OPE), Report No. 2006-P-00014, March 7, 2006.
14.	EPA Office of Air and Radiation and Office of Water Can Further Limit Use of Level
of Effort Contracts (OA), Report No. 2006-P-00015, March 14, 2006.
15.	EPA Can Better Implement Its Strategy for Managing Contaminated Sediments
(OPE), Report No. 2006-P-00016, March 15, 2006.
16.	EPA Can Improve Emissions Factors Development and Management (OPE), Report
No. 2006-P-00017, March 22, 2006.
17.	EPA Provided Quality and Timely Information Regarding Wastewater after Hurricane
Katrina (OPE), Report No. 2006-P-00018, March 28, 2006.
18.	Information Security Series: Security Practices - Comprehensive Environmental
Response, Compensation, and Liability Information System (OA), Report No.
2006-P-00019, March 28, 2006.
27

-------
19.	Information Security Series: Security Practices - Integrated Compliance Information
System (OA), Report No. 2006-P-00020, March 29, 2006.
20.	Information Security Series: Security Practices - Safe Drinking Water Information
System (OA), Report No. 2006-P-00021, March 30, 2006.
21.	Federal Information Security Management Act - FY 2005 Status of EPA's Computer
Security Program (OA), Report No. 2006-S-00001, October 3, 2005.
22.	Evaluation of the Effectiveness of EPA's Emergency Response Activities (OPE),
Report No. 2006-M-000004, February 24, 2006.
23.	State of Nevada Drinking Water State Revolving Fund Program Financial Statements
for the Year Ended 6/30/2004 (OA), Report No. 2006-1-00018, November 29, 2005.
24.	State of Oregon Clean Water State Revolving Fund Program Financial Statements for
the Year Ended June 30, 2005 (OA), Report No. 2006-1-00021, January 12, 2006.
25.	State of Nevada Clean Water State Revolving Fund Program Financial Statements for
the Year Ended 6/30/2004 (OA), Report No. 2006-1-00024, January 23, 2006.
26.	State of Illinois' Credit Claim for the Ottawa Radiation Site, Ottawa, Illinois (OA),
Report No. 2006-4-00026, October 31, 2005.
27.	Mixed Funding Claim, Whitehouse Oil Pits Superfund Site, Duval County, Florida
(OA), Report No. 2006-4-00027, October 31, 2005.
28

-------