U.S. Environmental Protection Agency

      FY 2011 Annual Performance Report
      FY 2013 Annual Plan

      DATA QUALITY RECORDS
      Beginning with the EPA's FY 2013 budget, the Agency has developed Data Quality Records
      (DQRs) to present validation/verification information for selected performance measures,
      consistent with guidance from the Office of Management and Budget. A DQR documents the
      management controls, responsibilities, quality procedures, and other metadata associated with
      the data lifecycle for an individual performance measure, and is intended to enhance the
      transparency, objectivity, and usefulness of the performance result.


      Caveat: All attachments are in Adobe PDF format, regardless of icon.
190R12002

Table of Contents


      Information about DQRs and Fields

      Enabling and Support  Programs

           Measures Not Associated With Any Objective

                007   Percent of GS employees (DEU) hired within 80 calendar
                      days.

                008   Percent of GS employees (all  hires) hired within 80
                      calendar days

                009   Increase in number and  percentage of certified acquisition
                      staff (1102)

                010   Cumulative percentage reduction in Greenhouse Gas
                      (GHG) Scopes 1 & 2 emissions.

                052   Number of major EPA environmental systems that use the
                      CDX electronic requirements enabling faster receipt,
                      processing, and  quality checking of data.

                053   States, tribes and territories will be able to exchange data
                      with CDX through nodes in real time, using standards and
                      automated data-quality checking.

                098   Cumulative percentage reduction in energy consumption.

                35A   Environmental and business actions taken for improved
                      performance or risk reduction.

                35B   Environmental and business recommendations or risks
                      identified for corrective action.

                35C   Return on the annual dollar investment, as a percentage
                      of the OIG budget, from  audits and investigations.

                35D   Criminal, civil, administrative, and fraud prevention
                      actions.

                998   EPA's TRI program will work with partners to conduct
                      data quality checks to enhance accuracy and reliability of
                      environmental data.

                999   Total  number of active unique users from states, tribes,
                      laboratories, regulated facilities and other entities that
                      electronically report environmental data to EPA through
                      CDX.

Goal 1  :  Taking Action on  Climate Change and
Improving Air Quality

      Measures Not Associated With Any Objective

           AC1  Percentage of products completed by Air, Climate, and
                Energy.

      Objective 1 :  Address Climate Change

           AD1  Cumulative number of major scientific models and
                decision support tools used in implementing
                environmental management programs that integrate
                climate change science data

           AD2  Cumulative number of major rulemakings, with climate-
                sensitive environmental impacts and within existing
                authorities, that integrate climate change science data

           AD3  Cumulative number of major grant, loan, contract, or
                technical assistance agreement programs that integrate
                climate science data into climate sensitive projects that
                have an environmental outcome

           G02  Million metric tons of carbon dioxide equivalent
                (MMTCO2e) of greenhouse gas reductions in the buildings
                sector.

           G06  Million metric tons of carbon dioxide equivalent
                (MMTCO2e) of greenhouse gas reductions in the
                transportation sector.

           G16  Million metric tons of carbon dioxide equivalent
                (MMTCO2e) of greenhouse gas reductions in the industry
                sector.

      Objective 2 :  Improve Air Quality

           001  Cumulative percentage reduction in tons of toxicity-
                weighted (for cancer risk) emissions of air toxics from
                1993 baseline.

           A01  Annual emissions of sulfur dioxide (SO2) from electric
                power generation sources.

           M9   Cumulative percentage reduction in population-weighted
                ambient concentration of ozone in monitored counties
                from 2003 baseline.

           M91  Cumulative percentage reduction in population-weighted
                ambient concentration of fine particulate matter (PM-2.5)
                in  all monitored counties from 2003 baseline.

           034  Cumulative millions of tons of Nitrogen Oxides (NOx)
                reduced since 2000 from mobile sources

           P34  Cumulative tons of PM-2.5 reduced since 2000 from
                mobile sources

     Objective 3 :  Restore  the Ozone Layer

           SO1  Remaining U.S. consumption of hydrochlorofluorocarbons
                (HCFCs), chemicals that deplete the Earth's protective
                ozone layer, measured in tons of Ozone Depleting
                Potential (ODP).

Goal  2 :  Protecting  America's Waters

     Measures Not Associated  With Any Objective

           SW1  Percentage of planned research products completed on
                time by the Safe and Sustainable Water Resources
                research program.

     Objective 1 :  Protect Human  Health

           dw2  Percent of person months during which community water
                systems provide drinking water that meets all applicable
                health-based standards.

           E    Percent of the population in  Indian country served by
                community water systems that receive drinking water
                that meets all applicable health-based drinking water
                standards

     Objective 2 :  Protect and Restore Watersheds
     and Aquatic Ecosystems

           bps  Number of TMDLs that are established or approved by
                EPA [Total TMDL] on a schedule consistent with national
                policy (cumulative). [A TMDL is a technical plan for
                reducing pollutants in order  to attain water quality
                standards. The terms "approved" and "established" refer
                to the completion and approval of the TMDL itself.]

           L    Number of waterbody segments identified by States in
                2002 as not attaining standards, where water quality
                standards are now fully attained (cumulative).

Goal 3 :  Cleaning Up Communities and
Advancing Sustainable Development

      Measures Not Associated With Any Objective

          HC1   Percentage of planned research products completed on
                time by the Safe and Healthy Communities research
                program.

      Objective 1 :  Promote Sustainable and Livable
      Communities

          B29   Brownfield properties assessed.

          B33   Acres of Brownfields properties made ready for reuse.

          CH2   Number of risk management plan audits and  inspections
                conducted.

      Objective 2 :  Preserve Land

          HWO  Number of hazardous waste facilities with new or updated
                controls.

      Objective 3 :  Restore Land

          112   Number of LUST cleanups completed that meet risk-based
                standards for human exposure and groundwater
                migration.

          151   Number of Superfund sites with human exposures under
                control.

          CA1   Cumulative percentage of RCRA facilities with human
                exposures to toxins under control.

          CA2   Cumulative percentage of RCRA facilities with migration of
                contaminated groundwater under control.

          CA5   Cumulative percentage of RCRA facilities with final
                remedies constructed.

          S10   Number of Superfund sites ready for anticipated use site-
                wide.

     Objective 4 :  Strengthen Human Health and
     Environmental Protection in Indian Country

           5PQ  Percent of Tribes implementing federal regulatory
                environmental programs in Indian country (cumulative).

           5PR  Percent of Tribes conducting EPA approved environmental
                monitoring and assessment activities in Indian country
                (cumulative.)

Goal  4 :  Ensuring the Safety of Chemicals and
Preventing Pollution

     Measures Not Associated With Any Objective

           CS1  Percentage of planned research products completed on
                time by the Chemical Safety for Sustainability research
                program.

           HS1  Percentage of planned research products completed on
                time by the Homeland Security research program.

     Objective 1 :  Ensure Chemical Safety

           009  Cumulative number of certified Renovation, Repair and
                Painting firms

           091  Percent of decisions completed on time (on or before
                PRIA or negotiated due date).

           164  Number of pesticide registration review dockets opened.

           230  Number of pesticide registration review final work plans
                completed.

           C18  Percentage of existing CBI claims for chemical identity in
                health and safety studies reviewed and challenged, as
                appropriate.

           E01  Number of chemicals for which Endocrine Disruptor
                Screening Program (EDSP) decisions have been
                completed

           E02  Number of chemicals for which EDSP Tier 1 test orders
                have been issued

           HC1  Annual number of hazard characterizations completed for
                HPV chemicals

           RA1   Percentage of planned research products completed on
                 time by the Human Health Risk Assessment research
                 program.

      Objective 2 :  Promote Pollution Prevention

           297   Metric Tons of Carbon Dioxide Equivalent (MTCO2e)
                 reduced, conserved, or offset through pollution
                 prevention.

Goal 5 :  Enforcing Environmental Laws

      Objective 1 :  Enforce  Environmental Laws

           400   Millions of pounds of air pollutants reduced, treated, or
                 eliminated through concluded enforcement actions.

           402   Millions of pounds of water pollutants reduced, treated,  or
                 eliminated through concluded enforcement actions.

           404   Millions of pounds of toxic and pesticide pollutants
                 reduced, treated, or eliminated through concluded
                 enforcement actions.

           405   Millions of pounds of hazardous waste reduced, treated,
                 or eliminated through concluded enforcement actions.

           410   Number of civil judicial  and administrative enforcement
                 cases initiated.

           411   Number of civil judicial  and administrative enforcement
                 cases concluded.

           418   Percentage of criminal cases having the most significant
                 health, environmental,  and deterrence  impacts.

           419   Percentage of criminal cases with individual defendants.

           420   Percentage of criminal cases with charges filed.

           421   Percentage of conviction rate for criminal defendants.

Additional Information about Data Quality
Records  (DQRs)

Follow the links below to learn more about the background and content of DQRs and to find additional
resources. Alternatively, you may download a complete PDF copy of all DQRs here.

Background

Beginning with the U.S. Environmental Protection Agency's FY 2013 budget, the EPA has developed
data quality records (DQRs) for selected performance measures, consistent with the Agency's graded
approach to quality and guidance from the Office of Management and Budget (OMB). A DQR
documents the management controls, quality procedures and other metadata associated with the data
lifecycle for an individual performance measure. Each DQR is intended to enhance the transparency,
objectivity, and usefulness of the performance result. The Agency is publishing DQRs as a replacement
for, and an improvement upon, the prior method of conveying data quality information about performance
measures: the Verification and Validation section in the EPA's Annual Plan.
DQR  Contents

Each data  quality record (DQR) is organized into four sections:

1) Measure and DQR Metadata
2) Source  Reporting and Data Definition
3) Information Systems and Data Quality Procedures
4) Reporting and Oversight

Section 1: Measure and  DQR Metadata

This section presents background information related to the measure and the DQR. The fields in this
section help users understand the measure itself, the office at the EPA that is accountable for that
measure, and how the measure supports an EPA strategic goal or strategic target.
National Program Office (NPO). Identifies the highest-level office within the EPA responsible for the
performance measurement data. A current list of the EPA NPOs can be found at EPA's Organizational
Structure Website. The following is a list of NPOs with DQRs published in FY 2013.
• Office of the Administrator (OA)
• Office of Air and Radiation (OAR)
• Office of Administration and Resource Management (OARM)
• Office of Chemical Safety and Pollution Prevention (OCSPP)
• Office of Enforcement and Compliance Assurance (OECA)
• Office of Environmental Information (OEI)
• Office of Indian and Tribal Affairs (OITA)
• Office of the Inspector General (OIG, or INSP GEN)
• Office of Research and Development (ORD)
• Office of Solid Waste and Emergency Response (OSWER)
• Office of Water (OW)
Goal Number and Title; Objective Number and Title; Sub-Objective Number and Title; Strategic Target
Code and Title. These fields indicate how each measure connects to the EPA's Strategic Plan. (For
more information about the EPA's Strategic Plan, visit EPA's Strategic Plan Website.) Please note that
some measures for supporting programs, such as administrative or information technology programs,
support multiple strategic goals.

Managing Office. Identifies the program responsible for the measure.

1a. Performance Measure Term Definitions. Provides definitions of key terms in the performance
measure text, and also may contain background information about the measure. Intended to promote
consistent performance data collection and reporting and to improve understanding of the
performance results.

Section  2:  Source Reporting  and Data Definition

This section provides information about the origin and characteristics of the data the  EPA uses to
calculate performance, to help describe the representativeness and validity of the performance result.

2a. Original Data Source. Identifies the name or type of organization(s) from which the EPA receives
the source data used for calculating performance.

2b. Source Data Collection.  Details the  manner by which original data  are collected, including citing
the quality procedures followed. Provides  information to help characterize the representativeness and
reliability of the source data and the appropriateness of their use for performance measurement.

2c. Source Data Reporting. Identifies the  form/mechanism by which the EPA (1) receives data from
the original data sources and (2) enters the data into an EPA information system. Also, specifies the
timing and frequency of data transmission.

Section 3: Information Systems and  Data Quality
Procedures

This section describes the steps the EPA undertakes in transforming the original data into a final
performance result, how the result connects back to the source data, and the quality procedures
the EPA undertakes to minimize biases and errors in data handling.

3a. Information Systems. Describes each of the EPA's information systems utilized in the process of
collecting, calculating, and/or reporting the results for a measure.

3b. Data Quality Procedures. Documents procedures for the oversight, review, and  quality assurance
of the performance data by the EPA.

3c. Data Oversight. Describes responsibilities for overseeing source data reporting and for overseeing
the information systems utilized in producing the performance result.

3d. Calculation Methodology. Provides the methodology used to transform source data into the
performance result for a measure. Explains historical changes in the methodology, if applicable.

Section 4: Reporting and  Oversight

This section provides information  on how the EPA oversees quality at the final stage of reporting, to
help ensure appropriate interpretation of results and to summarize the  most important quality issues
(and the degree to which they are being addressed).

4a. Oversight and Timing of Results Reporting. Identifies responsibilities for oversight of final
reporting. Specifies the frequency of reporting, if other than annual.

4b. Data Limitations/Qualifications. Identifies limitations or qualifications necessary for appropriate
interpretation of performance results.

4c. Third-Party Audits. Identifies relevant assessments of the data flow for a performance  measure.

Additional  Resources

For more information on the EPA's quality system, please see EPA's Quality System Website. To learn
more about data quality-related guidance from OMB, please see OMB Circular A-11, section 230.13,
available at OMB's Circular A-11 (PDF document).

  Enabling Support Measures                          No Associated Objective                                     Measure 007
  Measure Code : 007 - Percent of GS employees (DEU) hired  within 80 calendar
  days.
  Office of Administration and Resource Management (OARM)
   1. Measure and DQR Metadata
   Goal Number and Title                          Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title	
   Strategic Target Code and Title
   Managing Office
   Performance Measure Term Definitions
GS employees: The General Schedule (GS) classification and pay system covers the majority of civilian white-collar Federal employees.
GS classification standards, qualifications, pay structure, and related human resources policies (e.g., general staffing and pay administration
policies) are administered by the U.S. Office of Personnel Management (OPM) on a Government-wide basis. Each agency classifies its GS
positions and appoints and pays its GS employees filling those positions following statutory and OPM guidelines. The General Schedule has
15 grades: GS-1 (lowest) to GS-15 (highest).

DEU: This measure will track the hiring timeliness for non-federal applicants using the delegated examining recruitment process. Delegated
examining authority is an authority OPM grants to agencies to fill competitive civil service jobs with applicants applying from outside the
Federal workforce, Federal employees who do not have competitive service status, or Federal employees with competitive service status.
Appointments made by agencies through delegated examining authority are subject to civil service laws and regulations. This is to ensure
fair and open competition, recruitment from all segments of society, and selection on the basis of the applicants' competencies or
knowledge, skills, and abilities (see 5 U.S.C. § 2301).

Hired within 80 calendar days:
This is the measure used to track the time to hire for all Job Opportunity Announcements (JOAs) posted on USAJOBS from the time the
announcement is drafted until the time of entry on duty (EOD).


Background:
   OPM's original End-to-End 80-day hiring initiative focused on the Agency's entire hiring process from the time a hiring request is
initiated until the employee comes on board; the 80-day hiring initiative focused on those non-federal employees hired through the
delegated examining recruitment process.
   OPM's 80-day hiring model is designed to assess the time to hire federal employees where a job opportunity announcement was posted
on USAJOBS.

The President's May 2010 "Hiring Reform Initiative" memorandum calls on agencies to improve the timeliness of "all" hiring actions, in
particular hiring actions for Mission Critical Occupations and commonly-filled positions. Agency-specific reporting requirements for time
to hire statistics are uncertain and not yet finalized (please see
http://www.whitehouse.gov/the-press-office/presidential-memorandum-improving-federal-recruitment-and-hiring-process).

For more information, please see http://www.opm.gov/publications/EndToEnd-HiringInitiative.pdf


 2. Data Definition and Source Reporting	
 2a. Original Data Source
 The original data source is EPA employees who request, prepare, and process SF-52s, Requests for Personnel Actions, and other
 documents (e.g., staffing requisition, position description, job analysis) associated with processing hiring actions.

 2b. Source Data Collection	
 The source data is collected from the SF-52, Request for Personnel Action, and other associated documents (e.g., staffing requisition,
 position description, job analysis) used in processing hiring actions, as well as steps taken by staff in processing these actions. Staff in
 the three Human Resources Shared Service Centers use dates on the SF-52s to enter dates in the Human Resources Activities and
 Communication Tracking System (HRACTS).  They also record information, such as vacancy announcement numbers and comments in
 HRACTS.  Data in HRACTS is reviewed quarterly by the SSC staff to ensure completeness and accuracy. Customers serve as an
 additional review layer as they have access to HRACTS and can raise any inconsistencies in data entered.
 2c. Source Data Reporting
 Form/mechanism for receiving data and entering into EPA system:
 Data is typically transmitted from the hiring decision-makers to the SSC staff by scanning and emailing documents to a designated
 mailbox. Once received, the servicing human resources personnel at EPA's three Shared Service Centers enter the data into the system.

 Timing and frequency of reporting:
 The data is reported quarterly to the Office of Personnel Management. In addition, Agency-wide, Office-level, and SSC reports can be
 prepared on an annual, quarterly, or selected time period basis.
 3. Information Systems and Data Quality Procedures
 3a. Information Systems
 Office of Human Resources (OHR) Human Resources Activity and Communication Tracking System (HRACTS).

EPA's Human Resources Activity and Communication Tracking System (HRACTS) is an in-house, Lotus Notes-based system designed to
track and monitor HR workload, including recruitment actions at the Agency's Shared Service Centers. HRACTS also tracks other HR
workload activity, including awards, reassignments, etc.; tracks EPA's status toward achieving OPM's original 80-day hiring goal for
delegated examining recruitment actions; and provides status reports to customers. HRACTS has multiple date fields for inputting the date
for each step in the hiring process. HRACTS can track the time throughout EPA's hiring process from the time a hiring request is initiated
until the employee comes on board. Upon HR office consolidation to the Shared Service Centers in FY09, HRACTS was refined to be
useful in tracking Agency-wide hiring timeliness; standards for data quality were developed; and types of hiring methods used (e.g., MP,
DEU) were incorporated.

HRACTS is continually undergoing changes and modifications to meet the evolving requirements and unique needs of the 80-day end-to-end
hiring model. HRACTS has been revised to meet the diverse demands for easy access by Agency-wide managers to track the status of
hiring actions. HRACTS reports are being revised to provide organizations with in-depth information on the status of their pending
recruitment actions in a secure and controlled environment.  The system was refined to notify applicants of the status of their vacancy
application throughout the hiring process and also provide managers with a link to survey their perspective of the overall hiring process.
Revisions also include better reporting templates to track trends and anomalies along the hiring process timeline.

Agency-wide, Office-level, and SSC reports can be prepared on an  annual, quarterly, or selected time period basis.  Manager access was
made  available to better enable tracking of the status of their individual recruitment actions.

While HRACTS can track by the type of recruitment action (DEU, MP, etc.), HRACTS is currently not capable of tracking by occupational
series (e.g., Mission Critical Occupations and commonly-filled positions).

The system meets the quality control standards of Lotus Notes.

Additional information:
Further system enhancements may be needed to track hiring timeliness for MCOs and commonly-filled positions to meet the President's
Hiring Reform Initiatives.
3b. Data Quality Procedures
SSC / OHR staff review and analyze the reports to determine trends and assess workload. SSC staff review and validate the data, identify
anomalies or data-entry errors, make corrections, and provide the updated information so that the system's reports can be current and
accurate. Agency managers can be provided with system access to  further enhance data integrity. Questions about the data or resolution of
data issues are frequently resolved through discussion and consultation with the SSC and OHR.	
3c. Data Oversight
The Lotus Notes Manager of the Information Resources Management Division is responsible for overseeing the source data reporting;
making changes/modifications to the system to further improve tracking and reporting; running reports; training authorized staff on the use
of the system; and making enhancements to the system to meet time-to-hire goals.

3d. Calculation Methodology
Data is entered to track all hires where a JOA was posted on USAJOBS. The system tracks each step of the hiring process. The steps
included in the metrics are: SSC drafts/posts JOA; JOA open period; SSC prepares certificates; customer has certificates
(interview/selection process); SSC makes tentative offer; conduct background check; make formal job offer; selectee enters on duty. We
were instructed to track the Senior Executive Service (SES) hiring process as well, although these are two very different hiring processes.
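
To make the arithmetic behind this measure concrete, the sketch below computes the percent of hires completed within 80 calendar days from JOA drafting to entry on duty. It is a minimal illustration only; the records and field layout are hypothetical and are not drawn from HRACTS.

    from datetime import date

    # Hypothetical HRACTS-style records: (JOA draft date, entry-on-duty date).
    # These values are illustrative, not actual EPA hiring data.
    hires = [
        (date(2011, 1, 3), date(2011, 3, 18)),   # 74 days
        (date(2011, 2, 7), date(2011, 5, 2)),    # 84 days
        (date(2011, 4, 11), date(2011, 6, 20)),  # 70 days
    ]

    # A hire counts toward the measure if entry on duty occurs within
    # 80 calendar days of the date the announcement was drafted.
    within_80 = sum(1 for drafted, eod in hires if (eod - drafted).days <= 80)
    percent = 100.0 * within_80 / len(hires)
    print(f"{within_80} of {len(hires)} hires within 80 days ({percent:.1f}%)")
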
4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting
The Reporting Oversight Personnel is the HR Director. Responsibilities include monitoring progress against milestones and measures;
working with OPM and the HR community to achieve timelines and targets for improving agency hiring by substantially reducing the time
to hire for Mission Critical Occupations (MCOs) and commonly filled positions; measuring and improving the quality and speed of the
hiring process; and analyzing the causes of agency hiring problems and establishing timelines/targets for reducing them. Time-to-hire
information is reported on a quarterly basis.

4b. Data Limitations/Qualifications

HRACTS is not integrated with the Agency's People Plus System, the Agency's official personnel system; therefore, discrepancies may
arise, such as in the total number of hires. While HRACTS can track by the type of recruitment action (DEU, MP, etc.), HRACTS is
currently not capable of tracking by occupational series (e.g., Mission Critical Occupations and commonly-filled positions).


4c. Third-Party Audits
EPA OIG released a report on OARM's revised hiring process, including timing and technological capability, in 2010. Please see
http://www.epa.gov/oig/reports/2010/20100809-10-P-0177.pdf.

OPM conducted a review of EPA's hiring process.  Please see http://www.opm.gov/hiringtoolkit/docs/EPAcasestudy.pdf.	
 Record Last Updated: 02/13/2012 01:16:50 PM

  Enabling Support Measures                          No Associated Objective                                     Measure 008
  Measure Code : 008 - Percent of GS employees (all hires) hired within 80 calendar
  days
  Office of Administration and Resource Management (OARM)
   1. Measure and DQR Metadata
   Goal Number and Title                          Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title	
   Strategic Target Code and Title
   Managing Office
   Performance Measure Term Definitions
GS employees: The General Schedule (GS) classification and pay system covers the majority of civilian white-collar Federal employees.
GS classification standards, qualifications, pay structure, and related human resources policies (e.g., general staffing and pay administration
policies) are administered by the U.S. Office of Personnel Management (OPM) on a Government-wide basis. Each agency classifies its GS
positions and appoints and pays its GS employees filling those positions following statutory and OPM guidelines. The General Schedule has
15 grades: GS-1 (lowest) to GS-15 (highest).

Other than DEU:
This measure will track the hiring timeliness for all hires not using the delegated examining recruitment process. Delegated examining
authority is an authority OPM grants to agencies to fill competitive civil service jobs with applicants applying from outside the Federal
workforce, Federal employees who do not have competitive service status, or Federal employees with competitive service status.
Appointments made by agencies through delegated examining authority are subject to civil service laws and regulations. This is to ensure
fair and open competition, recruitment from all segments of society, and selection on the basis of the applicants' competencies or
knowledge, skills, and abilities (see 5 U.S.C. § 2301).

Hired within 80 calendar days:
This is the measure used to track the time to hire for all Job Opportunity Announcements (JOAs) posted on USAJOBS from the time the
announcement is drafted until the time of entry on duty (EOD).

Background:
   OPM's original End-to-End 80-day hiring initiative focused on the Agency's entire hiring process from the time a hiring request is
initiated until the employee comes on board; the 80-day hiring initiative focused on those non-federal employees hired through the
delegated examining recruitment process.
   OPM's 80-day hiring model is designed to assess the time to hire federal employees where a job opportunity announcement was posted
on USAJOBS.

The President's May 2010 "Hiring Reform Initiative" memorandum calls on agencies to improve the timeliness of "all" hiring actions, in
particular hiring actions for Mission Critical Occupations and commonly-filled positions. Agency-specific reporting requirements for time
to hire statistics are uncertain and not yet finalized (please see
http://www.whitehouse.gov/the-press-office/presidential-memorandum-improving-federal-recruitment-and-hiring-process).

For more information, please see http://www.opm.gov/publications/EndToEnd-HiringInitiative.pdf


 2. Data Definition and Source Reporting	
 2a. Original Data Source
 The original data source is EPA employees who request, prepare, and process SF-52s, Requests for Personnel Actions, and other
 documents (e.g., staffing requisition, position description, job analysis) associated with processing hiring actions.
 2b. Source Data Collection
 The source data is collected from the SF-52, Request for Personnel Action, and other associated documents (e.g., staffing requisition,
 position description, job analysis) used in processing hiring actions, as well as steps taken by staff in processing these actions. Staff in
 the three Human Resources Shared Service Centers use dates on the SF-52s to enter dates in the Human Resources Activities and
 Communication Tracking System (HRACTS). They also record information, such as vacancy announcement numbers and comments in
 HRACTS. Data in HRACTS is reviewed quarterly by the SSC staff to ensure completeness and accuracy.  Customers serve as an
 additional review layer as they have access to HRACTS and can raise any inconsistencies in data entered.	
 2c. Source Data Reporting
 Form/mechanism for receiving data and entering into EPA system:
 Data is typically transmitted from the hiring decision-makers to the SSC staff by scanning and emailing documents to a designated
 mailbox. Once received, the servicing human resources personnel at EPA's three Shared Service Centers enter the data into the system.

 Timing and frequency of reporting:
 The data is reported quarterly to the Office of Personnel Management. In addition, Agency-wide, Office-level, and SSC reports can be
 prepared on an annual, quarterly, or selected time period basis.
 3. Information Systems and Data Quality Procedures	
 3a. Information Systems
 Office of Human Resources (OHR) Human Resources Activity and Communication Tracking System (HRACTS).

EPA's Human Resources Activity and Communication Tracking System (HRACTS) is an in-house, Lotus Notes-based system designed to
track and monitor HR workload, including recruitment actions at the Agency's Shared Service Centers. HRACTS also tracks other HR
workload activity, including awards, reassignments, etc.; tracks EPA's status toward achieving OPM's original 80-day hiring goal for
delegated examining recruitment actions; and provides status reports to customers. HRACTS has multiple date fields for inputting the date
for each step in the hiring process. HRACTS can track the time throughout EPA's hiring process from the time a hiring request is initiated
until the employee comes on board. Upon HR office consolidation to the Shared Service Centers in FY09, HRACTS was refined to be
useful in tracking Agency-wide hiring timeliness; standards for data quality were developed; and types of hiring methods used (e.g., MP,
DEU) were incorporated.

HRACTS is continually undergoing changes and modifications to meet the evolving requirements and unique needs of the 80-day end-to-end
hiring model. HRACTS has been revised to meet the diverse demands for easy access by Agency-wide managers to track the status of
hiring actions. HRACTS reports are being revised to provide organizations with in-depth information on the status of their pending
recruitment actions in a secure and controlled environment.  The system was refined to notify applicants of the status of their vacancy
application throughout the hiring process and also provide managers with a link to survey their perspective of the overall hiring process.
Revisions also include better reporting templates to track trends and anomalies along the hiring process timeline.

Agency-wide, Office-level, and SSC reports can be prepared on an  annual, quarterly, or selected time period basis.  Manager access was
made available to better enable tracking of the status of their individual recruitment actions.

While HRACTS can track by the type of recruitment action (DEU, MP, etc.), HRACTS is currently not capable of tracking by occupational
series (e.g., Mission Critical Occupations and commonly-filled positions).

The system meets the quality control standards of Lotus Notes.

Additional information:
Further system enhancements may be needed to track hiring timeliness for MCOs and commonly-filled positions to meet the President's
Hiring Reform Initiatives.
3b. Data Quality Procedures
SSC / OHR staff review and analyze the reports to determine trends and assess workload. SSC staff review and validate the data, identify
anomalies or data-entry errors, make corrections, and provide the updated information so that the system's reports can be current and
accurate. Agency managers can be provided with system access to  further enhance data integrity. Questions about the data or resolution of
data issues are frequently resolved through discussion and consultation with the SSC and OHR.	
3c. Data Oversight
The Lotus Notes Manager of the Information Resources Management Division is responsible for overseeing the source data reporting;
making changes/modifications to the system to further improve tracking and reporting; running reports; training authorized staff on the use
of the system; and making enhancements to the system to meet time-to-hire goals.

3d. Calculation Methodology
Data is entered to track all hires where a JOA was posted on USAJOBS. The system tracks each step of the hiring process. The steps
included in the metrics are: SSC drafts/posts JOA; JOA open period; SSC prepares certificates; customer has certificates
(interview/selection process); SSC makes tentative offer; conduct background check; make formal job offer; selectee enters on duty. We
were instructed to track the Senior Executive Service (SES) hiring process as well, although these are two very different hiring processes.
4.  Reporting and Oversight	
4a. Oversight and Timing of Results Reporting
The Reporting Oversight Personnel is the HR Director. Responsibilities include monitoring progress against milestones and measures;
working with OPM and the HR community to achieve timelines and targets for improving agency hiring by substantially reducing the time
to hire for Mission Critical Occupations (MCOs) and commonly filled positions; measuring and improving the quality and speed of the
hiring process; and analyzing the causes of agency hiring problems and establishing timelines/targets for reducing them. Time-to-hire
information is reported on a quarterly basis.

4b. Data Limitations/Qualifications
HRACTS is not integrated with the Agency's People Plus System, the Agency's official personnel system; therefore, discrepancies may
arise, such as in the total number of hires. While HRACTS can track by the type of recruitment action (DEU, MP, etc.), HRACTS is
currently not capable of tracking by occupational series (e.g., Mission Critical Occupations and commonly-filled positions).
4c. Third-Party Audits
EPA OIG released a report on OARM's revised hiring process, including timing and technological capability, in 2010. Please see
http://www.epa.gov/oig/reports/2010/20100809-10-P-0177.pdf.

OPM conducted a review of EPA's hiring process. Please see http://www.opm.gov/hiringtoolkit/docs/EPAcasestudy.pdf.
 Record Last Updated: 02/13/2012 01:16:50 PM

  Enabling Support Measures                          No Associated Objective                                      Measure 009
  Measure Code  : 009 - Increase in number and percentage of certified acquisition
  staff (1102)
  Office of Administration and Resource Management (OARM)
   1.  Measure and DQR Metadata
   Goal Number and Title                          Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title	
   Strategic Target Code and Title
   Managing Office
   Performance Measure Term Definitions
Certified acquisition staff (1102): The GS-1102 series includes positions that manage, supervise, perform, or develop policies and
procedures for professional work involving the procurement of supplies, services, construction, or research and development using formal
advertising or negotiation procedures; the evaluation of contract price proposals; and the administration or termination and close out of
contracts. The work requires knowledge of the legislation, regulations, and methods used in contracting; and knowledge of business and
industry practices, sources of supply, cost factors, and requirements characteristics. The purpose of the Federal Acquisition Certification in
Contracting (FAC-C) program is to establish core requirements for education, training, and experience for contracting professionals in
civilian agencies. The federal certification in contracting is not mandatory for all GS-1102s; however, members of the workforce issued new
Contracting Officer (CO) warrants on or after January 1, 2007, regardless of GS series, must be certified at an appropriate level to support
their warrant obligations, pursuant to agency policy.

Background:
   It is essential that the Federal Government have the capacity to carry out robust and thorough management and oversight of its contracts
in order to achieve programmatic goals, avoid significant overcharges, and curb wasteful spending. A GAO study last year of 95 major
defense acquisition projects found cost overruns of 26 percent, totaling $295 billion over the life of the projects. Improved contract
oversight could reduce such sums significantly.
   Executive Agencies were requested to propose plans to increase the Acquisition Workforce by 5%. OMB provided tools to the Agencies
to determine what the appropriate size would be for the acquisition workforce, which is how EPA determined that it needs 351 1102s by
FY 2014. We proposed adding new contracting personnel annually, in even increments, through 2014 in order to reach this goal. Since EPA
is always working on certifying our contracting personnel, the target certification levels for FY2012 include certifying the personnel that
EPA is bringing onboard to satisfy the increase in the acquisition workforce and certifying those already at EPA.  Since EPA's proposed
plan included bringing on mid- and senior-level 1102s, it is expected that many will already be certified.
   Certification and warranting procedures are initiated by the individual  seeking the certification/warrant. There may be eligible
individuals already in the acquisition workforce who have not yet applied for certification that EPA is unable to track.

For more information, please see:

Presidential Memorandum for the Heads of Executive Departments and Agencies - Subject: Government Contracting,
http://www.whitehouse.gov/the_press_office/Memorandum-for-the-Heads-of-Executive-Departments-and-Agencies-Subject-Government-Contracting,
March 4, 2009

October 27, 2009 OMB Memorandum for Chief Acquisition Officers, Senior Procurement Executives, Chief Financial Officers, Chief
Human Capital Officers - Subject: Acquisition Workforce Development Strategic Plan for Civilian Agencies - FY 2010 - 2014.
http://www.whitehouse.gov/sites/default/files/omb/assets/procurement_workforce/AWF_Plan_10272009.pdf
The link is correct as it applies to the Acquisition Workforce Strategic Plan for Civilian Agencies - FY 2010-2014, relative to increasing the
acquisition workforce by 5% as stated in the Background summary for EPA.
 2. Data Definition and Source  Reporting
 2a. Original Data Source
 The Agency Acquisition Career Manager (ACM) reviews and approves the final completed package for an applicant's certification. The
 EPA has a Certification and Warrant Database that is used as the tool for approving and tracking the number of FAC-Cs and warrants issued
 in the Agency. This data is reported as the total number of EPA 1102s assigned and the percentage of the total 1102 staff
 certified. The baseline is 324 assigned 1102s in FY 09, with 70% of the total 1102s assigned in FY 09 certified.
 2b. Source Data Collection
 Source Data Collection Methods:
 Before an individual is certified, there are three levels of review and approval of documentation proving certification eligibility. An initial
 review is performed on every individual's documentation for certification by an EPA Policy Analyst who specializes in FAC-C certification
 eligibility. The Analyst aids the applicant in preparing a complete package to be reviewed for approval. Once the package is completed, it
 is provided to the Policy Analyst's Team Leader for review and approval. Once it is determined that the package is ready for final review
 by the Agency Acquisition Career Manager (ACM), the final completed package is sent forward for review and approval. Once approved,
 FAC-C Level I, II, or III is granted based on the information provided and applied for. The FAC-C certification allows for a warrant to be
 applied for and issued.	
 2c. Source Data Reporting
 Form/mechanism for receiving data and entering into EPA system:
 The data in the "Federal Acquisition Certification, Warrants, and BPAs" database is reviewed and entered by EPA Procurement Analysts
 who are trained to verify documents submitted by employees for Federal Acquisition Certification in Contracting (FAC-C) certification and
 approval. The individual uploads his or her documents for review and approval into the FAC-C email mailbox, where the EPA
 Procurement Analyst can review the uploaded documentation to support the education, experience, and training requirements for FAC-C
 certification. Once this review is completed, the Procurement Analyst releases the file to the supervisor of record for approval/disapproval.
 After the supervisor's approval/disapproval, the system notifies the ACM that the file is ready for review and approval/disapproval. After
the ACM approves the application, the FAC-C certificate is then ready for printing and signature by the ACM.

Timing and frequency of reporting:
Once the individual uploads all the documents in their application request for certification, system notifications are generated that route the
application through review and approval by the Procurement Analyst, supervisor, and ACM. After the FAC-C Level I, II, or III certificate is
signed by the ACM, it is scanned and emailed to the applicant in advance of receiving the original in the mail. The 1102 certification data is
reported annually, consistent with the OMB, OFPP reporting guidance for the Annual Acquisition Human Capital Plan (AHCP).
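
As a rough illustration of the review-and-notification chain described above, the sketch below walks a certification package through each approval stage. The stage names, function, and notification wording are hypothetical simplifications, not the actual EPA system.

    # Hypothetical sketch of the FAC-C application review chain; the stages
    # mirror the sequence described above (Analyst -> Supervisor -> ACM).
    REVIEW_CHAIN = ["Procurement Analyst", "Supervisor of record", "ACM"]

    def route_application(applicant: str) -> None:
        # Emit a notification at each hand-off in the review sequence.
        for stage in REVIEW_CHAIN:
            print(f"Notification: {applicant}'s package is ready for review by {stage}")
        print(f"Certificate signed by the ACM; scanned copy emailed to {applicant}")

    route_application("Jane Doe")
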
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
The information for tracking the certification targets is currently maintained in the EPA's "Federal Acquisition Certification, Warrants, and
BPAs" database.
The EPA's "Federal Acquisition Certification, Warrants, and BPAs" database Warrants/Certifications is a Lotus Notes Database which
contains scanned copies of EPA Warrants. For reporting purposes, information is pulled manually from the scanned Warrant and placed on
each record. This information includes Warrant Number, Level, Type, Authority (name and title), Issue Date, Limitation, Start Date,
AAShip and Division.  Access is closely kept; each record can only be accessed by the FAC/C and warrant holder, the supervisor, and such
administrative officers as are listed in the configuration. Contents are reviewed and updated twice yearly by a designated PTOD POC.

As warrants are added or cancelled, a group of specialists in OCFO and ITSC are notified so as to keep records up to date in other systems.
Updates to other systems are manual. The source data exists on the paper documents. There is no transformation of the data (i.e., it is not
aggregated, modeled, normalized, etc.).

Examples of system integrity standards include the System Life Cycle Management Policy and the IT security policy. This is a
stand-alone reporting system built on the EPA-approved Lotus Notes platform. It is in the Operations and Maintenance portion of the
System Life Cycle Management. It rests on a secured, internal EPA server and does not replicate. Proper access is applied to each document.
All reporting is done in the Notes Client in canned reporting views. There is no web access.
3b. Data Quality Procedures
This data is not publicly viewable outside of the EPA information system. The data in the "Federal Acquisition Certification, Warrants, and
BPAs" database is reviewed and entered by EPA Procurement Analysts who are trained to verify documents submitted by employees for
Federal Acquisition Certification in Contracting (FAC-C) certification and approval. Once this review is completed, the Procurement
Analyst releases the file to the supervisor of record for approval/disapproval. After the supervisor's approval/disapproval, the system
notifies the ACM that the file is ready for review and approval/disapproval. After the ACM approves the application, the FAC-C certificate
is then ready for printing and signature by the ACM.	
3c. Data Oversight
Source Data Reporting Oversight Personnel: The Agency Senior Procurement Executive (SPE) oversees the final reporting of 1102
certification data consistent with the OMB, OFPP reporting guidance in the Annual Acquisition Human Capital Plan (AHCP). The Agency
Acquisition Career Manager (ACM) is responsible for data research, data collection, data validation, and preparation of the Annual AHCP.

Information System Oversight Personnel: The Senior Procurement Executive (SPE) of the Environmental Protection Agency (EPA) is
responsible for establishing an effective acquisition management system which ensures that quality goods and services are obtained at
reasonable prices, in a timely fashion, and in accordance with the statutory and regulatory requirements and the programmatic needs of the
Agency. The SPE oversees the final reporting of 1102 certification data consistent with the OMB, OFPP reporting guidance in the Annual
Acquisition Human Capital Plan (AHCP). As warrants are added or cancelled in the EPA "Federal Acquisition Certification, Warrants, and
BPAs" database, a group of specialists in OCFO and ITSC are notified so as to keep records up to date in other systems.

3d. Calculation Methodology
This data is reported as the total number of EPA 1102s assigned and the percentage of the total 1102 staff certified. The baseline is 324
assigned 1102s in FY 09, with 70% of the total 1102s assigned in FY 09 certified. The projected target for FY 2012 for total assigned 1102s
is 335, with a projected 80% of the total assigned staff certified. EPA is continually working on certifying its 1102 acquisition workforce;
however, the proposed targets rely upon receiving the additional FTEs for the acquisition workforce.
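
As a worked illustration of this methodology, using only the figures stated in this DQR, the sketch below computes the reported pair of values (total 1102s assigned and percent certified); the variable names are hypothetical.

    # Baseline and projection figures stated in this DQR (FY 09 and FY 2012).
    baseline_assigned = 324
    baseline_certified = round(0.70 * baseline_assigned)    # about 227 staff

    projected_assigned = 335
    projected_certified = round(0.80 * projected_assigned)  # 268 staff

    print(f"FY 09: {baseline_assigned} assigned, {baseline_certified} certified (70%)")
    print(f"FY 12: {projected_assigned} assigned, {projected_certified} certified (80%)")
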
4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting
The Agency Senior Procurement Executive (SPE) oversees the final reporting of 1102 certification data consistent with the OMB, OFPP reporting
guidance in the Annual Acquisition Human Capital Plan (AHCP).

4b. Data Limitations/Qualifications
An error estimate has not been calculated for this measure. The EPA has a Certification and Warrant Database that is used as the tool for
approving and tracking the number of FAC-Cs and warrants issued in the Agency. The database is a stand-alone reporting system built on the
EPA-approved Lotus Notes platform. It is in the Operations and Maintenance portion of the System Life Cycle Management. It rests on a
secured, internal EPA server and does not replicate. Proper access is applied to each document. All reporting is done in the Notes Client in
canned reporting views. There is no web access. The source data exist on paper documents. There is no transformation of data (i.e.,
aggregated, modeled, normalized, etc.).
4c. Third-Party Audits
There are no independent third-party audits of the data flow for this performance measure at this time. However, future audits could be
conducted by the OIG, GAO, or OMB.

As an internal management control tool, the Senior Procurement Executive (SPE) has established the Balanced Scorecard Performance
Measurement and Performance Management Program (BSC). The BSC program establishes an Acquisition System Performance
Management Plan framework under which the Office of Acquisition Management (OAM) may ensure that business systems adhere to
EPA's mission, vision, and strategy statements; follow best business management practices; and comply with applicable statutes,
regulations, and contract terms and conditions. Through the utilization of the Balanced Scorecard framework, OAM will be able to identify
opportunities to strengthen the EPA's Acquisition Workforce Strategic Human Capital Plan, thus allowing EPA to pursue all available
authorities and strategies to ensure that the Agency has appropriate resources and the best-qualified staff to provide mission support. The
BSC program operates with performance measure, self-assessment, and peer review/oversight components.
 Record Last Updated: 02/13/2012 01:16:50 PM
                                                             23                                      Back to Table of Contents

-------
  Enabling Support Measures                         No Associated Objective                                    Measure 010
  Measure Code : 010 - Cumulative percentage reduction in Greenhouse Gas (GHG)
  Scopes 1  & 2 emissions.
  Office of Administration and Resource Management (OARM)
   1. Measure and DQR Metadata
   Goal Number and Title                          Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title	
   Strategic Target Code and Title
   Managing Office
   Performance Measure Term Definitions
Greenhouse Gas (GHG) Scope 1 emissions: Scope 1 GHG emissions are emissions associated with fossil fuel burned at EPA facilities or
in EPA vehicles and equipment. Sources of Scope 1 GHG emissions include fuel oil and natural gas burned in boilers, gasoline used in
vehicles, and diesel fuel used in emergency generators.

Greenhouse Gas (GHG) Scope 2 emissions: Scope 2 GHG emissions are emissions associated with indirect sources of energy such as
electricity, chilled water, or purchased steam.  For example, the GHG emissions from the coal and natural gas used to generate the
electricity supplied to EPA facilities are considered EPA Scope 2 GHG emissions.

Note:  This measure reports the cumulative percentage reduction in Scope 1 and 2 emissions in aggregate.

EPA's 34 reporting facilities: The EPA facilities at which the Agency controls building operations, pays utility bills directly to the utility
company, and reports annual energy and water consumption  data to the U.S. Department of Energy in order to demonstrate compliance with
federal energy and water reduction requirements.
1)      Research Triangle Park, NC New Main
2)      Research Triangle Park, NC RTF
3)      Research Triangle Park, NC National Computer Center
4)      Research Triangle Park, NC Incinerator
5)      Research Triangle Park, NC Child Care Center
6)      Research Triangle Park, NC Page Road
7)      Chapel Hill, NC
8)      Cincinnati - AWBERC, OH
9)      Cincinnati- T and E, OH
10)     Cincinnati- Center Hill, OH
                                                       24                                   Back to Table of Contents

-------
  Enabling Support Measures                          No Associated Objective                                       Measure 010
11)     Cincinnati - Child Care
12)     Cincinnati - PUB S, OH
13)     Ann Arbor, MI
14)     Fort Meade, MD
15)     Edison, NJ
16)     Edison - REAC, NJ
17)     Duluth, MN
18)     Las Vegas, NV
19)     Narragansett, RI
20)     Richmond, CA
21)     Corvallis-Main, OR
22)     Corvallis-WRS, OR
23)     Houston, TX
24)     Athens-ORD, GA
25)     Athens SESD, GA
26)     Manchester, WA
27)     Kansas City STC, KS
28)     Golden, CO
29)     Chelmsford, MA
30)     Gulf Breeze, FL
31)     Newport, OR
32)     Ada, OK
33)     Montgomery, AL
34)     Grosse Ile, MI

FY 2008 baseline:  140,911 metric tons of carbon dioxide equivalent (MTCO2e). A breakdown of this baseline is available at
http://www.epa.gov/oaintrnt/documents/epa_ghg_targets_letter_omb.pdf.

Background: This measure tracks EPA's performance in meeting Executive Order 13514 (Federal Leadership in Environmental, Energy,
and Economic Performance) and demonstrating leadership in GHG emissions reductions. For more information on Executive Order 13514,
please see http://www.epa.gov/oaintrnt/practices/eo13514.htm. More information on EPA's GHG reduction goals and strategies is available
at http://www.epa.gov/oaintrnt/ghg/strategies.htm, and EPA's letter informing OMB of the Agency's Scope 1 and 2 GHG emissions
reduction goal is available at http://www.epa.gov/oaintrnt/documents/epa_ghg_targets_letter_omb.pdf. An OIG evaluation of EPA's
progress in meeting its GHG reduction goals is available at http://www.epa.gov/oig/reports/2011/20110412-11-P-0209.pdf.


 2. Data Definition and Source Reporting	
 2a. Original Data Source
                                                           25                                    Back to Table of Contents

-------
 Enabling Support Measures                           No Associated Objective                                       Measure 010
EPA Contractor	
2b. Source Data Collection
Source Data Collection Methods:
Scope 1 emissions. See the section on the Energy Consumption Goal for detail on Energy and Water Data collection. For other foundation
information needed for GHG emissions calculations, EPA relies primarily on federal-wide data systems, which are used by all federal
agencies with some minor exceptions, to collect the foundation data for GHG Scope 1 and 2 emissions. For example, EPA utilizes GSA's
FAS system to gather fleet fuel use; however, EPA keeps a separate parallel system to ensure data quality.
Scope 2 emissions. See the section on the Energy Consumption Goal for detail on Energy and Water Data collection.

EPA uses the DOE data portal to convert foundation information into GHG emissions equivalents.

Date/Time Intervals Covered by Source Data:
Quarterly; FY2008 to present
While EPA collects energy and water use data quarterly, use of the DOE Data Portal to calculate GHG Scope 1  and 2 emissions is done
once each Fiscal Year.

EPA QA Requirements/Guidance Governing Collection:
The contractor is responsible for reviewing and quality assuring/quality checking (QA/QCing) the data. Specifically, the contractor
performs an exhaustive review of all invoices and fuel logs to verify that reported consumption and cost data are correct. Once the energy
data are reviewed and verified, the contractor will review and verify the GHG equivalents data, ensuring they are using the current
translation factors.
2c. Source Data Reporting
Form/mechanism for receiving data and entering into EPA system:

EPA has abandoned its earlier system of GHG emissions calculations and relies primarily on the DOE Data Portal to calculate its GHG
emissions. EPA merely reports out the DOE-generated data as its performance metrics.

Scope 1 emissions.  See the section on the Energy Consumption Goal for detail on Energy and Water Data collection.
Scope 2 emissions.  See the section on the Energy Consumption Goal for detail on Energy and Water Data collection.

For other foundation information needed for GHG emissions calculations, EPA relies primarily on federal-wide data systems, which are
used by all federal agencies with some minor exceptions, to collect the foundation data for GHG Scope 1 and 2 emissions. For example,
EPA utilizes GSA's FAS system to gather fleet fuel use; however, EPA keeps a separate parallel system to ensure data quality.

Timing and frequency of reporting:
The contractor provides GHG production information to the Agency quarterly and annually.	
                                                            26                                      Back to Table of Contents

-------
 Enabling Support Measures                           No Associated Objective                                       Measure 010


3.  Information Systems and  Data Quality Procedures	
3a.  Information Systems
Energy and Water Database.

The Energy and Water Database is a collection of numerous spreadsheets that track energy consumption and GHG production data supplied
by the Agency's contractor.

Beginning on January 31, 2011 and annually thereafter, EPA contractors enter basic energy use and green power purchase information into
a new Department of Energy Data Portal. This portal takes the energy use data and green power purchase information for each federal
agency, for the previous fiscal year, and calculates Scope 1 and 2 GHG emissions.	
3b.  Data Quality Procedures
EPA's Sustainable Facilities Practices Branch compares reported and verified energy use at each reporting facility against previous years'
verified data to see if there are any significant and unexplainable increases or decreases in energy consumption and costs.
3c. Data Oversight
The Chief, Sustainable Facilities Practices Branch, is responsible for overseeing the  data entry into the DOE Data Portal. This position
manages EPA's energy conservation program, including forecasting, project development, data reporting, and EPA's GHG inventory.

Source Data Reporting Oversight Personnel:

Detailed Standard Operating Procedures have been developed that include specific requirements for quality control of energy data
collection and reporting, covering areas such as data verification, data entry, and other steps in the energy data reporting process.

Information Systems Oversight Personnel:

While EPA is still developing experience with advanced metering systems, it has procedures in place to ensure data accuracy.  These
include running manual data collection and advanced metering data collection in parallel, typically for at least one year, to confirm the
accuracy of advanced metered data.  We also compare current period information with historic information to identify any variances.

Agency feedback to DOE serves as a QA/QC mechanism for formula and conversion factor changes in the DOE Data Portal system.
3d.  Calculation Methodology
Timeframe: Cumulative from FY2008 to end of most recent fiscal year

The Department of Energy, EPA, and GSA in cooperation with CEQ and OMB developed Greenhouse Gas Accounting Guidance for
federal government GHG reporting in 2010.  DOE developed a data portal for federal  GHG reporting in the same year. This Data Portal
receives foundation data (i.e. energy use) and converts the data into GHG emissions for each federal agency. In January 2011, EPA entered

                                                            27                                     Back to Table of Contents

-------
 Enabling Support Measures                           No Associated Objective                                       Measure 010
the various energy, water, transportation, travel, and commuting data for FY 2008 and FY 2010 into the DOE Data Portal.  While some
calculations or conversion factors change periodically in the Data Portal, each change is vetted by federal government working groups,
DOE, CEQ and OMB.  EPA is currently in the process of uploading FY 2011 foundation data into the DOE Data Portal, and will complete
this by no later than January 31, 2012.
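
To make the cumulative-reduction arithmetic concrete, the following Python sketch applies the standard percentage-reduction formula to the FY 2008 baseline cited earlier in this DQR. The current-year value is a made-up placeholder; in practice the emissions figures come from the DOE Data Portal.

    # Illustrative only: cumulative percentage reduction in Scope 1 & 2 GHG
    # emissions relative to the FY 2008 baseline (140,911 MTCO2e per this DQR).
    BASELINE_FY2008_MTCO2E = 140_911

    def cumulative_pct_reduction(current_mtco2e, baseline_mtco2e=BASELINE_FY2008_MTCO2E):
        """Percent reduction from baseline; a negative result means an increase."""
        return 100.0 * (baseline_mtco2e - current_mtco2e) / baseline_mtco2e

    current = 132_000.0  # hypothetical current-year inventory (placeholder)
    print(f"Cumulative reduction since FY 2008: {cumulative_pct_reduction(current):.1f}%")
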
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The Chief, Sustainable Facilities Practices Branch, is responsible for overseeing the data entry into the DOE Data Portal. This position
manages EPA's energy conservation program, including forecasting, project development, data reporting, and EPA's GHG inventory.
4b. Data Limitations/Qualifications	
EPA does not currently have a formal meter verification program to ensure that an on-site utility meter reading corresponds to the charges
included in the utility bill. However, as EPA implements the advanced metering requirements of the Energy Policy Act of 2005 and the
Energy Independence and Security Act of 2007, which is currently underway, EPA will move to annual calibration of advanced meters.
4c. Third-Party Audits
Currently, EPA relies on DOE to maintain the appropriate conversion formulas to calculate GHG emissions.
 Record Last Updated: 02/13/2012 01:16:50 PM
                                                            28                                      Back to Table of Contents

-------
  Enabling Support Measures                        No Associated Objective                                   Measure 052
  Measure Code : 052 - Number of major EPA environmental systems that use the
  CDX electronic requirements enabling faster receipt, processing, and quality
  checking of data.
  Office of Environmental Information (OEI)
   1. Measure and DQR Metadata	
   Goal Number and Title                         Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title	
   Strategic Target Code and Title
   Managing Office                                Office of Information Collection
   Performance Measure Term Definitions
Major EPA Environmental Systems: Major environmental systems are those that use CDX services to support the electronic reporting or
exchange of information among trading partners or from the regulated entities to EPA.

Enabling Faster Receipt, Processing, and Quality Checking of Data: This terminology refers to the services used to ensure the quality of
data entering the Agency and that data are submitted much faster than by the previous legacy methods, e.g., electronic and Internet-based
submission as opposed to paper or other methods that involve mailing to the Agency.

CDX: Central Data Exchange. CDX is the point of entry on the Environmental Information Exchange Network (Exchange Network) for
environmental data submissions to the Agency.

CDX assembles the registration/submission requirements of many different data exchanges with EPA and the States, Tribes, local
governments and the regulated community into a centralized environment. This system improves performance tracking of external
customers and overall management by making those processes more consistent and comprehensive. The creation of a centralized
registration system, coupled with the use of web forms and web-based approaches to submitting the data,  invite opportunities to introduce
additional automated quality assurance procedures for the system and reduce human error. For more information, visit:
http://www.epa.gov/cdx/index.htm


 2. Data Definition and Source Reporting	
 2a. Original Data Source	
Users of CDX from the private sector and State, local, and Tribal governments; entered into the CDX Customer Registration Subsystem

                                                      29                                  Back to Table of Contents

-------
 Enabling Support Measures                          No Associated Objective                                      Measure 052

CDX Users at EPA program offices include the:
•       Office of Air and Radiation (OAR)
•       Office of Enforcement and Compliance Assurance (OECA)
•       Office of Environmental Information (OEI)
•       Office of Prevention, Pesticides and Toxic Substances (OPPTS)
•       Office of Solid Waste and Emergency Response (OSWER)
•       Office of Water (OW)	
2b. Source Data Collection	
Source Data Collection Methods:
Reports are routinely generated from log files on CDX servers that support user registration and identity management.

EPA QA Requirements/Guidance Governing Collection:
QA/QC is performed in accordance with a CDX Quality Assurance Plan ["Quality Assurance Project Plan for the Central Data Exchange,"
10/8/2004] and the CDX Design Document v.3. Appendix K registration procedures [Central Data Exchange Electronic Reporting
Prototype System Requirements : Version 3; Document number: EP005S3; December 2000]. Specifically, data are reviewed for
authenticity and  integrity. Automated edit checking routines are performed in accordance with program specifications and the CDX
Quality Assurance Plan. EPA currently has a draft plan developed in August 2007. In FY 2011, CDX will develop robust quality criteria,
which will include performance metric results and align with the schedule for the upcoming CDX contract recompete.

Spatial Detail Covered By the Source Data: This is not applicable other than a user's address.	
2c. Source Data Reporting	
Form/Mechanism for Receiving data and entering into EPA System:
CDX manages the collection of data and documents in a secure way either by users entering data onto web  forms or via a batch file
transfer, both of which are completed using the CDX  environment.  These data are then transported to the appropriate EPA system.

Timing and Frequency of Reporting:  Annual
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
CDX Customer Registration Subsystem. This subsystem is used to register external users for reporting or exchanging data with EPA via
CDX.

CDX completed its last independent security risk assessment in June 2011, and all vulnerabilities are being reviewed or addressed.

Additional Information:
                                                          30                                    Back to Table of Contents

-------
 Enabling Support Measures                           No Associated Objective                                        Measure 052
In addition, environmental data collected by CDX are delivered to National data systems in the Agency. Upon receipt, the National systems
often conduct a more thorough data quality assurance procedure based on more intensive rules that can be continuously changing based on
program requirements. As a result, CDX and these National systems appropriately share the responsibility for ensuring environmental data
quality.
3b. Data Quality Procedures
The CDX system collects, reports, and tracks performance measures on data quality and customer service. While its automated routines are
sufficient to screen systemic problems/issues, a more detailed assessment of data errors/problems generally requires a secondary level of
analysis that takes time and human resources.

CDX incorporates a number of features that reduce errors in registration data and contribute greatly to the quality of environmental data
entering the Agency. These features include pre-populating data either from CDX or National systems, conducting web-form edit checks,
implementing XML schemas for basic edit checking, and providing extended quality assurance checks for selected Exchange Network data
flows using Schematron.
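
To illustrate the kind of automated web-form edit check described above, the Python sketch below applies a few simple field rules to a registration record. The field names and rules are hypothetical, not CDX's actual registration schema, and the XML-schema and Schematron checks mentioned above are not shown.

    # Illustrative only: simple registration-field edit checks.
    import re

    def edit_check_registration(record):
        """Return a list of edit-check failures for a registration record."""
        errors = []
        if not record.get("organization"):
            errors.append("organization is required")
        if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", record.get("email", "")):
            errors.append("email address is malformed")
        if not re.fullmatch(r"\d{5}(-\d{4})?", record.get("zip", "")):
            errors.append("ZIP code must be 5 or 9 digits")
        return errors

    print(edit_check_registration(
        {"organization": "Example Co", "email": "user@example.org", "zip": "20460"}))
    # -> [] (the record passes all edit checks)
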
3c. Data Oversight
Although not officially designated as such, CDX is a general support application that provides centralized services to a multitude of
program offices in the Agency and data trading partners on the Exchange Network. In general, EPA Program Office System Managers and
their management chains are responsible for oversight of the data quality.  The individual most directly responsible for data integrity is
the Chief of the Information Technology Branch.

3d. Calculation Methodology	
Unit of analysis: Systems

No data transformations occur.
4.  Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Oversight of Final Reporting: Reports on CDX quality and performance are conducted on an annual basis. The reports consist of both
quantitative measures from system logs and qualitative measures from user and program office surveys.

Timing of Results Reporting:
Annually	
4b. Data Limitations/Qualifications
The potential error in registration data under CDX responsibility has been assessed to be less than 1%.  This is accomplished through a
combination of automated edit checks in web form fields and processes in place to confirm the identity of individuals prior to approving
access to CDX data flows.
4c. Third-Party Audits
                                                            31                                      Back to Table of Contents

-------
 Enabling Support Measures                             No Associated Objective                                          Measure 052
Third-party security risk assessments are conducted every three years in accordance with FISMA requirements. Alternatives analysis
reviews are also conducted in accordance with OMB CPIC requirements. Lastly, ad hoc third-party reviews are conducted internally.
 Record Last Updated: 02/13/2012 01:16:48 PM
                                                               32                                        Back to Table of Contents

-------
  Enabling Support Measures                        No Associated Objective                                    Measure 053
  Measure Code :  053 - States, tribes and territories will be able to exchange data with
  CDX through nodes in real time, using standards and automated data-quality
  checking.
  Office of Environmental Information (OEI)
   1. Measure and DQR Metadata	
   Goal Number and Title                         Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title	
   Strategic Target Code and Title
   Managing Office                                Office of Information Collection
   Performance Measure Term Definitions
Able to exchange data: A trading partner has the programmatic and technical infrastructure in place to exchange data across the Exchange
Network.

Nodes: Nodes are points of presence on the Internet which are used to support the secure transport of data to trusted trading partners.

Real-time: When the data is generated and approved, it is automatically transported to the destination of another trading partner.

CDX: Central Data Exchange. CDX is the point of entry on the Environmental Information Exchange Network (Exchange Network) for
environmental data submissions to the Agency.

CDX assembles the registration/submission requirements of many different data exchanges with EPA and the States, Tribes, local
governments and  the regulated community into  a centralized environment. This system improves performance tracking of external
customers and overall management by making those processes more consistent and comprehensive.  The creation of a centralized
registration system, coupled with the use of web forms  and web-based approaches to submitting the data, invite opportunities to introduce
additional automated quality assurance procedures for the system and reduce human error. For more information, visit:
http://www.epa.gov/cdx/index.htm


 2. Data Definition and Source Reporting	
 2a. Original Data Source
Users of CDX from the private sector and State, local, and Tribal governments; entered into the CDX Customer Registration Subsystem

                                                       33                                  Back to Table of Contents

-------
 Enabling Support Measures                          No Associated Objective                                      Measure 053

CDX Users at EPA program offices include the:
•       Office of Air and Radiation (OAR)
•       Office of Enforcement and Compliance Assurance (OECA)
•       Office of Environmental Information (OEI)
•       Office of Prevention, Pesticides and Toxic Substances (OPPTS)
•       Office of Solid Waste and Emergency Response (OSWER)
•       Office of Water (OW)	
2b. Source Data Collection	
Source Data Collection Methods:
Reports are routinely generated from log files on CDX servers that support user registration and identity management.

Tabulation of records. Collection is ongoing.

EPA QA Requirements/Guidance Governing Collection:
QA/QC is performed in accordance with a CDX Quality Assurance Plan ["Quality Assurance Project Plan for the Central Data Exchange,"
10/8/2004] and the CDX Design Document v.3. Appendix K registration procedures [Central Data Exchange Electronic Reporting
Prototype System Requirements : Version 3; Document number: EP005S3; December 2000]. Specifically, data are reviewed for
authenticity and  integrity. Automated edit checking routines are performed in accordance with program specifications and the CDX
Quality Assurance Plan. EPA currently has a draft plan developed in August 2007. In FY 2011, CDX will develop robust quality criteria,
which will include performance metric results and align with the schedule for the upcoming CDX contract recompete.

Spatial Detail Covered By the Source Data: This is not applicable other than a user's address.	
2c. Source Data Reporting	
Form/Mechanism for Receiving Data and Entering into EPA System:
CDX manages the collection of data and documents in a secure way either by users entering data onto web  forms or via a batch file
transfer, both of which are completed using the CDX environment.  These data are then transported to the appropriate EPA system.

Timing and Frequency of Reporting:  Annual
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
CDX Customer Registration Subsystem. This subsystem is used to register external users for reporting or exchanging data with EPA via
CDX.

CDX completed its last independent security risk assessment in June 2011, and all vulnerabilities are being reviewed or addressed.
                                                          34                                     Back to Table of Contents

-------
 Enabling Support Measures                           No Associated Objective                                        Measure 053

Additional Information:
In addition, environmental data collected by CDX are delivered to National data systems in the Agency. Upon receipt, the National systems
often conduct a more thorough data quality assurance procedure based on more intensive rules that can be continuously changing based on
program requirements. As a result, CDX and these National systems appropriately share the responsibility for ensuring environmental data
quality.
3b. Data Quality Procedures
The CDX system collects, reports, and tracks performance measures on data quality and customer service. While its automated routines are
sufficient to screen systemic problems/issues, a more detailed assessment of data errors/problems generally requires a secondary level of
analysis that takes time and human resources.

CDX incorporates a number of features that reduce errors in registration data and contribute greatly to the quality of environmental data
entering the Agency. These features include pre-populating data either from CDX or National systems, conducting web-form edit checks,
implementing XML schemas for basic edit checking, and providing extended quality assurance checks for selected Exchange Network data
flows using Schematron.
3c. Data Oversight
Although not officially designated as such, CDX is a general support application that provides centralized services to a multitude of
program offices in the Agency and data trading partners on the Exchange Network. In general, EPA Program Office System Managers and
their management chains are responsible for oversight of the data quality.  The individual most directly responsible for data integrity is
the Chief of the Information Technology Branch.

3d. Calculation Methodology
Unit of analysis: Users

No data transformations occur.	


4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting
Oversight of Final Reporting: Reports on CDX quality and performance are conducted on an annual basis.  The reports consist of both
quantitative measures from system logs and qualitative measures from user and program office surveys.

Timing of Results Reporting:
Annually
4b. Data Limitations/Qualifications
The potential error in registration data under CDX responsibility has been assessed to be less than 1%. This is accomplished through a
combination of automated edit checks in web form fields and processes in place to confirm the identity of individuals prior to approving
                                                            35                                      Back to Table of Contents

-------
 Enabling Support Measures                             No Associated Objective                                         Measure 053
access to CDX data flows.	
4c. Third-Party Audits
Third-party security risk assessments are conducted every three years in accordance with FISMA requirements.  Alternatives analysis
reviews are also conducted in accordance with OMB CPIC requirements.  Lastly, ad hoc third-party reviews are conducted internally.
 Record Last Updated: 02/13/2012 01:16:48 PM
                                                               36                                        Back to Table of Contents

-------
  Enabling Support Measures                         No Associated Objective                                   Measure 098
  Measure Code : 098 - Cumulative percentage reduction in energy consumption.
  Office of Administration and Resource Management (OARM)
   1. Measure and DQR Metadata	
   Goal Number and Title                         Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title
   Strategic Target Code and Title
   Managing Office
   Performance Measure Term Definitions
Energy consumption:
Per guidance issued by DOE and CEQ on the implementation of the Energy Policy Act of 2005, the Energy Independence and Security Act
of 2007, and EO 13514, energy consumption is defined as the electricity, natural gas, steam, high temperature hot water, chilled water, fuel
oil, propane, and other energy used in EPA-occupied facilities where EPA pays directly for utilities. This group of "reporting facilities"
consists of EPA laboratories - either owned by EPA, leased by EPA, or leased by GSA for EPA. This definition of energy consumption
matches that used by all federal agencies in implementing the above-referenced legislation and EO. Energy consumption reductions are
measured using a BTUs/Gross Square Foot/Year metric that is described in the above-referenced guidance and used by all federal agencies.

EPA's 34 reporting facilities: The EPA facilities at which the Agency controls building operations, pays utility bills directly to the utility
company, and reports annual energy and water consumption data to the U.S. Department of Energy in order to demonstrate compliance with
federal energy and water reduction requirements.

FY2003 baseline:
EPA's energy consumption baseline for FY 2003 is 388,190 BTUs/GSF/Year.

Background:
Per statute and EO, EPA must reduce energy use at its "reporting" facilities by 3% annually, for a cumulative reduction of 30% by FY 2015,
from a FY 2003 baseline. EPA must reduce its energy use 18% below its FY 2003 baseline by the end of FY 2011, 21% by the end of FY
2012, and 24% by the end of FY 2013.  EPA's cumulative energy reduction was 18.1% in FY 2011.
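
A minimal sketch of the metric arithmetic, assuming only the figures stated above: the Python below computes the BTUs/Gross Square Foot/Year intensity and the cumulative percentage reduction against the FY 2003 baseline. The FY 2011 intensity is back-calculated from the reported 18.1% reduction purely for illustration.

    # Illustrative only: energy intensity and cumulative reduction vs. FY 2003.
    BASELINE_FY2003_BTU_PER_GSF = 388_190  # from this DQR

    def intensity(total_btu, gross_square_feet):
        """Annual energy intensity in BTUs per gross square foot."""
        return total_btu / gross_square_feet

    def pct_reduction(current_intensity):
        return 100.0 * (BASELINE_FY2003_BTU_PER_GSF - current_intensity) / BASELINE_FY2003_BTU_PER_GSF

    fy2011 = BASELINE_FY2003_BTU_PER_GSF * (1 - 0.181)  # about 317,928 BTUs/GSF/year
    print(f"FY 2011 cumulative reduction: {pct_reduction(fy2011):.1f}% (statutory target: 18%)")
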
 2. Data  Definition and Source Reporting
 2a. Original Data Source
                                                       37                                   Back to Table of Contents

-------
 Enabling Support Measures                            No Associated Objective                                         Measure 098
EPA Contractor	
2b. Source Data Collection
Source Data Collection Methods:
The Agency's contractor requests and collects quarterly energy and water reporting forms, utility invoices, and fuel consumption logs from
energy reporters at each of EPA's "reporting" facilities. The reported data are based on metered readings from the laboratory's utility bills
for certain utilities (natural gas,  electricity, purchased steam, chilled water, high temperature hot water, and potable water) and from on-site
consumption logs for other utilities (propane and fuel oil). In instances when data are missing and cannot be retrieved, reported data are
based on a proxy or historical average. It is relatively rare for EPA to use proxy data, and even more rare for EPA to use proxy data over a
significant period of time. In the relatively few cases where a meter breaks, or an advanced metering system loses data, EPA develops
proxy data to substitute for the missing data. For example, if a week's worth of data is missing from a particular meter, an average of the
previous week's data and the following week's data is used. These adjustments are similar to those used in the private sector and in most
Advanced Metering software systems, which typically flag duplicate data or missing data, and use comparable operating period data to fill
in any gaps. Again, the use of proxy data is rare, and would alter EPA's reported energy use by +/- 0.25% at most on an annual basis.
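
The proxy-data rule described above (a missing week of meter data is replaced by the average of the previous and following weeks) can be sketched as follows; the data layout and placeholder kWh values are hypothetical.

    # Illustrative only: fill a one-week gap with the mean of adjacent weeks.
    def fill_missing_weeks(weekly_use):
        """Replace None entries with the mean of the adjacent weeks' readings."""
        filled = list(weekly_use)
        for i, value in enumerate(filled):
            if value is None and 0 < i < len(filled) - 1:
                prev_week, next_week = filled[i - 1], filled[i + 1]
                if prev_week is not None and next_week is not None:
                    filled[i] = (prev_week + next_week) / 2.0
        return filled

    print(fill_missing_weeks([4200.0, 4150.0, None, 4180.0]))
    # -> [4200.0, 4150.0, 4165.0, 4180.0]
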

Date/Time Intervals Covered by Source Data:
Quarterly; FY2003 to present

EPA QA Requirements/Guidance Governing Collection:
The contractor is responsible for reviewing and quality  assuring/quality checking  (QA/QCing) the data. Specifically, the contractor
performs an exhaustive review of all invoices and fuel logs to verify that reported consumption and cost data are correct.  Once the energy
data is reviewed and verified, the contractor will review and verify the GHG equivalents data ensuring they are using the current translation
factors.	
2c. Source Data Reporting
Form/mechanism for receiving data and entering into EPA system:

EPA currently relies on a paper-based system to collect and report out energy data. A contractor receives hard or PDF copies of all utility
bills from reporting locations, then assimilates and reports out the data in predetermined quarterly and annual data reports.  The standard
operating procedures for Energy Reporting include multiple QA/QC practices at each step of the data collection and analysis process.

EPA's contractors use DOE-provided conversion factors to convert native fuel units into BTU equivalents. These conversion factors are
used by all federal agencies in their mandatory energy reporting. Shortly, EPA expects to switch a significant portion of its energy
reporting (approximately 74% of energy use) to an advanced metering system, but will run the current paper-based system for at least a
year to ensure quality and continuity of energy data.
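
As an illustration of converting native fuel units into BTU equivalents, the sketch below uses standard engineering conversion factors; the authoritative factors are the DOE-provided ones referenced above, which should be used in actual reporting.

    # Illustrative only: native fuel units to BTU equivalents.
    BTU_PER_UNIT = {
        "electricity_kwh": 3_412,         # site-delivered electricity
        "natural_gas_therm": 100_000,
        "fuel_oil_gallon": 138_700,       # No. 2 fuel oil, approximate
    }

    def to_btu(quantity, unit):
        return quantity * BTU_PER_UNIT[unit]

    total_btu = to_btu(1_000_000, "electricity_kwh") + to_btu(25_000, "natural_gas_therm")
    print(f"Total site energy: {total_btu:,.0f} BTU")
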

Timing and frequency of reporting:
EPA collects and distributes energy data on a quarterly basis.


                                                             38                                       Back to Table of Contents

-------
 Enabling Support Measures                           No Associated Objective                                       Measure 098

3.  Information Systems and Data Quality Procedures	
3a.  Information Systems
Energy and Water Database.

The Energy and Water Database is a collection of numerous spreadsheets that track energy consumption and GHG production data supplied
by the Agency's contractor.

In addition, beginning on January 31, 2011 and annually thereafter, EPA must enter this data into a Department of Energy Data Portal.
This portal gathers energy use data for each federal agency, for the previous fiscal year.	
3b.  Data Quality Procedures
EPA's Sustainable Facilities Practices Branch compares reported and verified energy use at each reporting facility against previous years'
verified data to see if there are any significant and unexplainable increases or decreases in energy consumption and costs.
3c. Data Oversight
The Chief, Sustainable Facilities Practices Branch, is responsible for overseeing the energy and water data collection system. This position
manages EPA's energy conservation program, including forecasting, project development, and data reporting.

Source Data Reporting Oversight Personnel:
Detailed Standard Operating Procedures have been developed that include specific requirements for quality control of energy data
collection and reporting, covering areas such as data verification, data entry, and other steps in the energy data reporting process.

Information Systems Oversight Personnel:
While EPA is still developing experience with advanced metering systems, it has procedures in place to ensure data accuracy. These
include running manual data collection and advanced metering data collection in parallel, typically for at least one year, to confirm the
accuracy of advanced metered data. We also compare current period information with historic information to identify any variances.
3d.  Calculation Methodology
Timeframe:

Cumulative from FY2003 to end of most recent fiscal year

Generally, any change in energy data reporting procedures involves running the previous method in parallel with the new method for at
least a year, prior to standardizing a new methodology. For example, when our Research Triangle Park, North Carolina laboratory installed
an advanced metering system, we ran the old and the new data streams for two years to ensure accuracy/continuity of data.

See attached Standard Operating Procedures.
                                                            39                                     Back to Table of Contents

-------
 Enabling Support Measures                           No Associated Objective                                       Measure 098
Attachment: EPA Energy Database SOP 1st Q FY 2012.pdf
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The Chief, Sustainable Facilities Practices Branch, is responsible for overseeing the energy and water data collection system.  This position
manages EPA's energy conservation program, including forecasting, project development, and data reporting. EPA reports energy data
internally to facility managers and staff involved in energy management, and annually to DOE and CEQ.

4b. Data Limitations/Qualifications
EPA does not currently have a formal meter verification program to ensure that an on-site utility meter reading corresponds to the charges
included in the utility bill. However, as EPA implements the advanced metering requirements of the Energy Policy Act of 2005 and the
Energy Independence and Security Act of 2007, which is currently underway, EPA will move to annual calibration of advanced meters.
4c. Third-Party Audits
EPA reports energy data internally to facility managers and staff involved in energy management, and annually to DOE and CEQ.
 Record Last Updated: 02/13/2012 01:16:49 PM
                                                            40                                      Back to Table of Contents

-------
  Enabling Support Measures                         No Associated Objective                                     Measure 35A
  Measure Code : 35A - Environmental and business actions taken for improved
  performance or risk reduction.
  Office of the Inspector General (OIG)
   1. Measure and DQR Metadata
   Goal Number and Title                          Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title
   Strategic Target Code and Title
   Managing Office                                Chief of Staff in the Immediate Office of the Inspector General
   Performance Measure Term Definitions
   Performance Measure Term Definitions
Number of environmental and business actions taken for improvements made or risks reduced in response to or influenced by OIG
recommendations.
OIG performance results are a chain of linked events, starting with OIG outputs (e.g., recommendations, reports of best practices,  and
identification of risks). The  subsequent actions taken by EPA or its stakeholders/partners, as a result of OIG's outputs, to  improve
operational  efficiency and environmental  program  delivery  are  reported  as  intermediate outcomes.  The resulting  improvements in
operational efficiency, risks reduced/eliminated, and conditions of environmental and human health are reported as outcomes. By using
common categories of performance measures, quantitative results can be summed and reported. Each outcome is  also qualitatively
described, supported, and linked to an OIG product or output.  The OIG can only control  its  outputs and has no authority,  beyond its
influence, to implement its recommendations that lead to environmental and management outcomes.

# Environmental/Health Improvements: Identifiable and documented environmental or human health improvements resulting from, or
influenced by, any OIG work. Measured by the number and types of improvements.  Narrative should describe the type of improvement
and the resulting better environmental or human health conditions.  The significance of improvements or impacts can be described in terms
of physical characteristics, numbers of people affected, health and behavioral changes, and compliance with standards, including a percent
change in a recognized environmental/health performance measure or indicator. Example: Faster cleanup of toxic waste dumps resulted
from a process improvement that was recommended by the OIG and implemented by EPA, reducing cases of illness.

# Best Practices Implemented: Environmental program or business/operational best practices that were disseminated through OIG work
and implemented by Agency offices, States, or other government agencies. Describe each best  practice implemented and its implication for
efficiency, effectiveness or economy. Example 1: An OIG audit finds that one Region has improved its grants process through a best
practice using a data control check system, resulting in better data accuracy and tracking of grant funds.  OIG auditors recommend that
                                                         41
                                               Back to Table of Contents

-------
  Enabling Support Measures                            No Associated Objective                                        Measure 35A
another Region use the same system, and the best practice is successfully implemented to improve the Region's grants program.  Example 2:
An audit report describes a successful new method, developed by one EPA Region, to track and pursue fines for violators of waste manifest
regulations.  As a result of the report, several other EPA Regions decide to use the new method.

# Risks Reduced or Eliminated: Environmental or business risks reduced or eliminated as a result of any OIG work. Measured in terms of
the number of types (not occurrences) of risks reduced or eliminated. Narrative should describe the risk by type: environmental or human
health exposure, incidence, financial, integrity, or security threat. Also counted are Agency actions, influenced by OIG recommendations
or advice, taken to resolve management challenges or Agency-level or material weaknesses.  Describe the FMFIA weakness or management
challenge addressed, the action taken, and its implications. Example: Indictment/conviction regarding illegal dumping, or closure of a
fraudulent asbestos removal company, reduces the risk of exposure to harmful pollutants.

Additional Information:

U.S. EPA, Office of Inspector General, Audits, Evaluations, and Other Publications. Available on the Internet at www.epa.gov/oig,
last updated August 2011.

Federal Government Inspector General Quality Standards.
Except for justified exceptions, OIG adheres to the following standards, which apply across the federal government:
• Overall Governance: Quality Standards for Federal Offices of Inspector General.   (President's Council on Integrity and Efficiency
(PCIE) and Executive Council on Integrity and Efficiency (ECIE), October 2003). (http://www.ignet.gov/pande/standards/igstds.pdf) This
document contains quality standards for the  management, operation and conduct of the Federal Offices of Inspector General (OIG). This
document specifies that each federal OIG shall conduct, supervise, and coordinate its audits, investigations, inspections, and evaluations in
compliance with the applicable professional standards listed below:
• For Investigations: Quality Standards for Investigations . (President's Council on Integrity  and Efficiency (PCIE) and Executive Council
on Integrity and Efficiency (ECIE), December 2003). http://www.ignet.gov/pande/standards/invstds.pdf Consistent with appropriate
Department of Justice Directives.
• For Inspections and Evaluations: Quality Standards for Inspections . (President's Council  on Integrity and Efficiency (PCIE) and
Executive Council on Integrity and Efficiency  (ECIE), January 2005). http://www.ignet.gov/pande/standards/oeistds.pdf.
• For Audits: Government Auditing Standards, issued by the US Government Accountability Office (GAO). The professional standards and
guidance in the Yellow Book are commonly referred to as generally accepted government auditing standards (GAGAS). These standards
and guidance provide a framework for conducting high quality government audits and attestation engagements with competence, integrity,
objectivity, and independence. The current version of the Yellow Book (July 2007) can be located in its entirety at the following Website:
www.gao.gov/govaud/d07162g.pdf.

EPA OIG-Specific Operating Standards.  The Project Management Handbook  is the Office of Inspector General (OIG) policy document
for conducting audit,  program evaluation, public liaison, follow-up, and related projects. The Handbook describes the processes and
standards the OIG uses to conduct the various phases of its work and helps ensure the quality, consistency, and timeliness of its products.
Each OIG office may issue, upon  approval by the Inspector General, supplemental guidance over assignments for which that office has
                                                             42                                      Back to Table of Contents

-------
  Enabling Support Measures                           No Associated Objective                                       Measure 35A
responsibility.... This Handbook describes the audit, evaluation, public liaison, and follow-up processes and phases; it does not address OIG
investigative processes although it does apply to audits/evaluations performed by the Office of Investigations (OI) [within EPA OIG]....OIG
audit, program evaluation, public liaison, and follow-up reviews are normally conducted in accordance with appropriate Government
Auditing Standards , as issued by the Comptroller General of the United States, commonly known as the Yellow Book.

Staff may use GAGAS in conjunction with other sets of professional standards. OIG reports may cite the use of other standards as
appropriate. Teams should use GAGAS as the prevailing standard for conducting a review and reporting results should inconsistencies exist
between GAGAS and other professional standards.

For some projects, adherence to all of the GAGAS may not be feasible or necessary. For these projects, the Product Line Director (PLD)
will provide a rationale, the applicable standards not followed, and the impact on project results. The PLD's decision should be made during
the design meeting, documented in the working papers, and described in the Scope and Methodology section of the report. [Source: Project
Management Handbook ].

Product Line Directors.  Product Line Directors oversee one or more particular work areas and multiple project teams.  The OIG product
lines are as  follows: Air/Research and Development; Water; Superfund/Land; Cross Media; Public Liaison and Special Reviews;
Assistance Agreements; Contracts; Forensic Audits; Financial  Management; Risk Assessment and Program Performance; Information
Resources Management; Investigations; US Chemical Safety and Hazard Investigation Board; Legal Reviews; Briefings; OIG Enabling
Support Programs; and Other Activities.

For more information on the PLD responsibilities, see Chapter 5 of the OIG Project Management Handbook , attached to this record.


 2. Data Definition and Source Reporting	
 2a. Original Data Source

 Data track EPA programs' environmental and business actions taken or improvements made and risks reduced or avoided as a result of
 OIG performance evaluations, audits, inspections and investigations.  OIG collects such data from EPA programs and from EPA's
 contractors, partners and stakeholders.	
 2b. Source Data Collection

 Collection mode of information supporting this measure can vary.

 OIG must determine whether the Agency's/auditee's corrective actions have adequately addressed and corrected the problems identified in
 the report.  (Additional information on OIG's follow-up process can be found at
 http://oigintra.epa.gov/policy/policies/documents/OIG-04Follow-upPolicy.pdf.)
 Project Managers (PMs) may  make and document periodic inquiries concerning the Agency's/auditee's progress in implementing
 corrective actions resulting from OIG work.  As part of this process, OIG may also request documentation supporting the progress or
 completion of actions taken to implement the Agency's corrective actions plan. OIG may also request the Agency's views and concurrence
                                                            43                                     Back to Table of Contents

-------
 Enabling Support Measures                           No Associated Objective                                       Measure 35A
on the actual benefits resulting from the report. When a report is closed upon issuance, the transmittal memorandum should state that OIG
will make periodic inquiries of the Agency's/auditee's progress in implementing corrective actions resulting from OIG work.

EPA Manual 2750 provides policy and direction for program managers to report and coordinate their corrective action plans with the OIG.
(EPA's Audit Management Process, 2750 Change 2, December 3, 1988, Website:
http://intranet.epa.gov/rmpolicy/ads/manuals/2750_2_t.pdf.) This document requires OIG, as part of an effective system of internal
controls, to evaluate the adequacy of such efforts before the recommendations can be closed out in the Agency's follow-up database.
Evaluation of the corrective actions taken will allow the OIG to measure performance and accountability against OIG's performance targets
and strategic goals. On an annual basis, a portion of OIG resources will be devoted to conducting follow-up reviews on specific significant
reports. Each Assistant Inspector General (AIG), in consultation with his or her Product Line Director (PLD), will identify such work
during the annual planning process.
2c. Source Data Reporting
Data come from OIG audits, evaluations, and investigations that are performed in strict compliance with the professional standards of the
US Government Accountability Office and the US Department of Justice and are subject to independent peer review. Data in the form of
activities, outputs, and outcomes are entered by designated staff into the Inspector General Enterprise Management System. All original
data are quality controlled for compliance with professional standards, and data entered are quality reviewed for accuracy, completeness,
timeliness, and adequacy of support.


3.  Information Systems and Data Quality Procedures	
3a.  Information Systems

OIG Performance Measurement and Results System (PMRS). PMRS captures and aggregates information on an array of OIG
measures in a logic model format, linking  immediate outputs with long-term intermediate outcomes and results. (The logic model can be
found in OIG's Annual Performance Report at http://www.epa.gov/oig/planning.htm.)  PMRS is the OIG official system for collecting
performance results data, in relation to its  strategic and annual goals. All outputs (recommendations, best practices, risks identified) and
outcome results (actions taken, changes in policies, procedures, practices, regulations, legislation, risks reduced, certifications for decisions,
environmental improvements) influenced by OIG's current or prior work, and recognized during FY 2010 and beyond, should be entered
into PMRS.

PMRS was developed as a prototype in FY 2001. Since then, there have been system improvements for ease of use. For example, during
FY 2009 the PMRS was converted to a relational database directly linked to the new Inspector General Enterprise Management System
(IGEMS).

IGEMS is an OIG employee time-tracking and project cost-tracking database that generates management reports.  IGEMS is used to
generate a project tracking number and a work product number. This system also tracks project progress and stores all related cost
information.

                                                           44                                      Back to Table of Contents

-------
 Enabling Support Measures                           No Associated Objective                                       Measure 35A

AutoAudit and Teammate.  These are repositories for all project working papers.	
3b. Data Quality Procedures
Data quality assurance and control are performed as an extension of OIG products and services, subject to rigorous compliance with the
Government Auditing Standards of the Comptroller General, and are regularly reviewed by OIG management, an independent OIG
Management Assessment Review Team, and external independent peer reviews (e.g., by accountancies qualified to evaluate OIG
procedures against Government Auditing Standards). Each Assistant Inspector General certifies the completeness and accuracy of
performance data.

All data reported are audited internally for accuracy and consistency.

OIG processes, including data processes, are governed by the quality standards described in "Additional Information" under the
Performance Term Definition field.  Notably, the Project Management Handbook (which governs audits) provides a QA checklist (see
Appendix 4, of the 2008 Project Management Handbook , attached to this record). The Project Manager (PM) is responsible for completing
the Quality Assurance (QA) checklist throughout the project.  The PM prepares the checklist and submits it to the Product Line Director
(PLD) upon completion of the Post Reporting Phase of the Project.  The Checklist should be completed for all projects, recognizing that
some steps in the checklist may not be applicable to all projects.  The QA Checklist asks teams to ensure the integrity of data that resides in
all of the OIG data systems. [Source: Project Management Handbook ].
 Attachment: Policy 101. PMH. Final. 05.08.08.pdf


During FY 2008, OIG implemented an Audit Follow-up Policy to independently verify the status of Agency actions on OIG
recommendations, which serve as the basis for OIG intermediate outcome results reported in the OIG PMRS.

(Additional information on the OIG's follow-up process can be found at
http://oigintra.epa.gov/policy/policies/documents/OIG-04Follow-upPolicy.pdf.)
3c. Data Oversight	
There are three levels of PMRS access: View Only, Edit and Administrator.  Everyone with IGEMS access has view only privileges.
Individuals tasked with adding or editing PMRS entries must be granted PMRS Edit privileges. Contact a PMRS administrator to request
Edit privileges.

Each Product Line Director (PLD), each of whom oversees one or more OIG work areas (e.g., Superfund, Contracts, etc.) and multiple
                                                            45                                      Back to Table of Contents

-------
 Enabling Support Measures                           No Associated Objective                                        Measure 35A
project management teams, is responsible for ensuring that teams maintain proper integrity, accessibility, and retrievability of working
papers in accordance with OIG policies. Likewise, they must ensure that information in OIG's automated systems is updated regularly by
the team. (See field 2i, Additional Information, for more information about PLDs.)
3d. Calculation Methodology
Database measures include numbers of: 1) recommendations for environmental and management improvement; 2) legislative, regulatory,
policy, directive, or process changes; 3) environmental, program management, security, and resource integrity risks identified, reduced, or
eliminated; 4) best practices identified and implemented; 5) examples of environmental and management actions taken and improvements
made; 6) monetary value of funds questioned, saved, fined, or recovered; 7) criminal, civil, and administrative actions taken; 8) public or
congressional inquiries resolved; and 9) certifications, allegations disproved, and cost corrections.

Because intermediate and long-term results may not be realized for a period of several years, only verifiable results are reported in the
year completed.

Unit of measurement:  Individual outcomes/actions
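
A minimal Python sketch of this counting methodology follows; the category labels and counts are illustrative only. Each verifiable result is recorded individually and results are summed by measure category:

from collections import Counter

results_reported = [   # hypothetical entries for one reporting year
    "recommendation", "recommendation", "risk_identified",
    "best_practice", "action_taken", "risk_identified",
]
totals = Counter(results_reported)
print(totals["recommendation"])   # 2 individual outcomes/actions
print(sum(totals.values()))       # 6 results reported for the year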
4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting

Data come from OIG audits, evaluations, and investigations performed in strict compliance with the professional standards of the US
Government Accountability Office and the US Department of Justice, and are subject to independent peer review. Data in the form of
activities, outputs, and outcomes are entered by designated staff into the Inspector General Enterprise Management System. All original
data are quality controlled for compliance with professional standards, and data entered are reviewed for accuracy, completeness,
timeliness, and adequate support. All data entered are carefully reviewed several times a year as they are entered and are subsequently
reported on a quarterly basis. The OIG Assistant Inspectors General oversee the quality of the data used to generate reports of
performance. The Office of the Chief of Staff oversees data quality, and the IG reviews the documents and data used for external
consumption. Data are audited and quality tested on a continuous basis through several steps from origin to final use.
4b. Data Limitations/Qualifications

Because intermediate and long-term results may not be realized for several years, only verifiable results are reported in the year they are
completed.

Although all OIG staff are responsible for data accuracy in their products and services, there is a possibility of incomplete, miscoded, or
missing data in the system due to human error or time lags. Data supporting achievement of results are often from indirect or external
sources, with their own methods or standards for data verification/validation. Such data are  reviewed according to the appropriate OIG
quality standards (see "Additional Information"), and any questions about the quality of such data are documented in OIG reports and/or
the PMRS.

The error rate for outputs is estimated at +/-2%, while the error rate for reported long-term outcomes is presumably greater because of the
longer period needed for tracking results and difficulty in verifying a nexus between our work and subsequent actions and impacts beyond
OIG's control. (The OIG logic model in the Annual Performance Report clarifies the kinds of measures that are output-oriented, like risks
identified, versus outcome-oriented, like risks reduced.) Errors tend to be those of omission. Some errors may result from duplication as
well.
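
As a small illustration of the stated +/-2% output error rate (the count below is hypothetical):

reported_outputs = 250
error_rate = 0.02
low = reported_outputs * (1 - error_rate)
high = reported_outputs * (1 + error_rate)
print(f"{reported_outputs} outputs, plausible range {low:.0f}-{high:.0f}")   # 245-255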
4c. Third-Party Audits
There have not been any previous audit findings or reports by external groups on data or database weaknesses in PMRS.

A December 2008 independent audit
(www.epa.gov/oig/reports/2009/QualityReviewofEPAOIG-20081216.pdf) found the following with regard to general OIG processes:
"We determined that the EPA OIG audits methodology, policies and procedures adequately complied with the Government Auditing
Standards. The EPA OIG quality control system adequately documented compliance with professional and auditing standards for:
Independence; Professional Judgment; Competence; Audit Planning; Supervision; Evidence and Audit Documentation; Reports on
Performance Audits; Nonaudit Services; and the Quality Control Process. The auditors documented, before the audit report was issued,
evidence of supervisory review of the work performed that supports findings, conclusions, and recommendations contained in the audit
report.
"We determined that EPA OIG adequately followed the quality control policies established in the EPA OIG Project Management
Handbook  for conducting audit, program evaluation,  and related projects. The audit documentation adequately includes evidence of work
performed in the major three phases: Preliminary Research, Field Work and Reporting.
"We determined that EPA OIG adequately followed the standards and principles set forth in the PCIE and Executive Council on Integrity
and Efficiency Quality Standards for Investigations, as applicable.  The investigation adequately documented compliance with the
guidelines applicable to the investigation efforts of criminal investigators working for the EPA OIG."
The audit also identified two minor conditions, related to working paper review/approval and completion/update status. OIG agreed with the
auditor recommendations  related to the conditions and adapted its Project Management Handbook to address the concerns.

A June 2010 internal OIG review of OIG report quality (which included a review of reporting procedures) found no substantial issues (see
http://www.epa.gov/oig/reports/2010/20100602-10-N-0134.pdf).	
 Record Last Updated: 02/13/2012 01:16:50 PM
  Measure Code : 35B - Environmental and business recommendations or risks
  identified for corrective action.
  Office of the Inspector General (OIG)
   1. Measure and DQR Metadata
   Goal Number and Title                          Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title	
   Strategic Target Code and Title
   Managing Office                                  Chief of Staff in the Immediate Office of the Inspector General
   Performance Measure Term Definitions
 This is a measure of the number of OIG recommendations or risks identified for action, correction or improvement.

OIG performance results are a chain of linked events, starting with OIG outputs (e.g., recommendations, reports of best practices,  and
identification of risks). The  subsequent actions taken by  EPA  or its stakeholders/partners,  as a result of OIG's outputs, to  improve
operational  efficiency and environmental program delivery  are reported as intermediate outcomes.  The resulting  improvements in
operational efficiency, risks reduced/eliminated, and conditions of environmental and human  health  are reported as outcomes. By using
common categories of performance  measures, quantitative results can  be summed and reported.  Each outcome is  also qualitatively
described, supported, and linked to an  OIG product or output.  The OIG can only  control its outputs and has no authority,  beyond its
influence, to implement its recommendations that lead to environmental and management outcomes.

# Recommendations for Improvement: Number of recommendations for action in OIG reports, formal presentations or analyses.  When
the final product is issued, the number of report recommendations should be recorded in PMRS whether or not the Agency has concurred
with or implemented the recommendations. (Do not count observations, suggestions, or editorial comments.) Describe each
recommendation and its implications for environmental or management action and improvement.

# Best Practices Identified: Best practices identified by OIG work for environmental or management program implementation to resolve a
problem or risk, or improve a condition,  process or result (from any source: EPA, State, other agency, etc.). Results are measured by the
number of best practices identified. Narrative should explain the significance by describing the potential environmental or management
change, action or impact. Example 1: In reviewing several States' partnership roles for an audit issue, we found that one State had
developed very efficient and cost-effective water quality measures  that could be applicable to other States or nationwide. Example 2: An
audit determines that a Region has improved its management of a grant program because of a workgroup the Region set up to coordinate
grant and cooperative agreement functions.

# Environmental or Business/ Operational/ Control Risks Identified (including noncompliance): Actual or potential environmental,
health or operational risks identified by any OIG work. Measured in terms of the number of risks by type including the number of FMFIA
disclosed program assurance issues, EPA management challenges and specific risks or internal control weaknesses.  Includes issues
presented in EPA financial statement audits and internal OIG reviews. Narrative should describe the risks and potential/actual
environmental, health, and safety vulnerabilities, behaviors or conditions, risk of financial or resource loss or internal control weakness and
their implications. Example 1: An OIG report on hog farm waste identifies environmental risks for drinking water contamination in nearby
wells.  Example 2: An OIG report identified that grants were given to grantees without specific performance objectives or verification that
the grantees had acceptable financial accountability systems or controls.

Additional Information:

U.S. EPA, Office of Inspector General, Audits, Evaluations, and Other Publications;                   Available on the Internet at
www.epa.gov/oig, last updated August 2011.

Federal Government Inspector General Quality Standards.
Except for justified exceptions, OIG adheres to the following standards, which apply across the federal government:
• Overall Governance: Quality Standards for Federal Offices of Inspector General.  (President's Council on Integrity and Efficiency
(PCIE) and Executive  Council on Integrity and Efficiency (ECIE), October 2003). (http://www.ignet.gov/pande/standards/igstds.pdf)  This
document contains quality  standards for the management, operation  and conduct of the Federal Offices of Inspector General (OIG). This
document specifies that each federal OIG shall conduct, supervise, and coordinate its audits, investigations, inspections, and evaluations in
compliance with the applicable professional standards listed below:
• For Investigations:  Quality Standards for Investigations . (President's Council on Integrity and Efficiency (PCIE) and Executive Council
on Integrity and Efficiency (ECIE), December 2003). http://www.ignet.gov/pande/standards/invstds.pdf Consistent with appropriate
Department of Justice  Directives.
• For Inspections and Evaluations:  Quality Standards for Inspections . (President's Council on Integrity and Efficiency (PCIE) and
Executive Council on Integrity and Efficiency (ECIE), January 2005). http://www.ignet.gov/pande/standards/oeistds.pdf
• For Audits: Government Auditing Standards, issued by the US Government Accountability Office (GAO). The professional standards and
guidance in the Yellow Book are commonly referred to as generally accepted government auditing standards (GAGAS). These standards
and guidance provide a framework for conducting high quality government audits and attestation engagements with competence, integrity,
objectivity, and independence. The current version of the Yellow Book (July 2007) can be located in its entirety at the following Website:
www.gao.gov/govaud/d07162g.pdf

EPA OIG-Specific  Operating Standards. The Project Management Handbook is the Office of Inspector General (OIG) policy document
for conducting audit, program evaluation, public liaison, follow-up,  and related projects. The Handbook describes the processes and
standards the OIG uses to conduct the various phases of its work and helps ensure the quality, consistency, and timeliness of its products.
Each OIG office may issue, upon approval by the Inspector General, supplemental guidance over assignments for which that office has
responsibility.... This Handbook describes the audit, evaluation, public liaison, and follow-up processes and  phases; it does not address OIG
investigative processes although it does apply to audits/evaluations performed by the Office of Investigations (OI) [within EPA OIG]....OIG
audit, program evaluation, public liaison, and follow-up reviews are normally conducted in accordance with appropriate Government
Auditing Standards , as issued by the Comptroller General of the United States, commonly known as the Yellow Book.

Staff may use GAGAS in conjunction with other sets of professional standards. OIG reports may cite the use of other standards as
appropriate. Teams should use GAGAS as the prevailing standard for conducting a review and reporting results should inconsistencies exist
between GAGAS and other professional standards.

For some projects, adherence to all of the GAGAS may not be feasible or necessary. For these projects, the Product Line Director (PLD)
will provide a rationale, the applicable standards not followed, and the impact on project results. The PLD's decision should be made during
the design meeting, documented in the working papers, and described in the Scope and Methodology section of the report. [Source: Project
Management Handbook ].

Product Line Directors.  Product Line Directors oversee one or more particular work areas and multiple project teams.  The OIG product
lines are as  follows: Air/Research and Development; Water; Superfund/Land; Cross Media; Public Liaison and Special Reviews;
Assistance Agreements; Contracts; Forensic Audits; Financial Management; Risk Assessment and Program Performance; Information
Resources Management; Investigations; US Chemical Safety and Hazard Investigation Board; Legal Reviews; Briefings; OIG Enabling
Support Programs; and Other Activities.

For more information on the PLD responsibilities, see Chapter 5 of the OIG Project Management Handbook , attached to this record.


 2. Data Definition and Source Reporting	
 2a. Original Data Source

 Data track environmental and business recommendations or risks identified for corrective action as a result of OIG performance
 evaluations, audits, inspections and investigations.  OIG collects such data from EPA programs and from EPA's contractors, partners and
 stakeholders.	
 2b. Source Data Collection

 The mode of collecting the information supporting this measure can vary.

 OIG must determine whether the Agency's/auditee's corrective actions have adequately addressed and corrected the problems identified in
 the report.  (Additional information on OIG's follow-up process can be found at
 at http://oigintra.epa.gov/policy/policies/documents/OIG-04F ollow-upPolicy.pdf)
 Project Managers (PMs) may make and document periodic inquiries concerning the Agency's/auditee's progress in implementing
 corrective actions resulting from OIG work.  As part of this process, OIG may also request documentation supporting the progress or
 completion of actions taken to implement the Agency's corrective actions plan. OIG may also request the Agency's views and concurrence
 on the actual benefits resulting from the report. When a report is closed upon issuance, the transmittal memorandum should state that OIG
 will make periodic inquiries of the Agency's/auditee's progress in implementing corrective actions resulting from OIG work.

EPA Manual 2750 provides policy and direction for program managers to report and coordinate their corrective action plans with the OIG.
(EPA's Audit Management Process, 2750 Change 2, December 3, 1988, Website:
http://intranet.epa.gov/rmpolicy/ads/manuals/2750 2 t.pdf.) This document requires OIG, as part of an effective system of internal
controls, to evaluate the adequacy of such efforts before the recommendations can be closed out in the Agency's follow-up database.
Evaluation of the corrective actions taken will allow the OIG to measure performance and accountability against OIG's performance targets
and strategic goals. On an annual basis, a portion of OIG resources will be devoted to conducting follow-up reviews on specific significant
reports. Each Assistant Inspector General (AIG), in consultation with his or her Product Line Director  (PLD), will identify such work
during the annual planning process.	
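
A minimal sketch of the close-out rule described above follows; the status names are assumptions for illustration, and the governing policies are EPA Manual 2750 and the OIG follow-up policy, not this code:

def close_recommendation(status: str, oig_verified_adequate: bool) -> str:
    # OIG must evaluate the adequacy of corrective actions before a
    # recommendation can be closed out in the Agency's follow-up database.
    if status == "corrective_action_reported" and oig_verified_adequate:
        return "closed"
    return status   # remains in its current state pending verification

print(close_recommendation("corrective_action_reported", False))  # unchanged
print(close_recommendation("corrective_action_reported", True))   # closed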
 2c. Source Data Reporting

Data come from OIG audits, evaluations, and investigations performed in strict compliance with the professional standards of the US
Government Accountability Office and the US Department of Justice, and are subject to independent peer review. Data in the form of
activities, outputs, and outcomes are entered by designated staff into the Inspector General Enterprise Management System. All original
data are quality controlled for compliance with professional standards, and data entered are reviewed for accuracy, completeness,
timeliness, and adequate support.


3.  Information Systems and Data Quality Procedures
3a. Information Systems

OIG Performance Measurement and Results System (PMRS). PMRS captures and aggregates information on an array of OIG
measures in a logic model format, linking immediate outputs with long-term intermediate outcomes and results. (The logic model can be
found in OIG's Annual Performance Report at http://www.epa.gov/oig/planning.htm.) PMRS is the OIG official system for collecting
performance results data, in relation to its strategic and annual goals.  All outputs (recommendations, best practices, risks identified) and
outcome results (actions taken, changes in policies, procedures, practices, regulations, legislation, risks reduced, certifications for decisions,
environmental improvements) influenced by OIG's current or prior work, and recognized during FY 2010 and beyond, should be entered
into PMRS.

PMRS was developed as a prototype in FY 2001. Since then, there have been system improvements for ease of use. For example, during
FY 2009 the PMRS was converted to a relational database directly linked to the new Inspector General Enterprise Management System
(IGEMS).

IGEMS is an OIG employee time-tracking and project cost-tracking database that generates management reports.  IGEMS is used to
generate a project tracking number and a  work product number. This system also tracks project progress and stores all related cost
information.

AutoAudit and TeamMate. These are repositories for all project working papers.
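
A minimal sketch of the relational linkage described above, with PMRS result rows referencing IGEMS project tracking numbers; the table and column names are hypothetical (the real systems are not SQLite), so this only illustrates the structure:

import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE igems_project (project_no TEXT PRIMARY KEY, title TEXT);
CREATE TABLE pmrs_result (
    id INTEGER PRIMARY KEY,
    project_no TEXT REFERENCES igems_project(project_no),
    kind TEXT,          -- 'output' (e.g., recommendation) or 'outcome'
    description TEXT
);
""")
con.execute("INSERT INTO igems_project VALUES ('2010-P-0001', 'Grant oversight review')")
con.execute("INSERT INTO pmrs_result (project_no, kind, description) "
            "VALUES ('2010-P-0001', 'output', 'Recommendation to tighten grant controls')")
for row in con.execute(
        "SELECT p.title, r.kind, r.description "
        "FROM pmrs_result r JOIN igems_project p USING (project_no)"):
    print(row)   # each performance result stays linked to its project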
3b. Data Quality Procedures
Data quality assurance and control are performed as an extension of OIG products and services, subject to rigorous compliance with the
Government Auditing Standards of the Comptroller General, and are regularly reviewed by OIG management, an independent OIG
Management Assessment Review Team, and external independent peer reviews (e.g., by accountancies qualified to evaluate OIG
procedures against Government Auditing Standards). Each Assistant Inspector General certifies the completeness and accuracy of
performance data.

All data reported are audited internally for accuracy and consistency.

OIG processes, including data processes, are governed by the quality standards described in "Additional Information" under the
Performance Term Definition field.  Notably, the Project Management Handbook  (which governs audits) provides a QA checklist (see
Appendix 4 of the 2008 Project Management Handbook, attached to this record). The Project Manager (PM) is responsible for completing
the Quality Assurance (QA) checklist throughout the project.  The PM prepares the checklist and submits it to the Product Line Director
(PLD) upon completion of the Post Reporting Phase of the Project. The Checklist should be completed for all projects, recognizing that
some steps in the checklist may not be applicable to all projects.  The QA Checklist asks teams to ensure the integrity of data that resides in
all of the OIG data systems. [Source: Project Management Handbook ].
 Policy101.PMH.Final.05.08.08.pdf


During FY 2008, OIG implemented an Audit Follow-up Policy to independently verify the status of Agency actions on OIG
recommendations, which serve as the basis for OIG intermediate outcome results reported in the OIG PMRS.

(Additional information on the OIG's follow-up process can be found at
http://oigintra.epa.gov/policy/policies/documents/OIG-04Follow-upPolicy.pdf.)
3c. Data Oversight	
There are three levels of PMRS access: View Only, Edit and Administrator. Everyone with IGEMS access has view only privileges.
Individuals tasked with adding or editing PMRS entries must be granted PMRS Edit privileges. Contact a PMRS administrator to request
Edit privileges.

Each Product Line Director (PLD) oversees one or more OIG work areas (e.g., Superfund, Contracts) and multiple project management
teams and is responsible for ensuring that teams maintain proper integrity, accessibility, and retrievability of working papers in
accordance with OIG policies. Likewise, PLDs must ensure that information in OIG's automated systems is updated regularly by the
team. (See field 2i, Additional Information, for more information about PLDs.)
3d. Calculation Methodology
Database measures include numbers of:  1) recommendations for environmental and management improvement; 2) legislative, regulatory
policy, directive, or process changes; 3) environmental, program management, security and resource integrity risks identified, reduced, or
eliminated; 4) best practices identified and implemented; 5) examples of environmental and management actions taken and improvements
made; 6) monetary value of funds questioned, saved, fined, or recovered; 7) criminal, civil, and administrative actions taken; 8) public or
congressional inquiries resolved; and 9) certifications, allegations disproved, and cost corrections.

Because intermediate and long-term results may not be realized for several years, only verifiable results are reported in the year they are
completed.

Unit of measurement:  Individual recommendations/risks


4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting

The OIG Assistant Inspectors General oversee the quality of the data used to generate reports of performance. The Office of the Chief of
Staff oversees data quality, and the IG reviews the documents and data used for external consumption. Data are audited and quality tested
on a continuous basis through several steps from origin to final use.
4b. Data Limitations/Qualifications

Because intermediate and long-term results may not be realized for several years, only verifiable results are reported in the year they are
completed.

Although all OIG staff are responsible for data accuracy in their products and services, there is a possibility of incomplete, miscoded, or
missing data in the system due to human error or time lags. Data supporting achievement of results are often from indirect or external
sources, with their own methods or standards for data verification/validation. Such data are reviewed according to the appropriate OIG
quality standards (see "Additional Information"), and any questions about the quality of such data are documented in OIG reports and/or
the PMRS.

The error rate for outputs is estimated at +/-2%, while the error rate for reported long-term outcomes is presumably greater because of the
longer period needed for tracking results and difficulty in verifying a nexus between our work and subsequent actions and impacts beyond
OIG's control. (The OIG logic model in the  Annual Performance Report clarifies the kinds of measures that are output-oriented, like risks
identified, versus outcome-oriented, like risks reduced.) Errors tend to be those of omission. Some errors may result from duplication as
well.	
4c. Third-Party Audits
There have not been any previous audit findings or reports by external groups on data or database weaknesses in PMRS.

A December 2008 independent audit
(www.epa.gov/oig/reports/2009/QualityReviewofEPAOIG-20081216.pdf) found the following with regard to general OIG processes:
"We determined that the EPA OIG audits methodology, policies and procedures adequately complied with the Government Auditing
Standards. The EPA OIG quality control system adequately documented compliance with professional and auditing standards for:
Independence; Professional Judgment; Competence; Audit Planning; Supervision; Evidence and Audit Documentation; Reports on
Performance Audits; Nonaudit Services; and the Quality Control Process. The auditors documented, before the audit report was issued,
evidence of supervisory review of the work performed that supports findings, conclusions, and recommendations contained in the audit
report.
"We determined that EPA OIG adequately followed the quality control policies established in the EPA OIG Project Management
Handbook for conducting audit, program evaluation, and related projects. The audit documentation adequately includes evidence of work
performed in the major three phases: Preliminary Research, Field Work and Reporting.
"We determined that EPA OIG adequately followed the standards and principles set forth in the PCIE and Executive Council on Integrity
and Efficiency Quality Standards for Investigations, as applicable. The investigation adequately documented compliance with the
guidelines applicable to the investigation efforts of criminal investigators working for the EPA OIG."
The audit also identified two minor conditions, related to working paper review/approval and completion/update status. OIG agreed with the
auditor recommendations related to the conditions and adapted its Project Management Handbook  to address the concerns.

A June 2010 internal OIG review of OIG report quality (which included a review of reporting procedures) found no substantial issues (see
http://www.epa.gov/oig/reports/2010/20100602-10-N-0134.pdf).	
 Record Last Updated: 02/13/2012 01:16:50 PM
  Measure Code : 35C - Return on the annual dollar investment, as a percentage of
  the OIG budget, from audits and  investigations.
  Office of the Inspector General (OIG)
   1. Measure and DQR Metadata
   Goal Number and Title                           Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title
   Strategic Target Code and Title
   Managing Office                                  Chief of Staff in the Immediate Office of the Inspector General
   Performance Measure Term Definitions
This is a measure of the total dollar amount of questioned costs, cost efficiencies, civil settlements, fines and recoveries from OIG audits
and investigations compared to annual budget investments in the OIG.
OIG performance results are a chain of linked events, starting with OIG outputs (e.g., recommendations, reports of best practices, and
identification of risks). The  subsequent actions taken by EPA or its stakeholders/partners,  as a result of OIG's outputs,  to improve
operational  efficiency and environmental  program delivery are  reported as intermediate outcomes.  The resulting  improvements  in
operational efficiency, risks reduced/eliminated, and conditions of environmental and human  health are reported as outcomes. By using
common categories of performance  measures, quantitative results can be summed  and reported. Each  outcome is  also  qualitatively
described, supported, and linked to an OIG product or output. The OIG can only control its outputs and has  no authority, beyond its
influence, to implement its recommendations that lead to environmental and management outcomes.

$s Questioned Costs Sustained: Dollar amount of questioned costs accepted or agreed to by the Agency or other action official. Describe
the EPA total amount questioned and its nature.

$s Efficiencies or Adjustments Sustained: Dollar amount of efficiencies or cost adjustments, accepted or agreed to by the Agency or other
action official. Describe the total amount identified as an efficiency/adjustment and its nature.

Actual Costs Recovered: Questioned costs or cost efficiencies that are recovered.

$ Questioned Costs: (actual dollars) The dollar value of questioned costs as defined by the IG Act. Describe nature of costs questioned.
The IG Act defines a questioned cost as "a cost that is questioned by the Office because of 1) an alleged violation of a provision of a law,
regulation, contract, grant, or cooperative agreement, or other agreement or document governing the expenditure of funds; 2) a finding that
at the time of the audit, such cost is not supported by adequate documentation; or 3) a finding that the expenditure of funds for the intended
purpose is unnecessary or unreasonable."
These are the amounts paid by EPA for which the OIG recommends EPA pursue recovery, including Government property, services or benefits
provided to ineligible recipients; recommended collections of money inadvertently or erroneously paid out; and recommended collections or
offsets for overcharges or ineligible claims.
For contract/grant reports, it is contractor or grantee costs the "auditor" recommends be disallowed by the contracting officer, grant official,
or other management official on an EPA portion of a contract or grant. Costs normally result from a finding that expenditures were not
made in accordance with applicable laws, regulations, contracts, grants, or other agreements; or a finding that the expenditure of funds for
the intended purpose was unnecessary or unreasonable.

$ Recommended Efficiencies, Costs Saved or Avoided:  (monetized results) The immediate and near future monetary benefit of savings
or funds put to better use on an EPA project as a result of OIG work:
1) Savings from eliminating work products or office functions, which were no longer of use or too costly; and 2) The savings from new or
streamlined processes or work products, instituted to save time and/or money.
Describe the nature of the savings including monetary value of time saved.
For cost efficiencies, the IG Act defines a recommendation that funds be put to better use as "a recommendation by the Office that funds
could be used more efficiently if management of an establishment took actions to implement and complete the recommendation, including:
1) Reductions in outlays;  2) Deobligations of funds from programs or operations; 3) Withdrawal of interest subsidy costs on loans or loan
guarantees, insurance, or bonds; 4) Costs not incurred by implementing recommended improvements related to the operations of the
establishment, a contractor, or grantee; 5) Avoidance of unnecessary expenditures noted in preaward reviews of contract or grants; or 6)
Other savings which are specifically identified."
Cost efficiencies, funds put to better use, represent a quantity of funds that could be used more efficiently if management took actions to
complete recommendations pertaining to deobligation of funds, costs not incurred by implementing recommended improvements, and other
savings identified.

$ Cost Adjustments (Savings, Questioned) Made During the Audit, But Not Reported for Resolution: During the conduct of an audit
or evaluation, costs may be questioned or opportunities for savings and adjustments may be identified which are acknowledged and acted
upon/resolved prior to the report being issued. These costs may not be reported to the Agency since they are resolved prior to issuance and
therefore do not go into the Agency Audit Resolution Process. These $ costs/savings or adjustments should be reported in PMRS as Value
Added results by the OIG or its surrogates as long as they can be substantiated. Also, report adjustments known as "Cost Realism," where a
contract is adjusted to reflect accurate costs that may change a decision, or impact future funding of a contract or project. Describe the
action taken and anticipated or actual impact.

$ Fines, Recoveries, Restitutions, Collections: Dollar value of investigative recoveries, meaning: 1) Recoveries during the course of an
investigation before any criminal or civil prosecution; 2) criminal  or civil court-ordered fines, penalties, and restitutions; 3) out-of-court
settlements, including non-court settlements resulting from administrative actions.  Describe nature of amounts and reason.

Additional Information:

U.S. EPA, Office of Inspector General, Audits, Evaluations, and Other Publications;                    Available on the Internet at
www.epa.gov/oig, last updated August 2011.

Federal Government Inspector General Quality Standards.
Except for justified exceptions, OIG adheres to the following standards, which apply across the federal government:
• Overall Governance: Quality Standards for Federal Offices of Inspector General.  (President's Council on Integrity and Efficiency
(PCIE) and Executive Council on Integrity and Efficiency (ECIE), October 2003). (http://www.ignet.gov/pande/standards/igstds.pdf) This
document contains quality standards for the management, operation and conduct of the Federal Offices of Inspector General (OIG).  This
document specifies that each federal OIG shall conduct, supervise, and coordinate its audits, investigations, inspections, and evaluations in
compliance with the applicable professional standards listed below:
• For Investigations: Quality Standards for Investigations  . (President's Council on Integrity and Efficiency (PCIE) and Executive Council
on Integrity and Efficiency (ECIE), December 2003). http://www.ignet.gov/pande/standards/invstds.pdf Consistent with appropriate
Department of Justice Directives.
• For Inspections and Evaluations:  Quality Standards for Inspections  . (President's  Council on Integrity and Efficiency (PCIE) and
Executive Council on Integrity  and Efficiency (ECIE), January 2005).  http://www.ignet.gov/pande/standards/oeistds.pdf.
• For Audits: Government Auditing Standards, issued by the US Government Accountability Office (GAO). The professional standards and
guidance in the Yellow Book are commonly referred to as generally accepted government auditing standards (GAGAS). These standards
and guidance provide a framework for conducting high quality government audits and attestation engagements with competence, integrity,
objectivity, and independence. The current version of the Yellow Book (July 2007) can be located in its entirety at the following Website:
www.gao.gov/govaud/d07162g.pdf.

EPA OIG-Specific Operating Standards.  The Project Management Handbook is the Office of Inspector General (OIG) policy document
for conducting audit,  program evaluation, public liaison, follow-up, and related projects. The Handbook describes the processes and
standards the OIG uses to conduct the various phases of its work and helps ensure the quality, consistency, and timeliness of its products.
Each OIG office may issue, upon  approval by the Inspector General, supplemental guidance over assignments for which that office has
responsibility.... This  Handbook describes the audit, evaluation, public liaison, and follow-up processes and phases; it does not address OIG
investigative processes although it does apply  to audits/evaluations performed by the Office of Investigations (OI) [within EPA OIG]....OIG
audit, program evaluation, public  liaison, and follow-up reviews are normally conducted in accordance with appropriate Government
Auditing Standards ,  as issued by the Comptroller General of the United States, commonly known as the Yellow Book.

Staff may use GAGAS in conjunction with other sets of professional standards. OIG reports may cite the use of other standards as
appropriate. Teams should use GAGAS as the prevailing standard for conducting a review and reporting results should inconsistencies exist
between GAGAS and other professional standards.

For some projects, adherence to all of the GAGAS may not be feasible or necessary. For these projects, the Product Line Director (PLD)
will provide a rationale, the applicable standards not followed,  and the impact on project results. The PLD's decision should be made during
the design meeting, documented in the working papers, and described in the Scope and Methodology section of the report. [Source: Project
Management Handbook ].

Product Line Directors.  Product Line Directors oversee one or more particular work areas and multiple project teams.  The OIG product
lines are as follows:  Air/Research and Development; Water; Superfund/Land; Cross Media; Public Liaison and Special Reviews;
Assistance Agreements; Contracts; Forensic Audits; Financial Management; Risk Assessment and Program Performance; Information
Resources Management; Investigations; US Chemical Safety and Hazard Investigation Board; Legal Reviews; Briefings; OIG Enabling
Support Programs; and Other Activities.

For more information on the PLD responsibilities, see Chapter 5 of the OIG Project Management Handbook, attached to this record.


 2. Data  Definition  and Source  Reporting	
 2a. Original Data Source	

 Data is collected and reported by designated OIG staff members in OIG Performance Measurement Databases as a result of OIG
 performance evaluations, audits, inspections and investigations and other analysis of proposed and existing Agency Policies, regulations
 and laws. OIG collects such data  from the activities, outputs, intermediate outcomes and long-term outcome results of OIG operations.
 OIG collects such data from EPA programs and from court and other public data sources.	
 2b. Source Data Collection
 Performance information is entered by designated staff into the Inspector General Enterprise Management System from OIG audits,
 evaluations and investigations performed under strict compliance with applicable professional standards. All OIG products go through a
 rigorous quality assurance process and are subject to independent peer review.	
 2c. Source Data Reporting
 Data are derived from the results of audits, evaluations, investigations, and special analyses performed in accordance with the
 professional standards of the US Government Accountability Office or the US Department of Justice. All OIG products are quality
 controlled and subject to independent peer review for compliance with all applicable professional standards. Data are entered, in
 compliance with EPA and OIG data quality standards, into the Inspector General Enterprise Management System and are further reviewed
 for quality and consistency by OIG performance quality staff members.


 3.  Information Systems and Data Quality  Procedures	
 3a.  Information Systems

 OIG Performance Measurement and Results System (PMRS). PMRS captures and aggregates information on an array of OIG measures
 in a logic model format, linking immediate outputs with long-term intermediate outcomes and results. (The logic model can be found in
 OIG's Annual Performance Report at http://www.epa.gov/oig/planning.htm.) PMRS is the OIG official system for collecting performance
 results data, in relation to its strategic and annual goals. All outputs (recommendations, best practices, risks identified) and outcome results
(actions taken, changes in policies, procedures, practices, regulations, legislation, risks reduced, certifications for decisions, environmental
improvements) influenced by OIG's current or prior work, and recognized during FY 2010 and beyond, should be entered into PMRS.

PMRS was developed as a prototype in FY 2001. Since then, there have been system improvements for ease of use. For example, during
FY 2009 the PMRS was converted to a relational database directly linked to the new Inspector General Enterprise Management System
(IGEMS).

IGEMS is an OIG employee time-tracking and project cost-tracking database that generates management reports.  IGEMS is used to
generate a project tracking number and a work product number.  This system also tracks project progress and stores all related cost
information.

AutoAudit and TeamMate. These are repositories for all project working papers.
3b. Data Quality Procedures
Data quality assurance and control are performed as an extension of OIG products and services, subject to rigorous compliance with the
Government Auditing Standards of the Comptroller General, and are regularly reviewed by OIG management, an independent OIG
Management Assessment Review Team, and external independent peer reviews (e.g., by accountancies qualified to evaluate OIG
procedures against Government Auditing Standards). Each Assistant Inspector General certifies the completeness and accuracy of
performance data.

All data reported are audited internally for accuracy and consistency.

OIG processes, including data processes, are governed by the quality standards described in "Additional Information" under the
Performance Term Definition field. Notably, the Project Management Handbook (which governs audits) provides a QA checklist (see
Appendix 4 of the 2008 Project Management Handbook, attached to this record). The Project Manager (PM) is responsible for completing
the Quality Assurance (QA) checklist throughout the project.  The PM prepares the checklist and submits it to the Product Line Director
(PLD) upon completion of the Post Reporting  Phase of the Project.  The Checklist should be completed for all projects, recognizing that
some steps in the checklist may not be applicable to all projects. The QA Checklist asks teams to ensure the integrity of data that resides in
all of the OIG data systems.  [Source: Project Management Handbook ].
 Policy101.PMH.Final.05.08.08.pdf


During FY 2008, OIG implemented an Audit Follow-up Policy to independently verify the status of Agency actions on OIG
recommendations, which serve as the basis for OIG intermediate outcome results reported in the OIG PMRS.

(Additional information on the OIG's follow-up process can be found at
http://oigintra.epa.gov/policy/policies/documents/OIG-04Follow-upPolicy.pdf.)
3c. Data Oversight
There are three levels of PMRS access: View Only, Edit and Administrator. Everyone with IGEMS access has view only privileges.
Individuals tasked with adding or editing PMRS entries must be granted PMRS Edit privileges. Contact a PMRS administrator to request
Edit privileges.

Each Product Line Director (PLD) oversees one or more OIG work areas (e.g., Superfund, Contracts) and multiple project management
teams and is responsible for ensuring that teams maintain proper integrity, accessibility, and retrievability of working papers in
accordance with OIG policies. Likewise, PLDs must ensure that information in OIG's automated systems is updated regularly by the
team. (See field 2i, Additional Information, for more information about PLDs.)
3d. Calculation Methodology
Database measures include numbers of: 1) recommendations for environmental  and management improvement; 2) legislative, regulatory
policy, directive, or process changes; 3) environmental, program management, security and resource integrity risks identified, reduced, or
eliminated; 4) best practices identified  and implemented; 5) examples of environmental and management actions taken and improvements
made; 6) monetary value of funds questioned, saved, fined, or recovered; 7) criminal, civil, and administrative actions taken; 8) public or
congressional inquiries resolved; and 9) certifications, allegations disproved, and cost corrections.

Because intermediate and long-term results may not be realized for several years, only verifiable results are reported in the year they are
completed.

Unit of measurement:  Percentage (of the OIG budget)
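
A minimal worked example of this calculation under the definition above: the total monetary results from audits and investigations, expressed as a percentage of the annual OIG budget. All dollar figures below are illustrative assumptions, not reported results.

monetary_results = {   # hypothetical totals for one fiscal year
    "questioned_costs_sustained": 12_000_000,
    "efficiencies_sustained": 5_500_000,
    "fines_recoveries_restitutions": 8_250_000,
}
oig_budget = 55_000_000

return_pct = 100 * sum(monetary_results.values()) / oig_budget
print(f"{return_pct:.1f}% return on the annual OIG budget")   # 46.8%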


4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting

Data come from OIG audits, evaluations, and investigations performed in strict compliance with the professional standards of the US
Government Accountability Office and the US Department of Justice, and are subject to independent peer review. Data in the form of
activities, outputs, and outcomes are entered by designated staff into the Inspector General Enterprise Management System. All original
data are quality controlled for compliance with professional standards, and data entered are reviewed for accuracy, completeness,
timeliness, and adequate support. All data entered are carefully reviewed several times a year as they are entered and are subsequently
reported on a quarterly basis. The OIG Assistant Inspectors General oversee the quality of the data used to generate reports of
performance. The Office of the Chief of Staff oversees data quality, and the IG reviews the documents and data used for external
consumption. Data are audited and quality tested on a continuous basis through several steps from origin to final public consumption.
4b. Data Limitations/Qualifications

Because intermediate and long-term results may not be realized for several years, only verifiable results are reported in the year they are
completed.

Although all OIG staff are responsible for data accuracy in their products and services, there is a possibility of incomplete, miscoded, or
missing data in the system due to human error or time lags. Data supporting achievement of results are often from indirect or external
sources, with their own methods or standards for data verification/validation.  Such data are reviewed according to the appropriate OIG
quality standards (see "Additional Information"), and any questions about the quality of such data are documented in OIG reports and/or
the PMRS.

The error rate for outputs is estimated at +/-2%, while the error rate for reported long-term outcomes is presumably greater because of the
longer period needed for tracking results and difficulty in verifying a nexus between our work and subsequent actions and impacts beyond
OIG's control. (The OIG logic model in the Annual Performance Report clarifies the kinds  of measures that are output-oriented, like risks
identified, versus outcome-oriented, like risks reduced.) Errors tend to be those of omission.  Some errors may result from duplication as
well.
4c. Third-Party Audits
There have not been any previous audit findings or reports by external groups on data or database weaknesses in PMRS.

A December 2008 independent audit
(www.epa.gov/oig/reports/2009/QualityReviewofEPAOIG-20081216.pdf) found the following with regard to general OIG processes:
"We determined that the EPA OIG audits methodology, policies and procedures adequately  complied with the Government Auditing
Standards. The EPA OIG quality control system adequately documented compliance with professional and auditing standards for:
Independence; Professional Judgment; Competence; Audit Planning; Supervision; Evidence and Audit Documentation; Reports on
Performance Audits; Nonaudit Services; and the Quality Control Process.  The auditors documented, before the audit report was issued,
evidence of supervisory review of the work performed that supports findings, conclusions, and recommendations contained in the audit
report.
"We determined that EPA OIG adequately followed the quality control policies established in the EPA OIG Project Management
Handbook  for conducting audit, program evaluation, and related projects. The audit documentation adequately includes evidence of work
performed in  the major three phases: Preliminary Research, Field Work and Reporting.
"We determined that EPA OIG adequately followed the standards and principles set forth in the PCIE and Executive Council  on Integrity
and Efficiency Quality Standards for Investigations, as applicable. The investigation adequately documented compliance with the
guidelines applicable to the investigation  efforts of criminal investigators working for the EPA OIG."
The audit also identified two minor conditions, related to working paper review/approval and completion/update status. OIG agreed with the
auditor recommendations related to the conditions and adapted its Project Management Handbook to address the concerns.


A June 2010 internal OIG review of OIG report quality (which included a review of reporting procedures) found no substantial issues (see
http://www.epa.gov/oig/reports/2010/20100602-10-N-0134.pdf).	
 Record Last Updated: 02/13/2012 01:16:50 PM
  Measure Code : 35D - Criminal, civil,  administrative, and fraud prevention actions.
  Office of the Inspector General (OIG)
   1.  Measure and DQR Metadata	
   Goal Number and Title                           Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title
   Strategic Target Code and Title
   Managing Office                                  Chief of Staff in the Immediate Office of the Inspector General
   Performance Measure Term Definitions
This is a measure of the total number of convictions, indictments, civil and administrative actions from OIG investigations.

OIG performance results are a chain of linked events, starting with OIG  outputs (e.g.,  recommendations, reports of best practices, and
identification of risks).  The  subsequent actions taken by EPA or its stakeholders/partners,  as  a  result of OIG's outputs, to improve
operational  efficiency  and environmental program  delivery are  reported as intermediate outcomes.  The resulting improvements in
operational  efficiency, risks reduced/eliminated, and conditions of environmental and human  health are reported as outcomes. By using
common  categories of performance measures, quantitative  results can be summed  and reported. Each outcome is also qualitatively
described, supported, and linked to an OIG product or output.  The OIG can only control its outputs and has no authority, beyond its
influence, to implement its recommendations that lead to environmental and management outcomes.

# Criminal/Civil/Administrative Actions: Measured by the number of: 1) indictments or informations where there is preliminary
evidence of a violation of law; 2) convictions, guilty pleas, and pre-trial diversion agreements, based on the proof of evidence as decided
by a judicial body, affecting EPA operations and environmental programs; 3) civil actions arising from OIG work, including civil
judgments and civil settlements from lawsuits for recovery; and 4) administrative actions as a result of OIG work, which include: a)
personnel actions, such as reprimands, suspensions, demotions, or terminations of Federal, State, and local employees (including Federal
contractor/grantee employees); b) contractor or grantee (individual and entity) suspensions and/or debarments from doing business with
the Federal government; and c) compliance agreements.
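
A minimal sketch tallying the action types enumerated above (the records are illustrative only; actual data are entered into PMRS by designated staff):

from collections import Counter

actions = [   # hypothetical entries
    {"type": "indictment"}, {"type": "conviction"},
    {"type": "civil_settlement"}, {"type": "suspension_debarment"},
    {"type": "conviction"},
]
by_type = Counter(a["type"] for a in actions)
print(by_type)                  # count per action category
print(sum(by_type.values()))    # 5: total actions reported for the measure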

Additional Information:

U.S. EPA, Office of Inspector General, Audits, Evaluations, and Other Publications;                   Available on the Internet at
www.epa.gov/oig, last updated August 2011.

Federal Government Inspector General Quality Standards.
Except for justified exceptions, OIG adheres to the following standards, which apply across the federal government:
• Overall Governance: Quality Standards for Federal Offices of Inspector General.  (President's Council on Integrity and Efficiency
(PCIE) and Executive Council on Integrity and Efficiency (ECIE), October 2003). (http://www.ignet.gov/pande/standards/igstds.pdf) This
document contains quality standards for the management, operation and conduct of the Federal Offices of Inspector General (OIG).  This
document specifies that each federal OIG shall conduct, supervise, and coordinate its audits, investigations, inspections, and evaluations in
compliance with the applicable professional standards listed below:
• For Investigations: Quality Standards for Investigations . (President's Council on Integrity and Efficiency (PCIE) and Executive Council
on Integrity and Efficiency (ECIE), December 2003). http://www.ignet.gov/pande/standards/invstds.pdf Consistent with appropriate
Department of Justice Directives.
• For Inspections and Evaluations:  Quality Standards for Inspections  . (President's  Council on Integrity and Efficiency (PCIE) and
Executive Council on Integrity and Efficiency (ECIE), January 2005). http://www.ignet.gov/pande/standards/oeistds.pdf.
• For Audits: Government Auditing Standards, issued by the US Government Accountability Office (GAO). The professional standards and
guidance in the Yellow Book are commonly referred to as generally accepted government auditing standards (GAGAS). These standards
and guidance provide a framework for conducting high quality government audits and attestation engagements with competence, integrity,
objectivity, and independence. The current version of the Yellow Book (July 2007) can be located in its entirety at the following Website:
www.gao.gov/govaud/d07162g.pdf.

EPA OIG-Specific Operating Standards.  The Project Management Handbook is the Office of Inspector General (OIG) policy document
for conducting audit, program evaluation, public liaison, follow-up, and related projects. The Handbook describes the processes and
standards the OIG uses to conduct the various phases of its work and helps ensure the quality, consistency, and timeliness of its products.
Each OIG office may issue, upon approval by the Inspector General, supplemental guidance over assignments for which that office has
responsibility.... This Handbook describes the audit, evaluation, public liaison, and follow-up processes and phases; it does not address OIG
investigative processes although it does apply to audits/evaluations performed by the Office of Investigations (OI) [within EPA OIG]....OIG
audit, program evaluation, public liaison, and follow-up reviews are normally conducted in accordance with appropriate Government
Auditing Standards , as issued by the Comptroller General of the United States, commonly known as the Yellow Book.

Staff may use GAGAS in conjunction with other sets of professional standards, and OIG reports may cite the use of other standards as
appropriate. Should inconsistencies exist between GAGAS and other professional standards, teams should use GAGAS as the prevailing
standard for conducting a review and reporting results.

For some projects, adherence to all of the GAGAS may not be feasible or necessary. For these projects, the Product Line Director (PLD)
will provide a rationale, identify the applicable standards not followed, and describe the impact on project results. The PLD's decision
should be made during the design meeting, documented in the working papers, and described in the Scope and Methodology section of the
report. [Source: Project Management Handbook.]

Product Line Directors.   Product Line Directors oversee one or more particular work areas and multiple project teams. The OIG product
lines are as follows:  Air/Research and Development; Water; Superfund/Land; Cross Media; Public Liaison and Special Reviews;
Assistance Agreements; Contracts; Forensic Audits; Financial Management; Risk Assessment and Program Performance; Information
Resources Management; Investigations; US Chemical Safety and Hazard Investigation Board; Legal Reviews; Briefings; OIG Enabling
Support Programs; and Other Activities.

For more information on the PLD responsibilities, see Chapter 5 of the OIG Project Management Handbook, attached to this record.
 2. Data Definition and Source Reporting
 2a. Original Data Source
 Data are collected and reported by designated OIG staff members in OIG performance measurement databases as a result of OIG
 performance evaluations, audits, inspections, investigations, and other analyses of proposed and existing Agency policies, regulations,
 and laws. OIG collects such data from the activities, outputs, intermediate outcomes, and long-term outcome results of OIG operations.
 2b. Source Data Collection

 Performance information is entered by designated staff into the Inspector General Enterprise Management System from OIG audits,
 evaluations and investigations performed under strict compliance with applicable professional standards. All OIG products go through a
 rigorous quality assurance process and are subject to independent peer review.	
 2c. Source Data Reporting
 Data are derived from the results of audits, evaluations, investigations, and special analyses that are performed in accordance with
 professional standards of the U.S. Government Accountability Office or the U.S. Department of Justice. All OIG products are quality
 controlled and subject to independent peer review for compliance with all professional standards. Data are entered, in compliance with
 EPA and OIG data quality standards, into the Inspector General Enterprise Management System and are further reviewed for quality
 and consistency by OIG performance quality staff members.


 3. Information Systems and  Data Quality Procedures	
 3a. Information Systems

 OIG Performance Measurement and Results System (PMRS). PMRS captures and aggregates information on an array of OIG
 measures in a logic model format, linking immediate outputs with intermediate and long-term outcomes and results. (The logic model can
 be found in OIG's Annual Performance Report at http://www.epa.gov/oig/planning.htm.) PMRS is the OIG's official system for collecting
 performance results data in relation to its strategic and annual goals. All outputs (recommendations, best practices, risks identified) and
 outcome results (actions taken; changes in policies, procedures, practices, regulations, and legislation; risks reduced; certifications for
 decisions; environmental improvements) influenced by OIG's current or prior work, and recognized during FY 2010 and beyond, should
 be entered into PMRS.

PMRS was developed as a prototype in FY 2001. Since then, there have been system improvements for ease of use. For example, during
FY 2009 the PMRS was converted to a relational database directly linked to the new Inspector General Enterprise Management System
(IGEMS).

IGEMS is an OIG employee time-tracking and project cost-tracking database that generates management reports. IGEMS is used to
generate a project tracking number and a work product number.  This system also tracks project progress and stores all related cost
information.

AutoAudit and TeamMate. These are repositories for all project working papers.
3b. Data Quality Procedures
Data quality assurance and control are performed as an extension of OIG products and services, subject to rigorous compliance with the
Government Auditing Standards of the Comptroller General, and are regularly reviewed by OIG management, an independent OIG
Management Assessment Review Team, and external independent peer reviews (e.g., by accounting firms qualified to evaluate OIG
procedures against Government Auditing Standards). Each Assistant Inspector General certifies the completeness and accuracy of
performance data.

All data reported are audited internally for accuracy and consistency.

OIG processes, including data processes, are governed by the quality standards described in "Additional Information" under the
Performance Term Definition field. Notably, the Project Management Handbook (which governs audits) provides a QA checklist (see
Appendix 4 of the 2008 Project Management Handbook, attached to this record). The Project Manager (PM) is responsible for completing
the Quality Assurance (QA) checklist throughout the project. The PM prepares the checklist and submits it to the Product Line Director
(PLD) upon completion of the Post Reporting Phase of the project. The checklist should be completed for all projects, recognizing that
some steps in the checklist may not be applicable to all projects. The QA Checklist asks teams to ensure the integrity of data that resides in
all of the OIG data systems. [Source: Project Management Handbook.]
 Policy101.PMH.Final.05.08.08.pdf


During FY 2008, OIG implemented an Audit Follow-up Policy to independently verify the status of Agency actions on OIG
recommendations, which serve as the basis for OIG intermediate outcome results reported in the OIG PMRS.

(Additional information on the OIG's follow-up process can be found at
http://oigintra.epa.gov/policy/policies/documents/OIG-04Follow-upPolicy.pdf.)

3c. Data Oversight
There are three levels of PMRS access: View Only, Edit, and Administrator. Everyone with IGEMS access has view-only privileges.
Individuals tasked with adding or editing PMRS entries must be granted PMRS Edit privileges. Contact a PMRS administrator to request
Edit privileges.

Each Product Line Director (PLD), each of whom oversees one or more OIG work areas (e.g., Superfund, Contracts, etc.) and multiple
project management teams, is responsible for ensuring that teams maintain proper integrity, accessibility, and retrievability of working
papers in accordance with OIG policies. Likewise, they must ensure that information in OIG's automated systems is updated regularly by
the team. (See field 2i, Additional Information, for more information about PLDs.)
3d. Calculation Methodology
Database measures include numbers of: 1) recommendations for environmental and management improvement; 2) legislative, regulatory,
policy, directive, or process changes; 3) environmental, program management, security, and resource integrity risks identified, reduced, or
eliminated; 4) best practices identified and implemented; 5) examples of environmental and management actions taken and improvements
made; 6) monetary value of funds questioned, saved, fined, or recovered; 7) criminal, civil, and administrative actions taken; 8) public or
congressional inquiries resolved; and 9) certifications, allegations disproved, and cost corrections.

Because intermediate and long-term results may not be realized for a period of several years, only verifiable results are reported in the
year completed.

Unit of measurement:  Individual actions
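
To make this roll-up concrete, the following is a minimal illustrative sketch in Python; the record layout and category names are
hypothetical and do not reproduce the actual PMRS schema.

    from collections import Counter

    # Hypothetical PMRS-style records; the unit of measurement is the
    # individual action, so each record contributes one to its category.
    actions = [
        {"id": 101, "category": "recommendation"},
        {"id": 102, "category": "criminal_action"},
        {"id": 103, "category": "recommendation"},
        {"id": 104, "category": "best_practice"},
    ]

    # Because the categories are common across measures, quantitative
    # results can simply be summed and reported by category.
    totals = Counter(record["category"] for record in actions)

    for category, count in sorted(totals.items()):
        print(f"{category}: {count}")
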
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Data come from OIG audits, evaluations, and investigations that are performed under strict compliance with professional standards of the
U.S. Government Accountability Office and the U.S. Department of Justice and are subject to independent peer review. Data in the form of
activities, outputs, and outcomes are entered by designated staff into the Inspector General Enterprise Management System. All original
data are quality controlled for compliance with professional standards, and entered data are quality reviewed for accuracy, completeness,
timeliness, and adequacy of support. All data entered are carefully reviewed several times a year as they are entered and subsequently are
reported on a quarterly basis. The OIG Assistant Inspectors General oversee the quality of the data used to generate reports of
performance. The Office of the Chief of Staff oversees data quality, and the Inspector General reviews the documents and data used for
external consumption. Data are audited and quality tested on a continuous basis through several steps from origin to final use.
4b. Data Limitations/Qualifications

Because intermediate and long-term results may not be realized for a period of several years, only verifiable results are reported in the
year completed.

Although all OIG staff are responsible for data accuracy in their products and services, there is a possibility of incomplete, miscoded, or
missing data in the system due to human error or time lags. Data supporting achievement of results are often from indirect or external
sources, with their own methods or standards for data verification/validation. Such data are reviewed according to the appropriate OIG
quality standards (see "Additional Information"), and any questions about the quality of such data are documented in OIG reports and/or
the PMRS.

The error rate for outputs is estimated at +/-2%, while the error rate for reported long-term outcomes is presumably greater because of the
longer period needed for tracking results and difficulty in verifying a nexus between our work and subsequent actions and impacts beyond
OIG's control. (The OIG logic model in the Annual Performance Report clarifies the kinds of measures that are output-oriented, like risks
identified, versus outcome-oriented, like risks reduced.) Errors tend to be those of omission.  Some errors may result from duplication as
well.	
4c. Third-Party Audits
There have not been any previous audit findings or reports by external groups on data or database weaknesses in PMRS.

A December 2008 independent audit
(www.epa.gov/oig/reports/2009/QualityReviewofEPAOIG-20081216.pdf) found the following with regard to general OIG processes:
"We determined that the EPA OIG audits methodology, policies and procedures adequately complied with the Government Auditing
Standards. The EPA OIG quality control system adequately documented compliance with professional and auditing standards for:
Independence; Professional Judgment; Competence; Audit Planning; Supervision; Evidence and Audit Documentation; Reports on
Performance Audits; Nonaudit Services; and the Quality Control Process. The auditors documented, before the audit report was issued,
evidence of supervisory review of the work performed that supports findings, conclusions, and recommendations contained in the audit
report.
"We determined that EPA OIG adequately followed the quality control policies established in the EPA OIG Project Management
Handbook  for conducting audit, program evaluation, and related projects. The audit documentation adequately includes evidence of work
performed in the major three phases: Preliminary Research, Field Work and Reporting.
"We determined that EPA OIG adequately followed the standards and principles set forth  in the PCIE and Executive Council on Integrity
and Efficiency Quality Standards for Investigations, as applicable. The investigation adequately documented compliance with the
guidelines applicable to the investigation efforts of criminal investigators working for the EPA OIG."
The audit also identified two minor conditions, related to working paper review/approval and completion/update status. OIG agreed with
the auditor recommendations related to the conditions and adapted its Project Management Handbook to address the concerns.

A June 2010 internal OIG review of OIG report quality (which included a review of reporting procedures) found no substantial issues (see
http://www.epa.gov/oig/reports/2010/20100602-10-N-0134.pdf).	
 Record Last Updated: 02/13/2012 01:16:50 PM
  Enabling Support Measures                         No Associated Objective                                   Measure 998
  Measure Code : 998 - EPA's TRI program will work with partners to conduct data
  quality checks to enhance accuracy and reliability of environmental data.
  Office of Environmental Information (OEI)
   1. Measure and DQR Metadata
   Goal Number and Title                          Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title
   Strategic Target Code and Title
   Managing Office                                Office of Information Analysis and Access
   Performance Measure Term Definitions
TRI Program: Number of Data Quality Checks - the Regions and HQ identify possible data quality issues and follow up with
approximately 500 facilities annually, from HQ-generated lists of facilities, to ensure the accuracy of TRI data.
 2. Data  Definition and Source Reporting
 2a. Original Data Source
 EPA receives these data from companies or entities required to report annually under EPCRA (see 2b). The data quality checks are
 performed by EPA HQ and regional offices on the facility data submitted.
 2b. Source Data Collection

 All covered facilities are required to annually submit toxic chemical release and other waste management quantities and facility-specific
 information for the previous calendar year, on or before July 1, to EPA and the States if reporting threshold requirements [40 CFR Part
 372] are exceeded. EPA makes the collected data available to the public through EPA's TRI National Analysis and various online tools
 (e.g., Envirofacts, TRI Explorer, TRI.NET, and myRTK).
 2c. Source Data Reporting

 Form/mechanism for receiving data and entering into EPA's system: More than 97 percent of covered facilities use EPA's web-based
 electronic reporting tool - TRI-MEweb - to report their release and other waste management information to the TRI program. Timing and
 frequency of reporting: covered facilities are required to submit release and waste management information for the previous calendar
 year on or before July 1 if they meet reporting requirements.
3.  Information Systems and Data Quality Procedures
3a. Information Systems

TRI-MEweb and TRIPS databases
3b. Data Quality Procedures
• EPA provides guidance documents (general, chemical-specific and sector-specific), training modules and TRI hotline assistance.

• EPA performs multiple quality control and quality assurance checks during reporting (TRI-MEweb DQ checks) and at the end of the
reporting period (in-house DQ checks). Here are a few examples of the conditions screened for:

• Facilities that reported large changes in release, disposal, or waste management practices at the sector level for certain chemicals (e.g.,
PBT chemicals);

• Facilities that submit invalid Chemical Abstracts Service (CAS) numbers that do not match the chemical name;

• Facilities that report invalid North American Industry Classification System (NAICS) codes;

• Facilities that report invalid/incorrect RCRA facility IDs when they send wastes to offsite locations for management;

• Facilities that did not report for the current reporting year but reported for the previous reporting year; and

• Facilities that reported incorrect quantities on Form R Schedule 1 for dioxin and dioxin-like compounds.

The TRI Program generates a list of facilities with potential data quality issues and sends the list to the 10 TRI Regional coordinators. The
TRI Program HQ staff and Regional coordinators contact the facilities and discuss the data quality issues. The facilities may revise their
reports where errors are identified, and certain facilities may be referred to enforcement for further examination. For each annual TRI
collection received on or before July 1, HQ staff identify potential data quality issues and work with the Regions to contact facility
reporters and resolve the issues during the following fall and spring.
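
The following is a minimal illustrative sketch in Python of two of the screens described above (CAS number and NAICS code validity). It
is not the actual TRI-MEweb or in-house check implementation, and the facility record layout is hypothetical; the CAS check-digit rule,
however, is the published one.

    import re

    def cas_is_valid(cas: str) -> bool:
        """Validate a CAS registry number's format and check digit."""
        match = re.fullmatch(r"(\d{2,7})-(\d{2})-(\d)", cas)
        if not match:
            return False
        # Check digit: weight the remaining digits 1, 2, 3, ... from the
        # right and compare the sum modulo 10 with the final digit.
        digits = (match.group(1) + match.group(2))[::-1]
        checksum = sum((i + 1) * int(d) for i, d in enumerate(digits))
        return checksum % 10 == int(match.group(3))

    def naics_is_plausible(code: str) -> bool:
        """NAICS codes are 2-6 digits; a full screen would also consult
        a lookup table of valid codes."""
        return code.isdigit() and 2 <= len(code) <= 6

    facility = {"cas_number": "7732-18-5", "naics_code": "325110"}
    flags = []
    if not cas_is_valid(facility["cas_number"]):
        flags.append("invalid CAS number")
    if not naics_is_plausible(facility["naics_code"]):
        flags.append("invalid NAICS code")
    print(flags or "no data quality flags")  # -> no data quality flags
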

3c. Data Oversight

EPA performs several data quality analyses to support the TRI National Analysis. For this measure, the Regions and the HQ staff annually
identify potential data quality issues and contact approximately 500 facilities for follow-up.
3d. Calculation Methodology	

Unit of Analysis:  Number of facilities contacted
4.  Reporting and Oversight
4a. Oversight and Timing of Results Reporting	
For TRI reports (due to EPA and the states annually on July 1), the TRI program will identify potential data quality issues and work with
the Regions to contact facility reporters and resolve the issues during the following fall and spring.
4b. Data Limitations/Qualifications	
Over 97% of all TRI reporting facilities use TRI-MEweb.
4c. Third-Party Audits
This program does not conduct third-party audits of these data quality checks.
 Record Last Updated: 02/13/2012 01:16:50 PM
  Enabling Support Measures                        No Associated Objective                                   Measure 999
  Measure Code : 999 - Total number of active unique users from states, tribes,
  laboratories, regulated facilities and other entities that electronically report
  environmental data to EPA through CDX.
  Office of Environmental Information (OEI)
   1. Measure and DQR Metadata	
   Goal Number and Title                         Enabling Support Program
   Objective Number and Title
   Sub-Objective Number and Title	
   Strategic Target Code and Title
   Managing Office                                Office of Information Collection
   Performance Measure Term Definitions
Active unique users: Active accounts are those whose users have logged in within the two years preceding the date on which the statistic
is generated. In addition, users who have multiple accounts are counted only once (unique). Active unique users include States, Tribes,
laboratories, and regulated facilities.
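
As a minimal illustrative sketch of this definition in Python (the accounts below are hypothetical, not an actual extract from the CDX
Customer Registration Subsystem):

    from datetime import datetime, timedelta

    accounts = [
        {"user": "jsmith",    "last_login": datetime(2011, 6, 1)},
        {"user": "jsmith",    "last_login": datetime(2010, 3, 15)},  # duplicate account
        {"user": "acme_lab",  "last_login": datetime(2008, 1, 20)},  # inactive
        {"user": "tribe_env", "last_login": datetime(2011, 9, 30)},
    ]

    as_of = datetime(2011, 9, 30)
    cutoff = as_of - timedelta(days=2 * 365)

    # A user is "active" if any of his or her accounts logged in within
    # the last two years, and "unique" because the set collapses
    # multiple accounts belonging to the same user.
    active_unique = {a["user"] for a in accounts if a["last_login"] >= cutoff}
    print(len(active_unique))  # -> 2
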

CDX: Central Data Exchange. CDX is the point of entry on the Environmental Information Exchange Network (Exchange Network) for
environmental data submissions to the Agency.

CDX assembles the registration/submission requirements of many different data exchanges with EPA and the States, Tribes, local
governments, and the regulated community into a centralized environment. This system improves performance tracking of external
customers and overall management by making those processes more consistent and comprehensive. The creation of a centralized
registration system, coupled with the use of web forms and web-based approaches to submitting the data, creates opportunities to introduce
additional automated quality assurance procedures for the system and reduce human error. For more information, visit:
http://www.epa.gov/cdx/index.htm


 2. Data Definition and Source Reporting	
 2a. Original Data Source
 Users of CDX come from the private sector and State, local, and Tribal governments; their information is entered into the CDX Customer
 Registration Subsystem.

 CDX Users at EPA program offices include the:
 •       Office of Air and Radiation (OAR)

•       Office of Enforcement and Compliance Assurance (OECA)
•       Office of Environmental Information (OEI)
•       Office of Prevention, Pesticides and Toxic Substances (OPPTS)
•       Office of Solid Waste and Emergency Response (OSWER)
•       Office of Water (OW)
2b. Source Data Collection
Source Data Collection Methods: Reports are routinely generated from log files on CDX servers that support user registration and
identity management.

Tabulation of records: The records of registration provide an up-to-date, accurate count of users.

Date/Time Intervals Covered by Source Data:  Ongoing

EPA QA Requirements/Guidance Governing Collection. QA/QC is performed in accordance with a CDX Quality Assurance Plan
["Quality Assurance Project Plan for the Central Data Exchange," 10/8/2004] and the CDX Design Document v.3, Appendix K registration
procedures [Central Data Exchange Electronic Reporting Prototype System Requirements: Version 3; Document number: EP005S3;
December 2000]. Specifically, data are reviewed for authenticity and integrity. Automated edit-checking routines are performed in
accordance with program specifications and the CDX Quality Assurance Plan. EPA currently has a draft plan developed in August 2007.
In FY 2012, CDX will develop robust quality criteria, which will include performance metric results and align with the schedule for the
upcoming CDX contract recompete.
2c. Source Data Reporting
Form/mechanism for receiving data and entering into EPA system: CDX manages the collection of data and documents in a secure
way either by users entering data onto web forms or via a batch file transfer, both of which are completed using the CDX environment.
These data are then transported to the appropriate EPA system.

Timing and frequency of reporting: Ongoing	


3. Information Systems and Data Quality Procedures	
3a. Information Systems
CDX Customer Registration Subsystem. Users identify themselves with several descriptors and use a number of CDX security mechanisms
for ensuring the integrity of individuals' identities.

CDX completed its last independent security risk assessment in June 2011, and all vulnerabilities are being reviewed or addressed. CDX
users register themselves via web forms on CDX to obtain access to data flows in which they receive privileges. This user information
comes directly from the user and is not transformed.

Additional information:
In addition, environmental data collected by CDX are delivered to National data systems in the Agency. Upon receipt, the National systems
often conduct a more thorough data quality assurance procedure based on more intensive rules that can change continuously based on
program requirements. As a result, CDX and these National systems appropriately share the responsibility for ensuring environmental data
quality.
3b. Data Quality Procedures
The CDX system collects, reports, and tracks performance measures on data quality and customer service. While its automated routines are
sufficient to screen systemic problems/issues, a more detailed assessment of data errors/problems generally requires a secondary level of
analysis that takes time and human resources.

CDX incorporates a number of features that reduce errors in registration data and that contribute greatly to the quality of environmental
data entering the Agency. These features include pre-populating data either from CDX or National systems, conducting web-form edit
checks, implementing XML schemas for basic edit checking, and providing extended quality assurance checks for selected Exchange
Network data flows using Schematron.
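
As a minimal illustrative sketch of a schema-based edit check in Python using the lxml library (the element names and schema below are
hypothetical, not an actual Exchange Network data flow):

    from lxml import etree

    # A toy schema standing in for a data-flow schema: the release
    # quantity must parse as a decimal number.
    schema = etree.XMLSchema(etree.XML(b"""
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="Submission">
        <xs:complexType>
          <xs:sequence>
            <xs:element name="FacilityId" type="xs:string"/>
            <xs:element name="ReleaseQuantity" type="xs:decimal"/>
          </xs:sequence>
        </xs:complexType>
      </xs:element>
    </xs:schema>"""))

    good = etree.XML(b"<Submission><FacilityId>F-001</FacilityId>"
                     b"<ReleaseQuantity>12.5</ReleaseQuantity></Submission>")
    bad = etree.XML(b"<Submission><FacilityId>F-002</FacilityId>"
                    b"<ReleaseQuantity>a lot</ReleaseQuantity></Submission>")

    print(schema.validate(good))        # True
    print(schema.validate(bad))         # False
    print(schema.error_log.last_error)  # identifies the offending element
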
3c. Data Oversight
Although not officially designated as such, CDX is a general support application that provides centralized services to a multitude of
program offices in the Agency and data trading partners on the Exchange Network. In general, EPA Program Office System Managers and
their management chains are responsible for oversight of the data quality. The individual most directly responsible for data integrity is the
Chief of the Information Technology Branch.

3d. Calculation Methodology
Unit of Analysis: Users
EPA counts users based on the definition given in section 1 above.


4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting
Oversight of Final Reporting: Reports on CDX quality and performance are produced on an annual basis. The reports consist of both
quantitative measures from system logs and qualitative measures from user and program office surveys.

Timing of Results Reporting: Annually
4b. Data Limitations/Qualifications
The potential error in registration data under CDX responsibility has been assessed to be less than 1 percent. This is accomplished through
a combination of automated edit checks in web-form fields and processes in place to confirm the identity of individuals prior to approving
access to CDX data flows.
4c. Third-Party Audits
Third-party security risk assessments are conducted every three years in accordance with FISMA requirements. Alternatives analysis
reviews are also conducted in accordance with OMB CPIC requirements. Lastly, ad hoc third-party reviews are conducted internally.
 Record Last Updated: 02/13/2012 01:16:50 PM
  Goal 1                                          No Associated Objective                                    Measure AC1
  Measure Code : AC1 - Percentage of products  completed by Air, Climate, and
  Energy.
  Office of Research and Development (ORD)
   1. Measure and DQR Metadata
   Goal Number and Title                          1 - Taking Action on Climate Change and Improving Air Quality
   Objective Number and Title
   Sub-Objective Number and Title	
   Strategic Target Code and Title	
   Managing Office                                 Office of Program Accountability and Resource Management- Planning
   Performance Measure Term Definitions
A research product is "a deliverable that results from a specific research project or task. Research products may require translation or
synthesis before integration into an output ready for partner use."

 This secondary performance measure tracks the timely completion of research products.

Sustainability Research  Strategy, available from: http://epa.gov/sciencematters/april2011/truenorth.htm

http://www.epa.gov/risk_assessment/health-risk.htm


 2. Data Definition and Source  Reporting	
 2a. Original Data Source
 EPA and its partners confirm the schedule for completing research outputs and products that are transformed or synthesized into outputs.
 ORD tracks progress toward delivering the outputs; clients are notified of progress. Scheduled milestones are compared to actual progress
 on a quarterly basis. At the end of the fiscal year, outputs are either classified as "met" or "not met" to determine the overall percentage of
 planned products that have been met by the research program. The actual product completion date is self-reported.

 2b. Source Data Collection	

 Each output is assigned to a Lab or Center representative before the start of the fiscal year. This individual provides quarterly status
 updates via ORD's Resource Management System. Status reports are reviewed by senior management, including the Lab or Center
 Director and National Program Director.  Overall  status data is generated and reviewed by ORD's Office of Program Accountability and
 Resource Management.	
2c. Source Data Reporting
Quarterly status updates are provided via ORD's Resource Management System.
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
Internal database or internal tracking system such as the Resource Management System (RMS).
3b. Data Quality Procedures	
EPA and its partners confirm the schedule for completing research outputs and products that are transformed or synthesized into outputs.
ORD tracks progress toward delivering the outputs; clients are notified of progress. Scheduled milestones are compared to actual progress
on a quarterly basis. At the end of the fiscal year, outputs are either classified as "met" or "not met" to determine the overall percentage of
planned products that have been met by the ACE program.	
3c. Data Oversight

The National Program Director oversees the source data reporting, specifically, the process of establishing agreement with program
stakeholders and senior ORD managers on the list and content of the planned products, and subsequent progress, completion, and delivery
of these products.	
3d. Calculation Methodology
At the end of the fiscal year, outputs are classified as either "met" or "not met." An overall percentage of planned products met by the ACE
program is reported.
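
As a minimal illustrative sketch of this roll-up in Python (the status data below are hypothetical, not RMS records):

    # Year-end classification of planned outputs.
    statuses = {"product_1": "met", "product_2": "met", "product_3": "not met"}

    met = sum(1 for status in statuses.values() if status == "met")
    percent_met = 100.0 * met / len(statuses)
    print(f"{percent_met:.1f}% of planned products met")  # -> 66.7%
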
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The Office of Program Accountability and Resource Management is responsible for reporting program progress in meeting its target of
completing 100% of Air, Climate, and Energy (ACE) program planned products.

4b. Data Limitations/Qualifications	
This measure does not directly capture the quality or impact of the research products.
4c. Third-Party Audits

Not applicable
 Record Last Updated: 02/13/2012 01:16:50 PM
  Goal 1                                              Objective 1                                         Measure AD1
  Measure Code : AD1 - Cumulative number of major scientific models and decision
  support tools used in implementing environmental management programs that
  integrate climate change science data
  Office of the Administrator (OA)
   1. Measure and DQR Metadata	
   Goal Number and Title                         1 - Taking Action on Climate Change and Improving Air Quality
   Objective Number and Title	1-Address Climate Change	
   Sub-Objective Number and Title                  1 - Address Climate Change
   Strategic Target Code and Title                   3 - EPA will integrate climate change science trend and scenario information into five major scientific
   Managing Office                                Office of Policy
   Performance Measure Term Definitions
Consistent with this approach, EPA is defining a major scientific model and/or decision support tool as one that may influence a major
agency rule or action. For example, the BASINS CAT model is a decision support tool that enhances the ability of U.S. cities and
communities with combined sewer systems to meet the requirements of EPA's Combined Sewer Overflow (CSO) Control Policy [1]. In
1996, EPA estimated the cost of CSO control, consistent with the CSO Control Policy, to be $44.7 billion (1996 dollars). For this reason,
the BASINS CAT model is an appropriate decision support tool to include.


A program is defined as multiple projects. For example, the Great Lakes Restoration Initiative (GLRI) is a program that includes funding for
grants. This EPA-led interagency initiative targets the most significant problems in the region, including invasive aquatic species, non-point
source pollution, and contaminated sediment. It has outcome-oriented performance goals and measures, many of which are
climate-sensitive. To ensure the overall success of the initiative, it is imperative that consideration of climate change and climate adaptation
be integrated into GLRI grants and projects. Aside from GLRI, other climate-sensitive programs across the Agency include those for land
revitalization and cleanup, air quality monitoring and protection, wetlands and water protection and restoration to name a few. Greenhouse
gas mitigation programs and projects would not be included in this total.

Climate change data needs to be integrated into the tool or model.

The 2011-2015 Strategic Plan is the driver for this annual measure.

Here is the adaptation website: http://www.epa.gov/climatechange/effects/adaptation.html


2. Data Definition and Source Reporting
2a. Original Data Source
Data will be submitted to the Office of Policy (OP) from environmental and research programs across the Agency. The data originate from
each of the National Program Offices and Regional Offices; they collect the information from their program contacts.	
2b. Source Data Collection	

The data are submitted to the Senior Advisor for Climate Adaptation in the Office of Policy. The climate adaptation advisor will determine
whether the result meets the criteria.
2c. Source Data Reporting
The Program Offices (OAR, OW, OCSPP, OSWER, OITA) and Regional Offices contact the climate change adaptation advisor to
report this information. It is tracked in a spreadsheet maintained by the Office of Policy (OP).


3. Information Systems and Data  Quality Procedures	
3a. Information Systems
Performance data are tracked in a spreadsheet and maintained by the Office of Policy (OP). This is source data from the Program Offices
and Regional Offices, and it is summed to be entered into PERS. Information system integrity standards do not apply. The Budget
Automation System (BAS) is the final step for data entry.
3b. Data Quality Procedures	

The climate adaptation advisor verifies the information with his climate change adaptation team through conversations with the Program
and Regional Offices, and then has one of his staff enter the data into BAS.	
3c. Data Oversight	

EPA Senior Advisor for Climate Adaptation	
3d. Calculation Methodology
The "scientific models/decisions support tools" measure is calculated by assigning a numeric value of one (1) to any major scientific model
or decision support tool. This is an annual, not cumulative measure. A model/tool may only be counted once.
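
As a minimal illustrative sketch of the count-once convention in Python (the entries below are hypothetical and do not reproduce the
Office of Policy spreadsheet):

    # Models/tools reported by Program and Regional Offices during the year.
    reported = [
        "BASINS CAT",           # reported by one office
        "BASINS CAT",           # reported again by another office
        "Hypothetical Model X",
    ]

    # Each qualifying model/tool carries a value of one and may only be
    # counted once, so duplicates collapse before summing.
    annual_count = len(set(reported))
    print(annual_count)  # -> 2
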


4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting

Climate Change Adaptation Science Advisor	
4b. Data Limitations/Qualifications	

It is difficult to firmly define when a particular scientific model or decision-support tool has been adequately integrated into an
environmental management program. Whether this has adequately been done requires verification by the climate change adaptation
advisor. Some programs might not be captured in this measure. The final tabulation is a conservative count of the work completed. There is
no data lag. A model/tool may only be counted once.	
4c. Third-Party Audits

Not applicable	
 Record Last Updated: 02/13/2012 01:16:59 PM
  Goal 1                                             Objective 1                                        Measure AD2
  Measure Code : AD2 - Cumulative number of major rulemakings with climate-sensitive
  environmental impacts, and within existing authorities, that integrate
  climate change science data
  Office of the Administrator (OA)
   1. Measure and DQR Metadata	
   Goal Number and Title                         1 - Taking Action on Climate Change and Improving Air Quality
   Objective Number and Title	1-Address Climate Change	
   Sub-Objective Number and Title                  1 - Address Climate Change
   Strategic Target Code and Title                   4 - EPA will account for climate change by integrating climate change science trend and scenario information
   Managing Office                                Office of Policy
   Performance Measure Term Definitions
EPA is defining a "major" rule based upon guidelines published by the Office of Management and Budget. Specifically, a major rule is one
that has an annual effect on the economy of $100 million or more. Also, the term "rule" refers to a proposed rule.

Climate change data needs to be considered and integrated into the rulemaking process.

The 2011-2015 Strategic Plan is the driver for this annual measure.

Here is the adaptation website: http://www.epa.gov/climatechange/effects/adaptation.html


 2. Data Definition and Source Reporting	
 2a. Original Data Source
 Data will be submitted to the Office of Policy (OP) from  environmental and research programs across the Agency. The data originate from
 each of the National Program Offices; they collect the information from their program contacts.
 2b. Source Data Collection

 The data are submitted to the Senior Advisor for Climate Adaptation in the Office of Policy. The climate change advisor will determine
 whether the result meets the criteria.	
 2c. Source Data Reporting
 The programs (OAR, OW, OCSPP, OSWER) will contact the climate change adaptation advisor to report this information. The
 information is maintained by the Office of Policy (OP).


3. Information Systems and Data Quality Procedures	
3a. Information Systems
Performance data are tracked in a spreadsheet and maintained by the Office of Policy (OP). This is source data from the programs and is
summed to be entered into PERS. Information system integrity standards do not apply. The Budget Automation System (BAS) is the final
step for data entry.
3b. Data Quality Procedures	

The climate change adaptation advisor verifies the information with his climate change adaptation team through conversations with the
programs and then has one of his staff enter the data into BAS.	
3c. Data Oversight	

EPA Senior Advisor for Climate Adaptation
3d. Calculation Methodology
The "proposed rule making" measure is calculated by assigning a numeric value of one (1) to any major rule proposed. This is an annual,
not cumulative measure A rule may only be counted once.


4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting

Climate Change Adaptation Science Advisor	
4b. Data Limitations/Qualifications	

There are different ways of accounting for climate change in a rulemaking process (e.g., in the rule itself; in guidance issued for
implementing the rule). Whether climate change has been adequately accounted for in a rulemaking process requires verification by the
climate change adaptation advisor. Some programs might not be captured in this measure. The final tabulation is a conservative count of
the work completed. There is no data lag. A rule may only be counted once.
4c. Third-Party Audits

Not applicable	
 Record Last Updated: 02/13/2012 01:16:59 PM
  Goal 1                                              Objective 1                                        Measure AD3
  Measure Code : AD3 - Cumulative number of major grant, loan, contract, or
  technical assistance agreement programs that integrate climate science data into
  climate-sensitive projects that have an environmental outcome
  Office of the Administrator (OA)
   1. Measure and DQR Metadata
   Goal Number and Title                          1 - Taking Action on Climate Change and Improving Air Quality
   Objective Number and Title                     1 - Address Climate Change
   Sub-Objective Number and Title                 1 - Address Climate Change
   Strategic Target Code and Title                5 - EPA will build resilience to climate change by integrating considerations of climate change impacts
   Managing Office                                Office of Policy
   Performance Measure Term Definitions
EPA will measure the number of grant, loan, contract, or technical assistance agreements. The term project is defined as an individual
funding agreement and a program is defined as multiple projects. For example, the Great Lakes Restoration Initiative (GLRI) is a program
that includes funding for grants. This EPA-led interagency initiative targets the most significant problems in the region, including invasive
aquatic species, non-point source pollution, and contaminated sediment. It has outcome-oriented performance goals and measures, many of
which are climate-sensitive. To ensure the overall success of the initiative, it is imperative that consideration of climate change and climate
adaptation be integrated into GLRI grants and projects. Aside from GLRI, other climate-sensitive programs across the Agency include those
for land revitalization and cleanup, air quality monitoring and protection, wetlands and water protection and restoration to name a few.
Greenhouse gas mitigation programs and projects would not be included in this total.

Climate change data needs to  be integrated into climate-sensitive projects funded through EPA grants, loans, contracts,  or technical
assistance agreements.

The 2011-2015 Strategic Plan is the driver for this annual measure.

Here is the adaptation website: http://www.epa.gov/climatechange/effects/adaptation.html
 2. Data Definition and Source Reporting
 2a. Original Data Source
Data will be submitted to the Office of Policy (OP) from environmental and research programs across the Agency. The data originate from
each of the National Program Offices and Regional Offices; they collect the information from their program contacts.	
2b. Source Data Collection
The data are submitted to the Senior Advisor for Climate Adaptation in the Office of Policy. The data are entered into a spreadsheet. The
climate change adaptation advisor will determine whether the result meets the criteria.	
2c. Source Data Reporting
The Program Offices (OAR, OW, OCSPP, OSWER, OITA) and Regional Offices will contact the climate change adaptation advisor to
report this information. Tracked in a spreadsheet and maintained by the Office of Policy (OP).	


3. Information Systems and Data Quality Procedures	
3a. Information Systems	
Performance data are tracked in a spreadsheet and maintained by the Office of Policy (OP). This is source data from the Program Offices
and Regional Offices, and it is summed to be entered into PERS. Information system integrity standards do not apply. The Budget
Automation System (BAS) is the final step for data entry.
3b. Data Quality Procedures	

The climate change adaptation advisor verifies the information with his climate change adaptation team through conversations with the
programs and then has one of his staff enter the data into BAS.	
3c. Data Oversight	

EPA Senior Advisor for Climate Adaptation
3d. Calculation Methodology
The "program" measure is calculated by assigning a numeric value of one (1) to any major programs that integrate climate change data.
This is an annual, not cumulative measure A program may only be counted once.	


4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting

Climate Change Adaptation Science Advisor	
4b. Data Limitations/Qualifications	

It is difficult to firmly define when climate change data have been adequately integrated into the grants, loans, contracts, or technical
assistance agreements used in an environmental management program. Whether this has adequately been done requires verification by the
climate change adaptation advisor. Some programs might not be captured in this measure. The final tabulation is a conservative count of
the work completed. There is no data lag. A program may only be counted once.	
4c. Third-Party Audits

Not applicable
 Record Last Updated: 02/13/2012 01:16:59 PM

  Goal 1                                                 Objective 1                                           Measure G02
  Measure Code : G02 - Million metric tons of carbon equivalent (mmtco2e) of
  greenhouse gas reductions in the buildings sector.
  Office of Air and Radiation (OAR)
   1. Measure and DQR Metadata
   Goal Number and Title                           1 - Taking Action on Climate Change and Improving Air Quality
   Objective Number and Title	1-Address Climate Change	
   Sub-Objective Number and Title                   1 - Address Climate Change
   Strategic Target Code and Title                    2 - Additional programs from across EPA will promote practices to help Americans save energy and conserve
   Managing Office                                  Office of Atmospheric Programs
   Performance Measure Term Definitions
Carbon equivalent of Greenhouse Gas Emissions: Carbon dioxide (CO2) is the base of the global warming potential (GWP) system and
has a GWP of 1. All other greenhouse gases' ability to increase global warming is expressed in terms of CO2. The CO2e for a gas is
derived by multiplying the tons of the gas by that gas's GWP. This is commonly expressed as "million metric tons of carbon dioxide
equivalents" (MMTCO2e).

Buildings Sector: The Buildings Sector includes the following Energy Star partnerships: Energy Star Labeling, Energy Star Homes, and the
Energy Star Buildings programs. In the Energy Star Labeling program, the American public continues to look to ENERGY STAR as the
national symbol for energy efficiency to inform purchasing choices, save money on utility bills, and protect the environment. In 2010,
Americans purchased about 200 million products that had earned the ENERGY STAR across more than 60 product categories for a
cumulative total of about 3.5 billion ENERGY STAR qualified products purchased since 2000. Qualified products—including appliances,
heating and cooling equipment, consumer electronics, office equipment, lighting, and more—offer consumers savings of as much as 65
percent relative to standard models while providing the features and functionality consumers expect.  In the Energy Star Homes program we
focus on the 17 percent of the GHGs emitted in the United States that are attributed to the energy we use to heat, cool, and light our homes,
as well as power the appliances and electronics in them. By making energy-efficient choices in the construction of new homes and the
improvement of existing homes, American homeowners, renters, homebuilders, and home remodelers can lower household utility bills
while helping to protect the environment. Through ENERGY STAR, EPA offers an array of useful tools and resources to households and
the housing industry to increase the energy efficiency of the nation's housing stock. In the the Energy Star Buildings program we focus on
efforts to improve energy efficiency in commercial buildings across the country by 20 percent over the next decade. Through the ENERGY
STAR program, EPA is already helping the commercial building sector improve energy  efficiency in the places where consumers work,
play, and learn. In turn, these efforts will help create jobs, save money, reduce dependence on foreign oil, and contribute to cleaner air and
the protection of people's health. These and future efficiency efforts are of critical importance,  as commercial buildings are responsible for
approximately 20 percent of all energy consumption in the United States.



2.  Data Definition and Source  Reporting	
2a. Original Data Source
Carbon emissions related to baseline energy use (i.e., "business as usual," without the impact of EPA's voluntary climate programs) come
from the Energy Information Administration (EIA) and from EPA's Integrated Planning Model (IPM) of the U.S. electric power sector.
Baseline data for non-carbon dioxide (CO2) emissions, including nitrous oxide and other high global warming potential gases, are
maintained by EPA. The non-CO2 data are compiled with input from industry and also independently from partners' information.

Data collected by EPA's voluntary programs include partner reports on facility-specific improvements (e.g., space upgraded,
kilowatt-hours (kWh) reduced), national market data on shipments of efficient products, and engineering measurements of equipment
power levels and usage patterns.

Additional Information:
The accomplishments of many of EPA's voluntary programs are documented in the Climate Protection Partnerships Division Annual
Report. The most recent version is ENERGY STAR and Other Climate Protection Partnerships 2008 Annual Report.
http://www.energystar.gov/ia/partners/annualreports/annual_report_2008.pdf	
2b. Source Data Collection
Avoided emissions of GHGs are determined using marginal emissions factors for CO2 equivalency based on factors established as part of
the U.S. government's reporting process to the UN Framework Convention on Climate Change, as well as historical emissions data from
EPA's eGRID database. For future years, EPA uses factors derived from energy efficiency scenario runs of the integrated utility dispatch
model, Integrated Planning Model (IPM®).	
2c. Source Data Reporting	
Carbon emissions related to baseline energy use (i.e., "business as usual," without the impact of EPA's voluntary climate programs) come
from the Energy Information Administration (EIA) and from EPA's Integrated Planning Model (IPM) of the U.S. electric power sector.
Baseline data for non-carbon dioxide (CO2) emissions, including nitrous oxide and other high global warming potential gases, are
maintained by EPA. The non-CO2 data are compiled with input from industry and also independently from partners' information.

Data collected by EPA's voluntary programs include partner reports on facility-specific improvements (e.g., space upgraded,
kilowatt-hours (kWh) reduced), national market data on shipments of efficient products, and engineering measurements of equipment
power levels and usage patterns.
3.  Information Systems and Data Quality Procedures	
3a.  Information Systems	
Climate Protection Partnerships Division Tracking System.  The tracking system's primary purpose is to maintain a record of the annual
greenhouse gas emissions reduction goals and accomplishments for the voluntary climate program using information from partners and
other sources.

The Climate Protection Partnerships Division Tracking System contains transformed data.

The Climate Protection Partnerships Division Tracking System meets relevant EPA standards for information system integrity.
3b. Data Quality Procedures	
ENERGY STAR program procedures for oversight, review and quality assurance include the following. To participate, product
manufacturers and retailers enter into formal partnership agreements with the government and agree to adhere to the ENERGY STAR
Identity Guidelines, which describe how the ENERGY STAR name and mark may be used. EPA continually monitors the use of the brand
in trade media, advertisements, and stores and on the Internet. The Agency also conducts biannual onsite store-level assessments of
ENERGY STAR qualified products on the stores' shelves to ensure the products are presented properly to consumers.  To ensure that
ENERGY STAR remains a trusted symbol for environmental protection through superior efficiency, EPA completed comprehensive
enhancements of the product qualification and verification processes. Third-party certification of ENERGY STAR products went into
effect, as scheduled, on January 1, 2011. Before a product can be labeled with the ENERGY STAR under the new requirements, its
performance must be certified by an EPA-recognized third party based on testing in an EPA-recognized lab. In addition, ENERGY STAR
manufacturer partners must participate in verification testing programs run by the approved certification bodies. By the end of 2010, EPA
had recognized 21 accreditation bodies, 132 laboratories, and 15 certification bodies.
Enforcing proper use of the ENERGY STAR mark is essential to maintaining the integrity of the program. As a result of multiple
off-the-shelf testing efforts, EPA disqualified 17 products from the ENERGY STAR program in 2010 for failure to meet performance
standards. Manufacturers of those products were required to discontinue use of the label and take additional steps to limit product exposure
in the market. In an effort to ensure fair and consistent commitment among ENERGY STAR partners, EPA also took steps this year to
suspend the partner status of manufacturers failing to comply with program requirements.

Peer-reviewed carbon-conversion factors are used to ensure consistency with generally accepted measures of greenhouse gas (GHG)
emissions, and peer-reviewed methodologies are used to calculate GHG reductions from these programs.	
3c. Data Oversight
The Energy Star Labeling Branch is responsible for overseeing (1) source data reporting and (2) the information systems utilized in
producing the performance result for the Energy Star Labeling program. The Energy Star Residential Branch is responsible for overseeing
(1) source data reporting and (2) the information systems utilized in producing the performance result for the Energy Star Homes program.
The Energy Star Commercial & Industrial Branch is responsible for overseeing (1) source data reporting and (2) the information systems
utilized in producing the performance result for the Energy Star Commercial Buildings program.
3d. Calculation Methodology
Explanation of Assumptions: Most of the voluntary climate programs' focus is on energy efficiency. For these programs, EPA estimates
the expected reduction in electricity consumption in kilowatt-hours (kWh). Emissions prevented are calculated as the product of the kWh of
electricity saved and an annual emission factor (e.g., million metric tons of carbon equivalent (MMTCE) prevented per kWh). Other programs
focus on directly lowering greenhouse gas emissions (e.g., non-CO2 Partnership programs, Landfill Methane Outreach, and Coalbed Methane
Outreach); for these, greenhouse gas emission reductions are estimated on a project-by-project basis.
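As a minimal sketch of the two estimation patterns just described (not the Agency's actual tracking code; the emission factor and project figures are invented for illustration):

    # Sketch of the two patterns above; all numbers are hypothetical examples.
    EMISSION_FACTOR_MMTCE_PER_KWH = 1.6e-10  # assumed annual emission factor

    def efficiency_program_reductions_mmtce(kwh_saved):
        """Energy-efficiency programs: kWh saved x annual emission factor."""
        return kwh_saved * EMISSION_FACTOR_MMTCE_PER_KWH

    def direct_program_reductions_mmtce(project_estimates):
        """Direct-reduction programs (e.g., methane outreach): sum of
        project-by-project estimates, already in MMTCE."""
        return sum(project_estimates)

    total = (efficiency_program_reductions_mmtce(2.0e10)
             + direct_program_reductions_mmtce([0.4, 0.7]))
    print(round(total, 2), "MMTCE prevented")  # -> 4.3 MMTCE prevented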

Explanation of the Calculations: The Integrated Planning Model, used to develop baseline data for carbon emissions, is an important
analytical tool for evaluating emission scenarios affecting the U.S. power sector.

Baseline information is discussed at length in the U.S. Climate Action Report 2002. The report includes a complete chapter dedicated to
the U.S. greenhouse gas inventory (sources, industries, emissions, volumes, changes, trends, etc.). A second chapter addresses projected
greenhouse gases in the future (model assumptions, growth, sources, gases, sectors, etc.). Please see http://www.gcrio.org/CAR2002 and
www.epa.gov/globalwarming/publications/car/index.html.

Unit of Measure: Million metric tons of carbon equivalent (MMTCE) of greenhouse gas emissions

Additional information:

The IPM has an approved quality assurance project plan that is available from EPA's program office.

Background information on the IPM can be found on the website for EPA's Council for Regulatory Environmental Modeling:
http://cfpub.epa.gov/crem/knowledge_base/crem_report.cfm?deid=74919
4.  Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Branch Chief, Energy Star Labeling Branch is responsible for the Energy Star Labeling program.
Branch Chief, Energy Star Residential Branch is responsible for the Energy Star Homes program.
Branch Chief, Energy Star Commercial & Industrial Branch is responsible for the Energy Star Commercial Buildings program.	
4b. Data Limitations/Qualifications
These are indirect measures of GHG emissions (carbon conversion factors and methods to convert material-specific reductions to GHG
emissions reductions). Although EPA devotes considerable effort to obtaining the best possible information on which to evaluate emissions
reductions from its voluntary programs, errors in the performance data could be introduced through uncertainties in carbon conversion
factors, engineering analyses, and econometric analyses. Comprehensive documentation regarding the IPM and uncertainties associated
with it can be found at the IPM website: http://www.epa.gov/airmarkets/progsregs/epa-ipm/.
Also, the voluntary nature of the programs may affect reporting.
4c. Third-Party Audits
The Administration regularly evaluates the effectiveness of its climate programs through interagency evaluations. The second such
interagency evaluation, led by the White House Council on Environmental Quality, examined the status of U.S. climate change programs.
The review included participants from EPA and the Departments of State, Energy, Commerce, Transportation, and Agriculture. The results
were published in the U.S. Climate Action Report 2002 as part of the United States' submission to the Framework Convention on Climate
Change (FCCC). The previous evaluation was published in the U.S. Climate Action Report 1997. A 1997 audit by EPA's Office of the
Inspector General concluded that the climate programs examined "used good management practices" and "effectively estimated the impact
their activities had on reducing risks to health and the environment..."
 Record Last Updated: 02/13/2012 01:16:48 PM
  Goal 1                                             Objective 1                                         Measure G06
  Measure Code :  G06 - Million metric tons of carbon equivalent (mmtco2e) of
  greenhouse gas reductions in the transportation sector.
  Office of Air and Radiation  (OAR)
   1. Measure and DQR Metadata
Goal Number and Title                 1 - Taking Action on Climate Change and Improving Air Quality
Objective Number and Title            1 - Address Climate Change
Sub-Objective Number and Title        1 - Address Climate Change
Strategic Target Code and Title       2 - Additional programs from across EPA will promote practices to help Americans save energy and conserv
Managing Office                       Office of Transportation and Air Quality
Performance Measure Term Definitions
Carbon equivalent of Greenhouse Gas Emissions: Carbon dioxide (CO2) is the base of the global warming potential (GWP) system and
has a GWP of 1. All other greenhouse gases' ability to increase global warming is expressed in terms of CO2. The CO2e for a gas is derived
by multiplying the tons of the gas by that gas's GWP. Commonly expressed as "million metric tons of carbon dioxide
equivalents" (MMTCO2e).

Transportation Sector: Mobile Sources


 2. Data Definition and Source Reporting
 2a. Original Data Source
Carbon emissions related to baseline energy use (e.g., "business-as-usual" without the impact of EPA's voluntary climate programs) come
from the Energy Information Administration (EIA) and from EPA's Integrated Planning Model (IPM) of the U.S. electric power sector. Baseline
data for non-carbon dioxide (CO2) emissions, including nitrous oxide and other high global warming potential gases, are maintained by
EPA. The non-CO2 data are compiled with input from industry and, independently, from partners' information.

Data collected by EPA's voluntary programs include partner reports on facility-specific improvements (e.g., space upgraded,
kilowatt-hours (kWh) reduced), national market data on shipments of efficient products, and engineering measurements of equipment
power levels and usage patterns.

 Additional Information:
 The accomplishments of many of EPA's voluntary programs are documented in the Climate Protection Partnerships Division Annual
 Report. The most recent version is ENERGY STAR and Other Climate Protection Partnerships 2008 Annual Report.
 http://www.energystar.gov/ia/partners/annualreports/annual_report_2008.pdf
2b. Source Data Collection
Partners provide information on freight transportation activity. Data collection is ongoing, as new partners join and existing partners are
retained.
2c. Source Data Reporting
Data are submitted through EPA-provided assessment tools. Data are submitted annually and entered into a program database.


3. Information Systems and Data Quality Procedures	
3a. Information Systems
Climate Protection Partnerships Division Tracking System.  The tracking system's primary purpose is to maintain a record of the annual
greenhouse gas emissions reduction goals and accomplishments for the voluntary climate program using information from partners and
other sources.

The database contains source data from partners.
3b. Data Quality Procedures
Partners contribute actual emissions data biannually after their facility-specific improvements, but these emissions data are not used in
tracking the performance measure. EPA, however, validates the estimates of greenhouse gas reductions against the actual emissions data
received.

For transportation emissions, data are calculated from operational activity (fuel use, miles driven, etc.). Partner activity metrics were
developed and peer reviewed according to EPA peer review requirements. Peer-reviewed carbon-conversion factors are used to ensure
consistency with generally accepted measures of greenhouse gas (GHG) emissions, and peer-reviewed methodologies are used to calculate
GHG reductions from these programs.	
3c. Data Oversight
Supervisory EPS, Transportation and Climate Division (TCD) is the program manager, with overall oversight responsibility.
Environmental Scientist, TCD is responsible for maintaining program goals and performance results.
Environmental Engineer, TCD is responsible for maintaining the information systems (partner forms and database).
3d. Calculation Methodology
Explanation of Assumptions: Most of the voluntary climate programs' focus is on energy efficiency. For these programs, EPA estimates
the expected reduction in electricity consumption in kilowatt-hours (kWh). Emissions prevented are calculated as the product of the kWh of
electricity saved and an annual emission factor (e.g., million metric tons of carbon equivalent (MMTCE) prevented per kWh). Other programs
focus on directly lowering greenhouse gas emissions (e.g., non-CO2 Partnership programs, Landfill Methane Outreach, and Coalbed Methane
Outreach); for these, greenhouse gas emission reductions are estimated on a project-by-project basis. Other programs focused on
transportation (e.g., SmartWay) calculate emissions reductions as the product of fuel saved and an annual emission factor (e.g., metric tons
of carbon equivalent prevented per gallon of fuel saved).
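For the transportation pattern, a minimal sketch follows; the per-gallon factor is an invented placeholder, not an EPA value.

    # Transportation programs: reductions = fuel saved x per-gallon factor.
    MTCE_PER_GALLON = 0.0024  # assumed metric tons carbon equivalent per gallon

    def transport_reductions_mmtce(gallons_saved):
        """Convert gallons of fuel saved to million metric tons carbon equivalent."""
        return gallons_saved * MTCE_PER_GALLON / 1e6

    print(transport_reductions_mmtce(500e6))  # 500 million gallons -> 1.2 MMTCE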

Explanation of the Calculations: The Integrated Planning Model, used to develop baseline data for carbon emissions, is an important
analytical tool for evaluating emission scenarios affecting the U.S. power sector.

Baseline information is discussed at length in the U.S. Climate Action Report 2002. The report includes a complete chapter dedicated to
the U.S. greenhouse gas inventory (sources, industries, emissions, volumes, changes, trends, etc.). A second chapter addresses projected
greenhouse gases in the future (model assumptions, growth, sources, gases, sectors, etc.). Please see http://www.gcrio.org/CAR2002 and
www.epa.gov/globalwarming/publications/car/index.html.

Unit of Measure: Million metric tons of carbon equivalent (MMTCE) of greenhouse gas emissions

Additional information:

The IPM has an approved quality assurance project plan that is available from EPA's program office.

Background information on the IPM can be found on the website for EPA's Council for Regulatory Environmental Modeling:
http://cfpub.epa.gov/crem/knowledge_base/crem_report.cfm?deid=74919
4.  Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Program Analyst, Planning & Budget Office, Office of Transportation and Air Quality oversees the reporting process.	
4b. Data Limitations/Qualifications
These are indirect measures of GHG emissions (carbon conversion factors and methods to convert material-specific reductions to GHG
emissions reductions). Although EPA devotes considerable effort to obtaining the best possible information on which to evaluate emissions
reductions from its voluntary programs, errors in the performance data could be introduced through uncertainties in carbon conversion
factors, engineering analyses, and econometric analyses. Comprehensive documentation regarding the IPM and uncertainties associated
with it can be found at the IPM website:  http://www.epa.gov/airmarkets/progsregs/epa-ipm/. Also, the voluntary nature of the programs
may affect reporting.
4c. Third-Party Audits
The Administration regularly evaluates the effectiveness of its climate programs through interagency evaluations. The second such
interagency evaluation, led by the White House Council on Environmental Quality, examined the status of U.S. climate change programs.
The review included participants from EPA and the Departments of State, Energy, Commerce, Transportation, and Agriculture. The results
were published in the U.S. Climate Action Report 2002 as part of the United States' submission to the Framework Convention on Climate
Change (FCCC). The previous evaluation was published in the U.S. Climate Action Report 1997. A 1997 audit by EPA's Office of the
Inspector General concluded that the climate programs examined "used good management practices" and "effectively estimated the impact
their activities had on reducing risks to health and the environment..."
Record Last Updated: 02/13/2012 01:16:48 PM
  Goal 1                                               Objective 1                                          Measure G16
  Measure Code :  G16 - Million metric tons of carbon equivalent (mmtco2e) of
  greenhouse gas reductions in the industry sector.
  Office of Air and Radiation (OAR)
   1. Measure and DQR Metadata
Goal Number and Title                 1 - Taking Action on Climate Change and Improving Air Quality
Objective Number and Title            1 - Address Climate Change
Sub-Objective Number and Title        1 - Address Climate Change
Strategic Target Code and Title       2 - Additional programs from across EPA will promote practices to help Americans save energy and conserv
Managing Office                       Office of Atmospheric Programs
   Performance Measure Term Definitions
Carbon equivalent of Greenhouse Gas Emissions: Carbon dioxide (CO2) is the base of the global warming potential (GWP) system and
has a GWP of 1. All other greenhouse gases' ability to increase global warming is expressed in terms of CO2. The CO2e for a gas is derived
by multiplying the tons of the gas by that gas's GWP. Commonly expressed as "million metric tons of carbon dioxide
equivalents" (MMTCO2e).

Industry Sector: The industrial sector is an important part of the U.S. economy: manufacturing goods valued at nearly $5.5 trillion,
contributing over 11 percent to the U.S. GDP, and providing more than 12.7 million jobs paying an average of $47,500 annually. The
industrial sector also generates more than a quarter of the nation's annual GHG emissions.  Through EPA's voluntary programs, EPA
enables the industrial sector to cost-effectively reduce GHG emissions.
 2. Data Definition and Source Reporting	
 2a. Original Data Source
Carbon emissions related to baseline energy use (e.g., "business-as-usual" without the impact of EPA's voluntary climate programs) come
from the Energy Information Administration (EIA) and from EPA's Integrated Planning Model (IPM) of the U.S. electric power sector. Baseline
data for non-carbon dioxide (CO2) emissions, including nitrous oxide and other high global warming potential gases, are maintained by
EPA. The non-CO2 data are compiled with input from industry and, independently, from partners' information.

Data collected by EPA's voluntary programs include partner reports on facility-specific improvements (e.g., space upgraded,
kilowatt-hours (kWh) reduced), national market data on shipments of efficient products, and engineering measurements of equipment
power levels and usage patterns.


Additional Information:
The accomplishments of many of EPA's voluntary programs are documented in the Climate Protection Partnerships Division Annual
Report. The most recent version is ENERGY STAR and Other Climate Protection Partnerships  2008 Annual Report.
http://www.energystar.gov/ia/partners/annualreports/annual_report_2008.pdf
2b. Source Data Collection
See Section 3b
2c. Source Data Reporting
See Section 3b
3.  Information Systems and Data Quality Procedures
3a. Information Systems
Climate Protection Partnerships Division Tracking System. The tracking system's primary purpose is to maintain a record of the annual
greenhouse gas emissions reduction goals and accomplishments for the voluntary climate program using information from partners and
other sources.

The Climate Protection Partnerships Division Tracking System contains transformed data.

The Climate Protection Partnerships Division Tracking System meets relevant EPA standards for information system integrity.
3b. Data Quality Procedures
The Industry sector includes a variety of programs. Data quality procedures vary by program as follows:

The Combined Heat and Power (CHP) Partnership dismantles the market barriers stifling investment in environmentally
beneficial CHP projects. Program partners such as project owners voluntarily provide project-specific information on newly operational
CHP projects to EPA. These data are screened and any issues resolved.  Energy savings are determined on a project-by-project basis, based
on fuel type, system capacity, and operational profile. Estimates of the use of fossil and renewable fuels are developed, as well as the
efficiency of thermal and electrical use or generation, as appropriate. Emissions reductions are calculated on a project-by-project basis to
reflect the greater efficiency of onsite CHP. Avoided emissions of GHGs from more efficient energy generation are determined using
marginal emissions factors derived from energy efficiency scenario runs of IPM, and displaced emissions from boiler-produced thermal
energy are developed through engineering estimates. In addition,  emissions reductions may include avoided transmission and distribution
losses, as appropriate. Only the emissions reductions from projects that meet the assistance criteria for the program are included in the
program benefit estimates. EPA also addresses the potential for double counting benefits between this and other partnerships by having
program staff meet annually to identify and resolve any overlap issues.

The Green Power Partnership boosts supply of clean energy by helping U.S. organizations purchase electricity from eligible renewable
generation sources. As a condition of partnership, program partners submit data annually on their purchases of qualifying green power
products. These data are screened and any issues resolved. Avoided emissions of GHGs are determined using marginal emissions factors
for CO2 derived from scenario runs of IPM. The potential for double counting, such as counting green power purchases that may be
required as part of a renewable portfolio standard or may rely on resources that are already part of the system mix, is addressed through a
partnership requirement that green power purchases be incremental to what is already required. EPA estimates that the vast majority of the
green power purchases made by program partners are due to the partnership, as partners comply with aggressive green power procurement
requirements (usually at incremental cost) to remain in the program. Further, EPA estimates that its efforts to foster a growing voluntary
green power market have likely led to additional voluntary green power purchases that have not been reported through the program.

EPA's methane programs facilitate recovering methane from landfills, natural gas extraction systems, agriculture, and coal mines, as well
as using methane as a clean energy resource. The expenditures used in the program analyses include the capital costs agreed to by partners
to bring projects into compliance with program specifications and any additional operating costs engendered by program participation.

Within the Natural Gas STAR Program, as a condition of partnership, program partners submit implementation plans to EPA describing the
emissions reduction practices they plan to implement and evaluate. In addition, partners submit progress reports detailing specific
emissions reduction activities and accomplishments each year. EPA does not attribute all reported emissions reductions to Natural Gas
STAR. Partners may only include actions that were undertaken voluntarily, not those reductions attributable to compliance with existing
regulations. Emissions reductions are estimated by the partners either from direct before-and-after measurements or by applying
peer-reviewed emissions reduction factors.

Within the Landfill Methane  Outreach Program, EPA maintains a comprehensive database of the operational data on landfills and landfill
gas energy projects in the United States. The data are updated frequently based on information submitted by industry, the Landfill Methane
Outreach Program's (LMOP's) outreach efforts, and other sources.  Reductions of methane that are the result of compliance with EPA's air
regulations are not included in the  program estimates. In addition, only the emissions reductions from projects that meet the LMOP
assistance criteria are included in the program benefit estimates. EPA uses emissions factors that are appropriate to the project. The factors
are based on research, discussions  with experts in the landfill gas industry, and published references.

Within the Coalbed Methane Outreach Program, through collaboration with the U.S. Mine Safety & Health Administration, state oil and
gas commissions, and the mining companies themselves, EPA collects mine-specific data annually and estimates the total methane emitted
from the mines and the quantity of gas recovered and used. There are no regulatory requirements for recovering and using coal mine
methane (CMM); such efforts are entirely voluntary. EPA estimates CMM recovery attributable to its program activities on a
mine-specific basis, based on the program's interaction with each mine.

Within the Voluntary Aluminum Industry Partnership (VAIP) program, partners agree to report aluminum production and anode effect
frequency and duration in order to estimate annual fluorinated greenhouse gas (FGHG) emissions. Reductions are calculated by comparing
current emissions to a business-as-usual (BAU) baseline that uses the industry's 1990 emissions rate. Changes in the emissions rate (per ton
of production) are used to estimate the annual GHG emissions and reductions that are a result of the program. The aluminum industry began
making significant efforts to reduce FGHG emissions as a direct result of EPA's climate partnership program. Therefore, all reductions
achieved by partners are assumed to be the result of the program.
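A sketch of this rate-based accounting follows, under the stated assumption that the full rate improvement since 1990 is attributed to the program; the rates and production figures are hypothetical.

    # Reductions = production x (1990 BAU emissions rate - current rate).
    BASELINE_RATE_1990 = 1.8  # assumed tons CO2e per ton of aluminum (BAU)

    def vaip_reductions_tons(production_tons, current_rate):
        return production_tons * (BASELINE_RATE_1990 - current_rate)

    # 2 million tons of production at a current rate of 1.1 tCO2e/ton:
    print(vaip_reductions_tons(2_000_000, 1.1))  # -> about 1.4 million tons CO2e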

Within the HFC-23 Emission Reduction Program, program partners report HCFC-22 production and HFC-23 emissions to a third party that
aggregates the estimates and submits the total estimates for the previous year to EPA.  Reductions are calculated by comparing current
emissions to a BAU baseline that uses the industry's 1990 emissions rate.  Changes in the emissions rate are used to estimate the annual
GHG emissions and reductions that are a consequence of the program. Subsequent to a series of meetings with EPA, industry began
making significant efforts to reduce HFC-23 emissions. All U.S. producers participate in the program; therefore, all reductions achieved by
manufacturers are assumed to be the result of the program.

EPA's Environmental Stewardship Programs include the FGHG Partnership for the Semiconductor Industry and the SF6 Partnerships for
Electric Power Systems and Magnesium Industries. Partners report emissions and emissions reductions based on jointly developed
estimation methods and reporting protocols. Data collection methods are sector specific, and data are submitted  to EPA either directly or
through a designated third party. Reductions are calculated by comparing current emissions to a BAU baseline, using industry-wide or
company-specific emissions rates in a base year. The reductions in emissions rates are used to calculate the overall GHG emissions
reductions from the program. The share of the reductions attributable to EPA's programs is identified based on a detailed review of
program activities and industry-specific information.

Within the Responsible Appliance Disposal (RAD) Program, as a condition of partnership, RAD partners submit annual data to EPA on
their achievements. Submitted data include the number and type of appliances collected and processed as well as the quantity and fate of
the individual components. GHG reductions are calculated from the emissions avoided by recovering refrigerant and foam blowing
agents and recycling durable components, in addition to the energy savings from early appliance retirement under utility programs.

Within the GreenChill Partnership, partner emissions reductions are calculated both year-to-year and in aggregate. Partners set annual
refrigerant emissions reduction goals and submit refrigerant management plans detailing their reduction initiatives.

Peer-reviewed carbon-conversion factors are used to ensure consistency with generally accepted measures of greenhouse gas (GHG)
emissions, and peer-reviewed methodologies are used to calculate GHG reductions from these programs.

3c. Data Oversight
The Non-CO2 Program Branch is responsible for overseeing (1) source data reporting and  (2)  the information systems utilized in producing
the performance result for Methane Programs and the Voluntary Aluminum Industry Partnership program.
The Energy Supply & Industry Branch is responsible for overseeing (1) source data reporting and (2) the information systems utilized in
producing the performance result for the Combined Heat and Power and Green Power Partnership programs.
The Alternatives and  Emissions Reduction Branch is responsible for overseeing (1) source data reporting and (2) the information systems
utilized in producing the performance result for the GreenChill Partnership, the Responsible Appliance Disposal, and the HFC-23 Emission
Reduction Program.

3d. Calculation Methodology
Explanation of Assumptions: Most of the voluntary climate programs' focus is on energy efficiency. For these programs, EPA estimates
the expected reduction in electricity consumption in kilowatt-hours (kWh). Emissions prevented are calculated as the product of the kWh of
electricity saved and an annual emission factor (e.g., million metric tons of carbon equivalent (MMTCE) prevented per kWh). Other programs
focus on directly lowering greenhouse gas emissions (e.g., non-CO2 Partnership programs, Landfill Methane Outreach, and Coalbed Methane
Outreach); for these, greenhouse gas emission reductions are estimated on a project-by-project basis.

Explanation of the Calculations: The Integrated Planning Model, used to develop baseline data for carbon emissions, is an important
analytical tool for evaluating emission scenarios affecting the U.S. power sector.

Baseline information is discussed at length in the U.S. Climate Action Report 2002.  The report includes a complete chapter dedicated to
the U.S. greenhouse gas inventory (sources, industries, emissions, volumes, changes, trends, etc.). A second chapter addresses projected
greenhouse gases in the future (model assumptions, growth, sources, gases, sectors, etc.). Please see http://www.gcrio.org/CAR2002 and
www.epa.gov/globalwarming/publications/car/index.html.

Unit of Measure: Million metric tons of carbon equivalent (MMTCE) of greenhouse gas emissions

Additional information:

The IPM has an approved quality assurance project plan that is available from EPA's program office.

Background information on the IPM can be found on the website for EPA's Council for Regulatory Environmental Modeling:
http://cfpub.epa.gov/crem/knowledge_base/crem_report.cfm?deid=74919
4.  Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Branch Chief, Non-CO2 Program Branch is responsible for overseeing final reporting for Methane Programs and the Voluntary Aluminum
Industry Partnership program.
Branch Chief, Energy Supply & Industry Branch is responsible for overseeing final reporting for the Combined Heat and Power and Green
Power Partnership programs.
Branch Chief, Alternatives and Emissions Reduction Branch is responsible for overseeing final reporting for the GreenChill Partnership,
the Responsible Appliance Disposal, and the HFC-23 Emission Reduction Program.

4b. Data Limitations/Qualifications
These are indirect measures of GHG emissions (carbon conversion factors and methods to convert material-specific reductions to GHG
emissions reductions). Although EPA devotes considerable effort to obtaining the best possible information on which to evaluate emissions
reductions from its voluntary programs, errors in the performance data could be introduced through uncertainties in carbon conversion
factors, engineering analyses, and econometric analyses. Comprehensive documentation regarding the IPM and uncertainties associated
with it can be found at the IPM website: http://www.epa.gov/airmarkets/progsregs/epa-ipm/. Also, the voluntary nature of the programs
may affect reporting.
4c. Third-Party Audits
The Administration regularly evaluates the effectiveness of its climate programs through interagency evaluations. The second such
interagency evaluation, led by the White House Council on Environmental Quality, examined the status of U.S. climate change programs.
The review included participants from EPA and the Departments of State, Energy, Commerce, Transportation, and Agriculture. The results
were published in the U.S. Climate Action Report 2002 as part of the United States' submission to the Framework Convention on Climate
Change (FCCC). The previous evaluation was published in the U.S. Climate Action Report 1997. A 1997 audit by EPA's Office of the
Inspector General concluded that the climate programs examined "used good management practices" and "effectively estimated the impact
their activities had on reducing risks to health and the environment..."	
 Record Last Updated: 02/13/2012 01:16:48 PM
  Goal 1                                                Objective 2                                          Measure 001
  Measure Code :  001 - Cumulative percentage reduction in tons of toxicity-weighted
  (for cancer risk) emissions of air toxics from  1993 baseline.
  Office of Air and Radiation (OAR)
   1. Measure and DQR Metadata
Goal Number and Title                 1 - Taking Action on Climate Change and Improving Air Quality
Objective Number and Title            2 - Improve Air Quality
Sub-Objective Number and Title        2 - Reduce Air Toxics
Strategic Target Code and Title       1 - By 2015, reduce toxicity-weighted (for cancer) emissions of air toxics
Managing Office                       Office of Air Quality Planning and Standards
   Performance Measure Term Definitions
Toxicity-weighted emissions: Toxicity-weighted emissions are an approach to normalize the mass of the HAP release (in tons per year) by
a toxicity factor. The toxicity factors are based on either the HAP's cancer potency or its noncancer potency. The more toxic the HAP, the
more "weight" it receives.

Air toxics: Air toxics, also known as hazardous air pollutants, are those pollutants emitted into the air that are known or suspected to cause
cancer or other serious health effects, such as reproductive effects or birth defects, or adverse environmental effects. As defined by
Section 112 of the Clean Air Act, the EPA currently regulates 187 air toxics released into the environment.


Cancer risk: The probability of contracting cancer over the course of a lifetime (assumed to be 70 years for the purposes of most risk
characterization). A risk level of "N" in a million implies a likelihood that up to "N" people, out of one million equally exposed people,
would contract cancer if exposed continuously (24 hours per day) to the specific concentration over 70 years (an assumed lifetime). This
risk would be an excess cancer risk, in addition to any cancer risk borne by a person not exposed to these air toxics.
 2. Data Definition and Source Reporting
 2a. Original Data Source
 Emissions inventories are from many primary sources.

 The baseline National Toxics Inventory (for base years 1990 - 1993) is based on data collected during the development of Maximum
 Achievable Control Technology (MACT) standards, state and local data, Toxics Release Inventory (TRI) data, and emissions estimates
 using accepted emission inventory methodologies.

The primary sources of data in the 1996 and 1999 toxics emissions inventories are state and local air pollution control agencies and Tribes.
These data vary in completeness, format, and quality.  EPA evaluates these data and supplements them with data gathered while developing
Maximum Achievable Control Technology (MACT) and residual risk standards, industry data, and Toxics Release Inventory data.

The health risk data were obtained from various data sources including EPA, the U.S. Agency for Toxic Substances and Disease Registry,
California Environmental Protection Agency, and the International Agency for Research on Cancer. The numbers from the health risk
database are used for estimating the risk of contracting cancer and the level of hazard associated with adverse health effects other than
cancer.	
2b. Source Data Collection
Source Data Collection Methods: Field monitoring; estimation

Date/time Intervals Covered by Source Data: Each inventory year provides an annual emissions sum for that year

EPA QA requirements/guidance  governing collection: The overarching QA requirements and guidance  are covered in the OAQPS Quality
Assurance Project Plan [insert reference].

EPA's uniform data standards relevant to the NEI for HAPs are the SIC/NAICS, Latitude/Longitude, Chemical Identification, Facility
Identification, Date, Tribal, and Contact Data Standards.

For more information on compliance of the NEI for HAPs with EPA's Information Quality Guidelines and new EPA data standards, please
refer to the following web site for a paper presented at the 2003 Emission Inventory Conference in San Diego. "The Challenge of Meeting
New EPA Data Standards and Information Quality Guidelines in the Development of the 2002 NEI Point Source Data for HAPs", Anne
Pope, et al. www.epa.gov/ttn/chief/conference/ei12/dm/pope.pdf.

Geographical Extent of Source Data: National

Spatial Detail Covered by the Source Data: 2002 and 2005 NEI data are resolved to facility address; earlier inventories are resolved to the county level.

Emissions Data: The National Emissions Inventory (NEI) for Hazardous Air Pollutants (HAPs) includes emissions from large and  small
industrial sources inventoried as  point sources, smaller stationary area and other sources, such as fires inventoried as non-point sources, and
mobile sources.

Prior to the 1999 NEI for HAPs,  there was the National Toxics Inventory (NTI). The baseline NTI (for base years 1990 - 1993)  includes
emissions information for 188 hazardous air pollutants from more than 900 stationary sources and from mobile sources. The baseline NTI
contains county level emissions data and cannot be used for modeling because it does not contain facility specific data.

The 2002 NEI and a slightly modified/updated 2005 NEI for HAPs contain stationary and mobile source estimates.  These inventories also
contain estimates of facility-specific HAP emissions and their source specific parameters such as location (latitude and longitude) and
facility characteristics (stack height, exit velocity, temperature, etc.). Furthermore, a 2005 inventory was developed for the
National Air Toxics Assessment (NATA) (http://www.epa.gov/nata2005/), which provides the most up-to-date source of air toxics emissions
for 2005.

The 2008 NEI contains HAP emissions reported by state, local, and tribal agencies as well as data from the 2008 TRI and EPA data
developed as part of MACT regulation development. Detailed documentation, including QA procedures, is under development as of
January 2012.

Information on EPA's Health Criteria Data for Risk Characterization:
http://www.epa.gov/ttn/atw/toxsource/summary.html
Contents: Tabulated dose response values for long-term (chronic) inhalation and oral
exposures; and values for short-term (acute) inhalation exposure

EPA's Health Criteria Data for Risk Characterization is  a compendium of cancer and noncancer health risk criteria used to develop a risk
metric. This compendium includes tabulated values for long-term (chronic) inhalation for many of the 188 hazardous air pollutants.

Audience: Public

2c. Source Data Reporting
Form/mechanism for receiving data and entering into EPA system: During the development of the 1999 National Emission Inventory
(NEI) for Hazardous Air Pollutants (HAPs),  all primary data submitters and reviewers were required to submit their data and revisions to
EPA in a standardized format using the Agency's Central Data Exchange (CDX). For more information on CDX, please go to the following
web site: www.epa.gov/ttn/chief/nif/cdx.html.

This approach was also used for the 2002 and 2005 NEI. Starting with the 2008 NEI, a new CDX-based mechanism was used called the
Emissions Inventory System (EIS). http://www.epa.gov/ttn/chief/eis/gateway/index.html. The data are transmitted automatically through
CDX into the EIS data system.

Timing and frequency of reporting: Other [NEI data are calculated every 3 years]
3.  Information Systems and  Data Quality Procedures
3a.  Information Systems
The NEI data and documentation are available at the following sites:

Emissions Inventory System (EIS): http://www.epa.gov/ttn/chief/eis/gateway/index.html

Available inventories: 2002 NEI, 2005 NEI, 2008 NEI
Contents: Detailed raw final inventories
Audience: EPA staff and state/local/tribal reporting agencies

The EIS is the interface for state, local, and tribal agencies to upload their emissions inventory data. It works using the Central Data
Exchange (CDX) network to directly transfer data from external agencies to EPA. EIS also allows EPA inventory development staff to
upload data to augment inventories, particularly for HAP emissions, which the states are not required to submit to EPA. EIS includes a
"Quality  Assurance Environment" that allows states to quality assure their data before submitting to EPA. During this phase of use, EIS
runs hundreds of quality assurance checks on the data to ensure that the format (e.g., required data fields) and content (e.g., data codes,
range checks) of the data are valid.  After using the QA environment, states submit using the production environment, which also runs the
QA checks.  EIS further allows reporting agencies to make changes as needed to correct  any data that passed the QA checks but is not
correct. EIS allows both data submitters and all EPA staff to view the data. EIS reports  facilitate the QA and augmentation of the data by
EPA inventory preparation staff. EIS facilitates EPA's automatic compilation of all agency data and EPA data using a hierarchical
selection process, in which EPA staff define the order of precedence for using datasets when multiple emissions values exist from more
than one group (for example, state data versus EPA estimated data).
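To make the format, content, and range checks concrete, a minimal sketch follows; the field names, pollutant codes, and ranges are invented and do not mirror the actual EIS rule set.

    # Minimal QA-check sketch; rules and field names are illustrative only.
    REQUIRED_FIELDS = {"facility_id", "pollutant_code", "annual_tons"}
    VALID_POLLUTANT_CODES = {"CO", "NOX", "SO2", "PM25", "BENZENE"}

    def qa_check(record):
        """Return QA findings for one record (empty list = record passes)."""
        findings = []
        for field in sorted(REQUIRED_FIELDS - set(record)):
            findings.append("error: missing required field " + field)  # format check
        if record.get("pollutant_code") not in VALID_POLLUTANT_CODES:
            findings.append("error: unrecognized pollutant code")  # content check
        if not 0.0 <= record.get("annual_tons", 0.0) <= 1.0e6:
            findings.append("warning: annual_tons outside expected range")  # range check
        return findings

    print(qa_check({"facility_id": "F001", "pollutant_code": "NOX", "annual_tons": 42.0}))
    # -> [] (record passes)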

Clearinghouse for Inventories and Emission Factors (CHIEF):
-Contents: Modeling data files for each state, summary data files for the nation, documentation, and README file
-Audience: State/Local/Tribal agencies, industry, EPA, and the public.
-1999 NEI: http://www.epa.gov/ttn/chief/net/1999inventory.html
Contents: 1999 NEI for HAPs data development materials;
1999 Data Incorporation Plan  - describes how EPA compiled the 1999 NEI for HAPs; QC tool for data submitters; Data Augmentation
Memo describes procedures EPA will use to augment data; 99 NTI Q's and A's provides answers to frequently asked questions; NIF (Input
Format) files and descriptions; CDX Data Submittal Procedures - instructions on how to submit data using CDX; Training materials on
development of HAP emission inventories; and Emission factor documents, databases, and models.
-2002 NEI: http://www.epa.gov/ttn/chief/net/2002inventory.html#inventorydata
-2005 NEI: http://www.epa.gov/ttn/chief/net/2005inventory.html#inventorydata
-2005 NATA: http://www.epa.gov/ttn/atw/nata2005/methods.html#emissions
-2008 NEI: http://www.epa.gov/ttn/chief/net/2008inventory.html

Additional information:
3b. Data Quality Procedures
Starting with the 2008 NEI, EPA has used the Emissions Inventory System (EIS) for collecting and compiling the National Emissions
Inventory (NEI). EIS includes a "Quality Assurance Environment" that allows states to quality assure their data before submitting to EPA.
During this phase of use, EIS runs hundreds of quality assurance checks (~650 as of January 2012) on the data to ensure that the format
(e.g., required data fields) and content (e.g., data codes, emissions range checks, duplicate prevention) of the data are valid. After using the
QA environment, states submit using the production environment, which also runs the QA checks. QA checks are partly documented in
Appendix 5 of the 2008 NEI Implementation Plan available at http://www.epa.gov/ttn/chief/net/neip/index.html and fully documented on
the EIS gateway at https://eis.epa.gov/eis-system-web/content/qaCheck/search.html. Data submitters are given feedback reports containing
errors for missed requirements and warnings for non-required checks, such as emissions range checks. After data are compiled, EPA
inventory preparation staff perform numerous procedures on the data that are not yet automated. In many cases, EPA further consulted
with the external data providers to obtain revised data submissions to correct identified issues. These checks and data improvements
included:
•       Comparison to past inventories including 2005 NATA to identify missing data (facilities, pollutants), particularly for facilities
identified in past efforts as high risk
•       Comparison of latitude longitude locations to county boundaries
•       Augmentation  of HAP emissions data with TRI
•       Augmentation  of HAP emissions data using emission factor ratios
•       Augmentation  of HAP emissions with EPA data developed for MACT and RTR standards
•       Outlier analysis

Detailed documentation, including QA procedures, is under development as of January 2012.

Prior to 2008, EIS was unavailable and so many of the data techniques used by EIS were done in a more manual fashion. The EPA
performed extensive quality assurance/quality control (QA/QC) activities, including checking data provided by other organizations to
improve the quality of the emission inventory.  Some of these activities include: (1) the use of an automated format QC tool to identify
potential errors of data integrity, code values, and range checks; (2) use of geographical information system (GIS) tools to verify facility
locations; and (3) automated content analysis by pollutant, source category and facility to identify potential problems with emission
estimates such as outliers, duplicate sites, duplicate emissions, coverage of a source category, etc. The content analysis includes a variety
of comparative and statistical analyses. The comparative analyses help reviewers prioritize which source categories and pollutants to
review in more detail based on comparisons using current inventory data and prior inventories. The statistical analyses help  reviewers
identify potential outliers by providing the minimum, maximum, average, standard deviation, and selected percentile values based on
current data. Documentation  on procedures used prior to 2008 is most readily available in the documentation for the 2002 NEI, available at
http://www.epa.gov/ttn/chief/net/2002inventory.html.
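The statistical screening described above might be sketched as follows; the 1.5 x IQR rule and the sample values are illustrative choices, not EPA's documented thresholds.

    # Flag potential outliers using summary statistics (illustrative rule).
    import statistics

    def flag_outliers(values):
        """Return values outside 1.5x the interquartile range."""
        q1, _, q3 = statistics.quantiles(values, n=4)
        iqr = q3 - q1
        low, high = q1 - 1.5 * iqr, q3 + 1.5 * iqr
        return [v for v in values if v < low or v > high]

    emissions = [12.0, 15.0, 11.5, 14.2, 13.8, 310.0]  # one suspicious value
    print(flag_outliers(emissions))  # -> [310.0]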

The NTI database contains data fields that indicate if a field has been augmented and identifies the augmentation method. After performing
the content analysis, the EPA contacts data providers to reconcile potential errors. The draft NTI is posted for external review and includes
a README file, with instructions on review of data and submission of revisions, state-by-state modeling files with all modeled data fields,
and summary files  to assist in the review of the data.  One of the summary files includes a comparison of point source data submitted by
different organizations. During the external review of the data, state and local agencies, Tribes, and industry provide external QA of the
inventory.  The EPA evaluates proposed revisions from external reviewers and prepares memos for individual reviewers documenting
incorporation of revisions and explanations if revisions were not incorporated. All revisions are tracked in the database with the source of
original data and sources of subsequent revision.

The external QA and the internal QC of the inventory have resulted in significant changes in the initial emission estimates, as seen by
comparison of the initial draft NEI for HAPs and its final version.  For more information on QA/QC of the NEI for HAPs, please refer to
the following web site for a paper presented at the 2002 Emission Inventory Conference in Atlanta: "QA/QC - An Integral Step in the
Development of the 1999 National Emission Inventory for HAPs", Anne Pope, et al. www.epa.gov/ttn/chief/conference/ei11/qa/pope.pdf

The tables used in the EPA's Health Criteria Data for Risk Characterization (found at www.epa.gov/ttn/atw/toxsource/summary.html) are
compiled assessments from various sources for many of the  188 substances listed as hazardous air pollutants under the Clean Air Act of
1990. The data are reviewed to make sure they support hazard identification and dose-response assessment for chronic exposures as
defined in the National Academy of Sciences (NAS) risk assessment paradigm (www.epa.gov/ttn/atw/toxsource/paradigm.html). Because
the health criteria data were obtained from various sources they are prioritized for use (in developing the performance measure, for
example) according to 1) conceptual consistency with EPA risk assessment guidelines and 2) various levels of scientific peer review. The
prioritization process is aimed at incorporating the best available scientific data.

3c. Data Oversight	
Source Data: Air Quality Assessment Division, Emissions Inventory Assessment Group
Information Systems: Health & Environmental Impacts Division, Air Toxics Assessment Group
3d. Calculation Methodology	
Explanation of the Calculations: As the NEI is only developed every three years, EPA utilizes an emissions modeling system to project
inventories for "off-years" and to project the inventory into the future. This model, the EMS-HAP (Emissions Modeling System for
Hazardous Air Pollutants), can project future emissions by adjusting stationary source emission data to account for growth and emission
reductions resulting from emission reduction scenarios such as the implementation of the Maximum Achievable Control Technology
(MACT) standards.

Information on the Emissions Modeling System for Hazardous Air Pollutants (EMS-HAP):
http://www.epa.gov/scram001/userg/other/emshapv3ug.pdf
http://www.epa.gov/ttn/chief/emch/projection/emshap.html
Contents: 1996 NTI and 1999 NEI for HAPs
Audience: Public
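A hedged sketch of this projection logic follows (base-year emissions grown by an activity factor, then reduced by an assumed control efficiency); the growth and control values are hypothetical, not EMS-HAP inputs.

    # Projected tons = base-year tons x growth x (1 - control efficiency).
    def project_emissions_tons(base_tons, growth_factor, control_efficiency):
        return base_tons * growth_factor * (1.0 - control_efficiency)

    # 100 tons in the base year, 10% activity growth, 40% MACT reduction:
    print(project_emissions_tons(100.0, 1.10, 0.40))  # -> 66.0 (to rounding)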

Explanation of Assumptions: Once the EMS-HAP process has been performed, the EPA would tox-weight the inventory by "weighting"
the emissions for each pollutant with the appropriate health risk criteria. This would be accomplished through a multi-step process.
Initially, pollutant-by-pollutant values would be obtained from the NEI for the current year and the baseline year (1990/93). Conversion of
actual tons for each pollutant, for the current year and the baseline year, to "toxicity-weighted" tons would be accomplished by multiplying
the tons by the appropriate values from the health criteria database, such as the unit risk estimate (URE) for lifetime cancer risk (defined at
http://www.epa.gov/ttn/atw/toxsource/summary.html). These toxicity-weighted values act as a surrogate for risk
and allow EPA to compare the toxicity-weighted values against a 1990/1993 baseline of toxicity-weighted values to determine the
percentage reduction in risk on an annual basis.
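The multi-step process above reduces to the arithmetic sketched below; the URE weights and tonnages are invented for illustration and are not values from the health criteria database.

    # Tox-weight each pollutant's tons by its unit risk estimate (URE),
    # then compare the current sum to the 1990/1993 baseline sum.
    URE = {"benzene": 7.8e-6, "formaldehyde": 1.3e-5}  # illustrative weights

    def toxicity_weighted_tons(tons_by_pollutant):
        return sum(tons * URE[p] for p, tons in tons_by_pollutant.items())

    baseline = {"benzene": 60_000.0, "formaldehyde": 40_000.0}  # 1990/93 (invented)
    current = {"benzene": 42_000.0, "formaldehyde": 30_000.0}   # current year (invented)

    reduction = 100.0 * (1 - toxicity_weighted_tons(current) / toxicity_weighted_tons(baseline))
    print(round(reduction, 1), "% cumulative reduction")  # -> 27.4 % cumulative reduction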

Information on EPA's Health Criteria Data for Risk Characterization (Health Criteria Data):
http://www.epa.gov/ttn/atw/toxsource/summary.html
Contents: Tabulated dose response values for long-term (chronic) inhalation and oral exposures; and values for short-term (acute)
inhalation exposure.
Audience: Public

Identification of Unit of Measure and Timeframe: Cumulative percentage reduction in tons of toxicity-weighted emissions, as a
surrogate for actual risk reduction to the public.
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Oversight of Final Reporting: OAQPS will update the actual toxicity-weighted emissions approximately every three years to coincide
with updated toxic inventories.

Timing of Results Reporting: Annually. NEI data are calculated every three years; in years when NEI data are not calculated, the annual
measure is reported based upon modeled results.
4b. Data Limitations/Qualifications
While emissions estimating techniques have improved over the years, broad assumptions about the behavior of sources and serious data
limitations still exist. The NTI and the NEI for HAPs contain data from other primary references. Because of the different data sources,
not all information in the NTI and the NEI for HAPs has been developed using identical methods. Also, for the same reason, there are
likely some geographic areas with more detail and accuracy than others.

The  1996 NTI and 1999 NEI for HAPs are a significant improvement over the baseline NTI because of the added facility-level detail (e.g.,
stack heights, latitude/longitude locations), making it more useful for dispersion model input.

For further discussion of the data limitations and the error estimates in the 1999 NEI for HAPs, please refer to the discussion of Information
Quality Guidelines in the documentation at: www.epa.gov/ttn/chief/net/index.html#haps99

The tables used in the EPA's Health Criteria Data for Risk Characterization (found at www.epa.gov/ttn/atw/toxsource/summary.html) are
compiled assessments from various sources for many of the 188 substances listed as hazardous air pollutants under the Clean Air Act
Amendments of 1990. Because different sources developed these assessments at different times, for purposes that were similar but not
identical, the results are not fully consistent. To resolve these discrepancies and ensure the validity of the data, EPA applied a priority
scheme consistent with EPA risk assessment guidelines and with the level of scientific peer review each assessment received. These risk
assessment guidelines can be found at http://www.epa.gov/risk/guidance.htm.


While the Agency has made every effort to utilize the best available science in selecting appropriate health criteria data for
toxicity-weighting calculations, there are inherent limitations and errors (uncertainties) associated with this type of data. Most of the
Agency's health criteria are derived from dose-response models and laboratory experiments involving animals. The parameter used to
convert from exposure to cancer risk (i.e., the Unit Risk Estimate, or URE) is based on default science policy processes used routinely in
EPA assessments. First, some air toxics are known to be carcinogens in animals but lack data in humans; these have been assumed to be
human carcinogens. Second, all the air toxics in this assessment were assumed to have linear relationships between exposure and the
probability of cancer (i.e., effects at low exposures were extrapolated from higher, measurable exposures by a straight line). Third, the
URE used for some air toxics compounds represents a maximum likelihood estimate, which might be taken to mean the best scientific
estimate. For other air toxics compounds, however, the URE used was an "upper bound" estimate, meaning that, if it is in error, it is more
likely to overestimate than to underestimate risk. For these upper bound estimates, it is assumed that the URE continues to apply even at
low exposures. It is likely, therefore, that this linear model over-predicts the risk at exposures encountered in the environment. The cancer
weighting values for this approach should be considered "upper bound" in the science policy sense.

All of the noncancer risk estimates have a built-in margin of safety. All of the Reference Concentrations (RfCs) used in toxicity-weighting
of noncancer effects are conservative, meaning that they represent exposures which probably do not result in any health effects, with a
margin of safety built into the RfC to account for sources of uncertainty and variability. Like the UREs used in cancer weighting, these
values are therefore considered "upper bound" in the science policy sense. Further details on limitations and uncertainties associated with
the Agency's health data can be found at: www.epa.gov/ttn/atw/nata/roy/page9.html#L10.
4c. Third-Party Audits
In 2004, the Office of the Inspector General (OIG) released a final evaluation report on "EPA's Method for Calculating Air Toxics
Emissions for Reporting Results Needs Improvement" (the report can be found at www.epa.gov/oig/reports/2004/20040331-2004-p-00012.pdf).
The report stated that although the methods used have improved substantially, unvalidated assumptions and other limitations underlying
the NTI continue to impact its use as a GPRA performance measure.  As a result of this evaluation and the OIG recommendations for
improvement, EPA prepared an action plan and is looking at ways to improve the accuracy and reliability of the data. EPA will meet
bi-annually with OIG to report on its progress in completing the activities as outlined in the action plan.

EPA staff, state and local agencies,  Tribes, industry and the public review the NTI and the NEI for HAPs. To assist in the review of the
1999 NEI for HAPs, the EPA provided a comparison of data from the three data sources (MACT/residual risk data, TRI, and state, local
and Tribal inventories) for each facility. For the 1999 NEI for HAPs, two periods were available for external review - October 2001 -
February 2002 and October 2002 - March 2003.  The final 1999 NEI was completed and posted on  the Agency website in the fall of 2003.

The EMS-HAP has been subjected to the scrutiny of leading scientists throughout the country in a process called "scientific peer review,"
which helps ensure that EPA uses the best available scientific methods and information. In 2001, EPA's Science Advisory Board (SAB)
reviewed the EMS-HAP model as part of the 1996 national-scale assessment. The review was generally supportive of the assessment
purpose, methods, and presentation; the committee considered the assessment an important step toward a better understanding of air
toxics. Additional information is available on the Internet: www.epa.gov/ttn/atw/nata/peer.html.

Record Last Updated: 02/13/2012 01:16:46 PM

-------
  Goal 1                                                Objective 2                                          Measure A01
  Measure Code : A01 - Annual emissions  of sulfur dioxide  (SO2) from electric power
  generation sources.
  Office of Air and Radiation (OAR)
   1. Measure and  DQR Metadata
   Goal Number and Title                           1 - Taking Action on Climate Change and Improving Air Quality
   Objective Number and Title                        2 - Improve Air Quality
   Sub-Objective Number and Title                    1 - Reduce Criteria Pollutants and Regional Haze
   Strategic Target Code and Title                   1 - By 2015, concentrations of ozone (smog) in monitored counties will decrease to .073 ppm
   Managing Office                                  Office of Atmospheric Programs
   Performance Measure Term Definitions
Emissions of SO2: Sulfur dioxide (also sulphur dioxide) is the chemical compound with the formula SO2.

Electric power generation sources: The Acid Rain Program, established under Title IV of the Clean Air Act Amendments of 1990, requires
major reductions in sulfur dioxide (SO2) and nitrogen oxide (NOx) emissions from the U.S. electric power generation industry.  The
program implements Title IV by continuing to measure, quality assure, and track emissions for SO2 and/or NOx from Continuous Emissions
Monitoring Systems (CEMS) or equivalent direct measurement methods at over 3,600 affected electric generation units in the U.S.


 2. Data Definition and Source Reporting
 2a. Original Data Source
 More than 3,400 fossil fuel-fired utility units affected under the Title IV Acid Rain Program collect hourly measurements of SO2, NOx,
 volumetric flow, CO2, and other emission-related parameters using certified continuous emission monitoring systems (CEMS) or equivalent
 continuous monitoring methods.

 For a description of EPA's Acid Rain Program, see the program's website at http://www.epa.gov/acidrain/index.html, and the electronic
 Code of Federal Regulations at http://www.epa.gov/docs/epacfr40/chapt-I.info/subch-C.html (40 CFR parts 72-78).
 2b. Source Data Collection
 Source Data Collection Methods: Field monitoring using certified continuous emission monitoring systems (CEMS) or equivalent
 continuous monitoring methods, collected hourly.

 EPA QA requirements/guidance governing collection:  Promulgated QA/QC requirements dictate performing a series of quality assurance
 tests of CEMS performance. For these tests, emissions data are collected under highly structured, carefully designed testing conditions,
which involve either high quality standard reference materials or multiple instruments performing simultaneous emission measurements.
The resulting data are screened and analyzed using a battery of statistical procedures, including one that tests for systematic bias.  If a CEM
fails the bias test, indicating a potential for systematic underestimation of emissions, the source of the error must be identified and corrected
or the data are adjusted to minimize the bias. Each affected plant is required to maintain a written QA plan documenting performance of
these procedures and tests.

The Emissions Tracking System (ETS) provides instant feedback to sources on data reporting problems, format errors, and inconsistencies.
The electronic data file QA checks are described at http://www.epa.gov/airmarkets/business/report-emissions.html.

Geographical Extent of Source Data: National

Spatial Detail Covered By the Source Data: Spatial detail for SO2 emissions can be obtained at the following website:
http://camddataandmaps.epa.gov/gdm/index.cfm?fuseaction=emissions.wizard This website allows access to current and historical
emissions data via Quick Reports.  Annual,  quarterly, monthly, daily and hourly data are available at the unit level and the monitoring
location level.
2c. Source Data Reporting
Form/mechanism for receiving data and entering into EPA system: Beginning with the first quarter of 2009, and quarterly thereafter,
all industry sources regulated under the Acid Rain and Clean Air Interstate Rule (CAIR) programs are required to use the Emissions
Collection and Monitoring Plan System (ECMPS) to submit their monitoring plan, QA/cert test, and emissions data to the EPA.

The new XML file format allows the data to be organized based on dates and hours instead of pollutant type.

See also the ECMPS Reporting Instructions Emissions document:
http://www.epa.gov/airmarkets/business/ecmps/docs/ECMPSEMRI2009Q2.pdf

Timing and frequency of reporting: Emissions data are submitted to the ECMPS and represent hourly values for measured parameters,
calculated hourly emissions values, instrument calibration data, and aggregated summary data. An emissions file contains one calendar
quarter of hourly and aggregate emissions measurements for a specified unit or group of related units, including stacks and pipes.

Each unit that is required to submit emissions data for a particular calendar quarter must be included in one and only one emissions file for
that quarter. Each emissions file should contain all relevant operating, daily quality assurance, and emissions data for all units, common
stacks, multiple stacks, or common pipes that were in a common monitoring configuration for any part of the quarter.

Sources must submit an emissions file for each quarter or, for ozone-season-only reporters, for the second and third calendar quarters of
each year.
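
The "one and only one emissions file per unit per quarter" constraint is easy to state as a validation check. The following Python sketch
illustrates it; the record layout (a mapping of file identifiers to sets of unit identifiers) is assumed for illustration and is not the actual
ECMPS schema.

    from collections import Counter

    def units_in_multiple_files(files):
        """files: mapping of emissions-file id -> set of unit ids for one
        quarter; returns units that violate the one-file-per-quarter rule."""
        counts = Counter(unit for units in files.values() for unit in units)
        return sorted(unit for unit, n in counts.items() if n > 1)

    # Example with placeholder identifiers: "unit2" appears in two files.
    print(units_in_multiple_files({"fileA": {"unit1", "unit2"},
                                   "fileB": {"unit2", "unit3"}}))  # ['unit2']
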
3.  Information Systems and Data Quality Procedures
3a. Information Systems
Emissions Tracking System (ETS) /
Emissions Collection and Monitoring Plan System (ECMPS)

Additional information:
EPA's Clean Air Markets Division (CAMD) has undertaken a project to re-engineer the process and data systems associated with
emissions, monitoring plan, and certification data. As part of the project, CAMD reviewed how monitoring plan information, certification/
recertification applications, on-going quality assurance data, and emissions data are maintained, quality assured and submitted. CAMD also
reviewed the tools available for checking and submitting data on a quarterly and ozone season basis. Once the review was complete,
CAMD developed a number of goals for the ECMPS project. They include:
•      Creating a single client tool for all users to check and submit data.
•      Providing users with the ability to quality assure data prior to submission.
•      Providing users with one set of feedback.
•      Allowing for seamless updates to the client tool.
•      Providing direct access to EPA's database through the client tool.
•      Maintaining select data outside of the electronic data report.
•      Creating a new XML file format.
•      Developing new security requirements.

Adding flexibility to the process is one of the main reasons for changing how monitoring and emissions data are quality assured and
submitted. There are several changes to the process that will involve adding flexibility:
•      Monitoring plans will no longer be required as part of the quarterly file.

•      On-going quality assurance test data may be submitted after the tests are performed—users will not have to wait to submit the data
as part of a quarterly report.

[Source: http://www.epa.gov/airmarkets/business/ecmps/index.html]

The ECMPS contains source data.

The ECMPS meets relevant EPA standards for information system integrity.
3b. Data Quality Procedures
EPA analyzes all quarterly reports to detect deficiencies and to identify reports that must be resubmitted to correct problems. EPA also
identifies reports that were not submitted by the appropriate reporting deadline. Revised quarterly reports, with corrected deficiencies found
during the data review process, must be obtained from sources by a specified deadline. All data are reviewed, and preliminary and final
emissions data reports are prepared for public release and compliance determination.

For a review of the ETS data audit process, see:
http://www.epa.gov/airmarkets/presentations/docs/epri06/epri_electronic_audit_revised.ppt.
3c. Data Oversight
Branch Chief, Emissions Monitoring Branch is responsible for source data reporting.

Branch Chief, Market Operations Branch is responsible for the information systems utilized in producing the performance result.

3d. Calculation Methodology
Definition of variables: The ECMPS Reporting Instructions Emissions document at
http://www.epa.gov/airmarkets/business/ecmps/docs/ECMPSEMRI2009Q2.pdf is the data dictionary for the ECMPS.

Explanation of Calculations: Promulgated methods are used to aggregate emissions data across all U.S. utilities for each pollutant and for
related source operating parameters such as heat inputs. The ECMPS Reporting Instructions Emissions document at
http://www.epa.gov/airmarkets/business/ecmps/docs/ECMPSEMRI2009Q2.pdf provides the methods used for this aggregation.

Unit of analysis: Tons of emissions
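
As an illustration of the aggregation step, the Python sketch below sums unit-level quarterly SO2 mass to annual national totals. The
record layout is assumed for illustration only; actual submissions follow the ECMPS Reporting Instructions cited above.

    from collections import defaultdict

    def national_annual_tons(records):
        """records: iterable of (unit_id, year, quarter, so2_tons) quarterly
        totals; returns a mapping of year -> national tons of SO2."""
        totals = defaultdict(float)
        for _unit, year, _quarter, tons in records:
            totals[year] += tons
        return dict(totals)

    # Example with placeholder units and tonnages.
    print(national_annual_tons([("u1", 2011, 1, 120.5), ("u1", 2011, 2, 98.0),
                                ("u2", 2011, 1, 210.0)]))  # {2011: 428.5}
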
4.  Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Branch Chief, Assessment and Communications Branch, oversees final reporting by the National Program Office.
4b. Data Limitations/Qualifications
None
4c. Third-Party Audits
In July of 2010, the Quality Staff of the Office of Environmental Information completed a Quality System Assessment (QSA) for the Office
of Atmospheric Programs. The results of the assessment were summarized as follows: "Please note that there are no findings requiring
corrective action. Review of QA requirements and interviews with management and staff revealed no weaknesses in the overall Quality
System management for OAP. Controls appear to be in place, the QA structure appears effective, there is project-level planning QA
documentation (QAPPs, QARFs) in place as well as the appropriate training and records management practices".
Record Last Updated: 02/13/2012 01:16:47 PM

-------
  Goal 1                                               Objective 2                                          Measure M9
  Measure Code :  M9 - Cumulative percentage reduction in population-weighted
  ambient concentration of ozone in monitored counties from 2003 baseline.
  Office of Air and Radiation (OAR)
   1. Measure and DQR Metadata
   Goal Number and Title                          1 - Taking Action on Climate Change and Improving Air Quality
   Objective Number and Title                        2 - Improve Air Quality
   Sub-Objective Number and Title                    1 - Reduce Criteria Pollutants and Regional Haze
   Strategic Target Code and Title                   1 - By 2015, concentrations of ozone (smog) in monitored counties will decrease to .073 ppm
   Managing Office                                 Office of Air Quality Planning and Standards
   Performance Measure Term Definitions
Population-weighted: Ambient concentrations are multiplied (weighted) by the number of people living in the county where the monitor is
located. The population estimates are from the U.S. Census Bureau (2000 decennial census).

Ambient concentration: EPA tracks improvements in air quality on an annual basis by measuring the change in ambient air quality
concentrations of 8-hour ozone in counties with monitoring data weighted by the number of people living in these counties. This measure
makes use of actual, observed changes in ambient ozone levels over time to determine NAAQS program effectiveness. Three-year averages
of the 4th-highest daily maximum ozone values (i.e., design values) are used to help mitigate the influence of meteorology, which would
otherwise confound measurement of actual program progress. Ambient air is the air we breathe, as distinct from air emitted directly by a
pollution source; ambient concentrations are measured at monitors.

Ozone: Ozone (O3) is a gas composed of three oxygen atoms. It is not usually emitted directly into the air, but at ground-level is created by a
chemical reaction between oxides of nitrogen (NOx) and volatile organic compounds (VOC) in the presence of sunlight. Ozone has the
same chemical structure whether it occurs miles above the earth or at ground-level and can be "good" or "bad," depending on its location in
the atmosphere.

Monitored counties: Calculate 8-hour ozone design values for 2001-2003 for every county with adequate monitoring data. A monitoring
site's design value for 8-hour ozone is expressed as the average of the fourth-highest daily maximum 8-hour average ozone concentration
for each of three consecutive years. A county's design value is the highest of these site-level design values. The national ozone monitoring
network conforms to uniform criteria for monitor siting, instrumentation, and quality assurance.


 2. Data Definition  and Source Reporting	
2a. Original Data Source
State and local agency data are from State and Local Air Monitoring Stations (SLAMS).  Population data are from the Census
Bureau/Department of Commerce (2000 Census)	
2b. Source Data Collection
Source Data Collection Methods: Field monitoring; survey (2000 Census)

Date/time intervals covered by source data: 2003 to present (for air pollution data). 2000 (for census data)

EPA QA requirements/guidance governing collection: To ensure quality data, the SLAMS are required to meet the following: 1) each site
must meet network design and site criteria; 2) each site must provide adequate QA assessment, control, and corrective action functions
according to minimum program requirements; 3) all sampling methods and equipment must meet EPA reference or equivalent
requirements; 4) acceptable data validation and record keeping procedures must be followed;  and 5) data from SLAMS must be
summarized and reported annually to EPA. Finally, there are system audits that regularly review the overall air quality data collection
activity for any needed changes or corrections. Further information is available on the Internet at
http://www.epa.gov/cludygxb/programs/namslam.html and through United States EPA's Quality Assurance Handbook (EPA-454/R-98-004
Section 15).


Geographical Extent of Source Data: National

Spatial Detail Covered By the Source Data: State, Local and Tribal air pollution control agencies	
2c. Source Data Reporting
State, Local and Tribal air pollution control agencies submit data within 30 days after the end of each calendar quarter. The data can be
submitted in one of three different formats and are submitted using an Exchange Network Node or the agency's Central Data Exchange web
interface. The submitted data are then quality assured and loaded into the AQS database.
3.  Information Systems and  Data Quality Procedures
3a. Information Systems
The Air Quality Subsystem (AQS) stores ambient air quality data used to evaluate an area's air quality levels relative to the National
Ambient Air Quality Standards (NAAQS).

AQS has been enhanced to comply with the Agency's data standards (e.g.,  latitude/longitude, chemical nomenclature).

AQS stores the as-submitted source data and data that are aggregated to the daily, monthly, quarterly and annual values by the system.

3b. Data Quality Procedures
AQS: The QA/QC of the national air monitoring program has several major components: the Data Quality Objective (DQO) process, the
reference and equivalent methods program, EPA's National Performance Audit Program (NPAP), system audits, and network reviews.
Please see www.epa.gov/ttn/amtic/npaplist.html for more information.

Under NPAP, all agencies required to report gaseous criteria pollutant data from their ambient air monitoring stations to EPA's Air Quality
System (AQS) for comparison to the National Ambient Air Quality Standards (NAAQS) must participate in EPA's NPAP through-the-probe
(TTP) program. Guidance for participating in this program requires NPAP audits of at least 20% of a Primary Quality Assurance
Organization's (PQAO's) sites each year, and of all sites within 5 years.
3c. Data Oversight
Team Member, Central Operations and Resources Staff, OAQPS
3d. Calculation Methodology
Decision Rules for Selecting Data:
All available air quality measurement data are included in the Design Value calculations, except as indicated below:
1.      Individual measurements that are flagged as exceedances caused by "Exceptional Events" (as defined in 40 CFR Part 50.14), and
whose flags are concurred with by the EPA Regional Office, are excluded.

Definitions of Variables:
For each AQS monitor, the following variables are calculated:


8-Hour Average: Arithmetic mean of eight consecutive hourly measurements, with the time for the average defined to be the begin hour.
(There will be 24 8-hour averages for each day.) Missing values (measurements for a specific hour) are handled as follows: if there are
fewer than 6 measurements in the 8-hour period, one-half of the Method Detection Limit for the method is used in place of each missing
value.

Daily Maximum:  The maximum 8-hour average for the calendar day.

Annual 4th Maximum: The fourth highest daily maximum for the year.

Three-Year Design Value:  The average of the annual 4th maxima for the three year period.
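
Taken together, these definitions amount to a small computation. The following minimal Python sketch illustrates it under simplifying
assumptions: 8-hour windows are confined to a single calendar day (17 of the 24 begin-hours; the actual procedure lets windows run into
the next day), the data-completeness screening used in official processing is omitted, and the Method Detection Limit (MDL) value is a
placeholder.

    MDL = 0.005  # placeholder Method Detection Limit, ppm (assumed value)

    def eight_hour_average(window):
        """Mean of 8 consecutive hourly values (None = missing); missing hours
        are replaced by one-half the MDL, per the handling described above."""
        filled = [v if v is not None else 0.5 * MDL for v in window]
        return sum(filled) / len(filled)

    def annual_fourth_max(days):
        """days: one year of 24-value hourly lists; returns the 4th-highest
        daily maximum 8-hour average."""
        daily_maxima = []
        for day in days:
            averages = [eight_hour_average(day[h:h + 8]) for h in range(17)]
            daily_maxima.append(max(averages))
        return sorted(daily_maxima, reverse=True)[3]

    def design_value(three_years):
        """Average of the annual 4th maxima over three consecutive years."""
        return sum(annual_fourth_max(year) for year in three_years) / 3.0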

Explanation of Calculations: Air quality levels are evaluated relative to the baseline level and the design value. The change in air quality
concentrations is then multiplied by the number of people living in the county.

Explanation of Assumptions:  Design values are  calculated for every county with adequate monitoring data. The design value is the
mathematically determined pollutant concentration at a particular site that must be reduced to, or maintained at or below the National
Ambient Air Quality  Standards (NAAQS) in order to assure attainment. The design value may be calculated based on ambient
measurements observed at a local monitor in a 3-year period or on model estimates. The design value varies from year to year due to both
the pollutant emissions and natural variability such as meteorological conditions, wildfires, dust storms, volcanic activity, etc. For more
information on design values, including a definition, see www.epa.gov/ttn/oarpg/t1/memoranda/cdv.pdf. This analysis assumes that the
populations of the areas are held constant at 2000 Census levels. Data comparisons over several years allow assessment of the air
program's success.

Unit of analysis: Cumulative percent reduction in population-weighted ambient concentration
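
A minimal Python sketch of the population-weighting step follows, assuming dictionaries keyed by county; all county names, design
values, and populations below are hypothetical placeholders, and populations are held at 2000 Census levels per the assumption above.

    def pct_reduction(baseline_dv, current_dv, population):
        """Cumulative percent reduction in the population-weighted ambient
        concentration relative to the baseline period."""
        counties = baseline_dv.keys() & current_dv.keys() & population.keys()
        base = sum(baseline_dv[c] * population[c] for c in counties)
        curr = sum(current_dv[c] * population[c] for c in counties)
        return 100.0 * (base - curr) / base

    # Example: two counties, baseline vs. current design values in ppm.
    print(pct_reduction({"A": 0.090, "B": 0.085}, {"A": 0.080, "B": 0.082},
                        {"A": 500000, "B": 250000}))  # ~8.7 (percent)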


4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting
Director, Central Operations and Resources Staff, OAQPS	
4b. Data Limitations/Qualifications
There is uncertainty in the projections and near term variations in air quality (due to meteorological conditions, for example).	
4c. Third-Party Audits
2008 OIG system audit; 2010 System Risk Assessment
 Record Last Updated: 02/13/2012 01:16:48 PM

-------
  Goal 1                                              Objective 2                                        Measure M91
  Measure Code : M91 - Cumulative percentage reduction in population-weighted
  ambient concentration of fine particulate matter (PM-2.5) in all monitored counties
  from 2003  baseline.
  Office of Air and Radiation (OAR)
   1. Measure and DQR Metadata	
   Goal Number and Title                         1 - Taking Action on Climate Change and Improving Air Quality
   Objective Number and Title                        2 - Improve Air Quality
   Sub-Objective Number and Title                    1 - Reduce Criteria Pollutants and Regional Haze
   Strategic Target Code and Title                   2 - By 2015, concentrations of inhalable fine particles in monitored counties will decrease to 10.5 µg/m3
   Managing Office                                Office of Air Quality Planning and Standards
   Performance Measure Term Definitions
Population-weighted: The ambient concentration multiplied by total county population, using constant population values for all years.

Ambient concentration: The highest reported site-level annual standard design value; i.e., the 3-year average of the annual mean 24-hour
average concentrations of PM-2.5.

Fine particulate matter (PM 2.5): Particles with a diameter of 2.5 microns or less.

Monitored counties: The counties in the current time-frame with at least one site meeting completeness criteria that also were present in the
base period (i.e., contained at least one complete site in the period 2001-2003).


 2. Data  Definition and Source Reporting
 2a. Original Data Source
 State and local agency data are from State and Local Air Monitoring Stations (SLAMS). Population data are from the Census
 Bureau/Department of Commerce (2000 Census)	
 2b. Source Data Collection
 Source Data Collection Methods: Field monitoring; survey (2000 Census)

 Date/Time Intervals Covered by Source Data: 2003 to present (for air pollution data).  2000 (for census data)

 EPA QA Requirements/Guidance Governing Collection: To ensure quality data, the SLAMS are required to meet the following: 1) each
site must meet network design and site criteria; 2) each site must provide adequate QA assessment, control, and corrective action functions
according to minimum program requirements; 3) all sampling methods and equipment must meet EPA reference or equivalent
requirements; 4) acceptable data validation and record keeping procedures must be followed; and 5) data from SLAMS must be
summarized and reported annually to EPA. Finally, there are system audits that regularly review the overall air quality data collection
activity for any needed changes or corrections. Further information is available on the Internet at
http://www.epa.gov/cludygxb/programs/namslam.html and through United States EPA's Quality Assurance Handbook (EPA-454/R-98-004
Section 15).

Geographical Extent of Source Data: National

Spatial Detail Covered By the Source Data: 437 counties in the 48 continental States  plus D.C.
2c. Source Data Reporting
Agencies submit air quality data to AQS through the Agency's Central Data Exchange (CDX).
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
The Air Quality Subsystem (AQS) stores ambient air quality data used to evaluate an area's air quality levels relative to the National
Ambient Air Quality Standards (NAAQS).

AQS has been enhanced to comply with the Agency's data standards (e.g.,  latitude/longitude, chemical nomenclature).

All annual mean concentration data used in the performance analysis were extracted from the AQS. Population data were obtained from
the Bureau of the Census.

Additional information:
In January 2002, EPA completed the reengineering of AQS to make it a more user-friendly, Windows-based system. As a result, air quality
data are more easily accessible via the Internet.

Beginning in July 2003, agencies submitted air quality data to AQS through the Agency's Central Data Exchange (CDX). CDX is intended
to be the portal through which all environmental data coming to or leaving the Agency will pass.
3b. Data Quality Procedures                                                                                                   |
The AQS QA/QC process also involves participation in the EPA's National Performance Audit Program (NPAP), system audits, and
network reviews. Please see www.epa.gov/ttn/amtic/npaplist.html for more information. Under NPAP, all agencies required to report gaseous
criteria pollutant data from their ambient air monitoring stations to EPA's Air Quality System (AQS) for comparison to the National Ambient Air
Quality Standard (NAAQS) are required to participate in EPA's NPAP TTP program. Guidance for participating in this program requires NPAP audits
of at least 20% of a Primary Quality Assurance Organization's (PQAO's) sites each year; and all sites in 5 years.	
3c. Data Oversight
National Air Data Group [Outreach and Information Division, OAQPS] oversees operations of the Air Quality System, the database used to
store and deliver the source data.

Air Quality Monitoring Group [Air Quality Assessment Division (AQAD), OAQPS] oversees the monitoring and quality assurance of the
source data.

Air Quality Analysis Group (AQAG) [AQAD, OAQPS] oversees the transformation and data reporting aspects associated with the
calculation of this performance measure.
3d. Calculation Methodology
Explanation of Calculations: Air quality levels are evaluated relative to the baseline level and the design value. The change in air quality
concentrations is then multiplied by the number of people living in the county.

Explanation of Assumptions: Design values are calculated for every county with adequate monitoring data. The design value is the
mathematically determined pollutant concentration at a particular site that must be reduced to, or maintained at or below the National
Ambient Air Quality Standards (NAAQS) in order to assure attainment. The design value may be calculated based on ambient
measurements observed at a local monitor in a 3-year period or on model estimates. The design value varies from year to year due to both
the pollutant emissions and natural variability such as meteorological conditions, wildfires, dust storms, volcanic activity, etc. For more
information on design values, including a definition, see www.epa.gov/ttn/oarpg/t1/memoranda/cdv.pdf. This analysis assumes that the
populations of the areas are held constant at 2000 Census levels. Data comparisons over several years allow assessment of the air
program's success.

Unit of analysis: Cumulative percent reduction in population-weighted ambient concentration
4.  Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Air Quality Assessment Group, OAQPS, OAR is directly responsible for the calculations associated with this performance measure.
4b. Data Limitations/Qualifications
There is uncertainty in the projections and near term variations in air quality (due to meteorological conditions, for example).	
4c. Third-Party Audits
Design Values used in this performance measure are vetted with the State and Local data reporting agencies.
 Record Last Updated: 02/13/2012 01:16:48 PM
-------
  Goal 1                                                Objective 2                                          Measure O34
  Measure Code :  O34 - Cumulative millions of tons of Nitrogen Oxides (NOx)
  reduced since 2000 from mobile sources
  Office of Air and Radiation (OAR)
   1. Measure and DQR Metadata
   Goal Number and Title                          1 - Taking Action on Climate Change and Improving Air Quality
   Objective Number and Title                        2 - Improve Air Quality
   Sub-Objective Number and Title                    1 - Reduce Criteria Pollutants and Regional Haze
   Strategic Target Code and Title                   3 - By 2015, reduce emissions of nitrogen oxides (NOx)
   Managing Office                                 Office of Transportation and Air Quality
   Performance Measure Term Definitions
Mobile sources: Includes onroad cars/trucks, nonroad engines such as farm/construction, locomotives, commercial marine and aircraft.
Nitrogen oxide: NO2 (nitrogen dioxide) is a combustion product formed from the reaction of nitrogen (in the ambient air) with fuel
(gasoline, diesel fuel, or, for stationary sources, coal), as defined by the EPA National Ambient Air Quality Standard and measurement
methods.


 2. Data Definition and Source Reporting	
 2a. Original Data Source	
Estimates for on-road and off-road mobile source emissions are built from data fed into the relevant emissions models.

 Data for the models are from many sources, including Vehicle Miles Traveled (VMT) estimates by state (Federal Highway
 Administration), the mix of VMT by type of vehicle (Federal Highway Administration), temperature, gasoline properties, and the designs
 of Inspection/Maintenance (I/M) programs.
 2b. Source Data Collection
 Source Data Collection Methods: Emission tests for engines/vehicles come from EPA, other government agencies (including state/local
 governments), academic institutions and industry. The data come from actual emission tests measuring HC (HydroCarbon), CO (Carbon
Monoxide), NOx (Nitrogen Oxides), and PM (Particulate Matter). It is important to note that total oxides of nitrogen (NO and NO2) are
measured together, with emission standards applying to the sum of both oxides. Usage surveys for vehicle miles traveled are obtained from
DOT surveys, and fuel usage for nonroad vehicles/engines is obtained from a variety of sources such as DOE.

 Geographical Extent of Source Data: National

 Spatial Detail  Covered By the Source Data: County
2c. Source Data Reporting
Form/mechanism for receiving data and entering into EPA system: EPA develops and receives emission data on a g/mile or g/unit
work (or unit fuel consumed) basis.

Timing and frequency of reporting: The inputs to MOVES/MOBILE6 and NONROAD 2008 and other models are reviewed and
updated, sometimes on an annual basis for some parameters. Generally, Vehicle Miles Traveled (VMT), the mix of VMT by type of vehicle
(Federal Highway Administration (FHWA) types), temperature, gasoline properties, and the designs of Inspection/Maintenance (I/M)
programs are updated each year.

  Emission factors for all mobile sources and activity estimates for non-road sources are revised at the time EPA's Office of Transportation
  and Air Quality provides new information.

Updates to the inputs to the models mean the emissions inventories will change.
  3. Information Systems and Data Quality Procedures
  3a.  Information Systems
  National Emissions Inventory Database. Obtained by modeling runs using MOBILE/MOVES, NONROAD, and other models.

Please see http://www.epa.gov/ttn/chief/trends/ for a summary of national emission inventories and how the numbers are obtained in
general.

  The emission inventory contains source test data as well as usage information compiled from other sources. Also, for consistency from
  year to year and to provide a baseline over time, the emission inventories are updated for these performance measures only when it is
  essential to do so.  The source data (emissions and usage) are "transformed" into emission inventories.

  The models and input undergo peer review receiving scientific input from a variety of sources including academic institutions and public
  comments.


3b. Data Quality Procedures
The emissions inventories are reviewed by both internal and external parties, including the states, localities, and industries. EPA works with
all of these parties in these reviews. Also, EPA reviews the inventories by comparing them to others derived in earlier years to assure that
changes in inputs produce reasonable changes in the inventories themselves.
3c. Data Oversight
  EPA emission inventories for the performance measures are reviewed by various OTAQ Center directors in the Assessment and Standards
  Division. The Center Directors are responsible for vehicle, engine, fuel, and modeling data used in various EPA programs.
3d. Calculation Methodology	
Explanation of the Calculations:

EPA uses models to estimate mobile source emissions, for both past and future years.  The emission inventory estimate is detailed down to
the county level and with over 30 line items representing mobile sources.

The MOVES (Motor Vehicle Emission Simulator) model, which replaced the earlier MOBILE6 vehicle emission factor model, is a software
tool for predicting gram-per-mile emissions of hydrocarbons, carbon monoxide, oxides of nitrogen, carbon dioxide, particulate matter, and
toxics from cars, trucks, and motorcycles under various conditions. Inputs to the model include fleet composition, activity, temporal
information, and control program characteristics. For more information on the MOBILE6 model, please visit
http://www.epa.gov/otaq/m6.htm.

The NONROAD 2008 emission inventory model, which replaced an earlier version of NONROAD, is a software tool for predicting
emissions of hydrocarbons, carbon monoxide, oxides of nitrogen, particulate matter, and sulfur dioxide from small and large off-road
vehicles, equipment, and engines. Inputs to the model include fleet composition, activity, and temporal information. For more information
on the NONROAD model, please visit http://www.epa.gov/oms/nonrdmdl.htm.

Over the years, improved emission and usage data have led to updated emission inventories more consistent with air quality  data.

Additional information:
To keep pace with new analysis needs, new modeling approaches, and new data, EPA is currently working on a new modeling system
termed the Multi-scale Motor Vehicles and Equipment Emission System (MOVES). This new system will estimate emissions for on road
and off road sources, cover a broad range of pollutants, and allow multiple scale analysis, from fine scale analysis to national inventory
estimation.  When fully implemented, MOVES will serve as the replacement for MOBILE6 and NONROAD. The new system will not
necessarily be a single piece of software, but instead will encompass the necessary tools, algorithms,  underlying data and guidance
necessary for use in all official analyses associated with regulatory development, compliance with statutory requirements, and
national/regional inventory projections. Additional information is available on the Internet at http://www.epa.gov/otaq/ngm.htm

Unit of analysis: tons of emissions, vehicle miles traveled, and hours (or fuel) used
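
One plausible reading of the cumulative accounting behind this measure is sketched below in Python, with a hypothetical year-to-tons
mapping; the modeled inventories themselves come from MOVES/MOBILE6 and NONROAD, not from this arithmetic, and the tonnage
values are placeholders.

    def cumulative_reduction(inventory, base_year=2000):
        """inventory: year -> national mobile-source NOx tons (modeled);
        returns the sum over post-baseline years of (baseline - that year)."""
        base = inventory[base_year]
        return sum(base - tons for year, tons in inventory.items()
                   if year > base_year)

    # Placeholder inventory: 0.4M tons avoided in 2001 plus 0.9M in 2002.
    print(cumulative_reduction({2000: 10.0e6, 2001: 9.6e6, 2002: 9.1e6}))  # 1.3e6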


4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The Director of the Health Effects, Toxics and Benefits Center, the Director of the Air Quality and Modeling Center, and the Associate
Director of the Assessment and Standards Division are ultimately responsible for the performance measures. These individuals, as well as
the other Center Directors, are responsible for assuring that the emission inventory and reduction numbers used in EPA regulatory and
other programs are accurate and have received extensive academic, public, and other review.
4b. Data Limitations/Qualifications
The limitations of the inventory estimates for mobile sources come from limitations in the modeled emission factors (based on emission
factor testing and models predicting overall fleet emission factors in g/mile) and also in the estimated vehicle miles traveled for each
vehicle class (derived from Department of Transportation data).

For nonroad emissions, the estimates come from a model using equipment populations, emission factors per hour or unit of work, and an
estimate of usage. This nonroad emissions model accounts for over 200 types of nonroad equipment. Any limitations in the input data will
carry over into limitations in the emission inventory estimates.

Additional information about data integrity for the MOVES/MOBILE6 and NONROAD  models is available on the Internet at
http://www.epa.gov/otaq/m6.htm and http://www.epa.gov/oms/nonrdmdl.htm, respectively.

When the method for estimating emissions changes significantly, older estimates of emissions in years prior to the most recent year are
usually revised, when possible, to be consistent with the new methodology and to avoid a sudden discontinuity in the apparent emissions
trend.

Methods for estimating emission inventories are frequently updated to reflect the most up-to-date inputs and assumptions. Past emission
estimates that inform our performance measures frequently do not keep pace with the changing inventories associated with more recent
EPA rulemakings. EPA developed the initial numbers for these performance measures in 2002, making both current- and future-year
projections for on-road and nonroad sources. The emission estimates have been updated numerous times since then for rulemaking
packages and will be updated for these performance measures.
4c. Third-Party Audits
All of the inputs for the models, the models themselves, and the resultant emission inventories are reviewed as appropriate by academic
experts and by state and local governments, which use some of this information for their State Implementation Plans to meet the National
Ambient Air Quality Standards.
 Record Last Updated: 02/13/2012 01:16:48 PM

-------
  Goal 1                                                Objective 2                                          Measure P34
  Measure Code :  P34 - Cumulative tons of PM-2.5 reduced since 2000 from mobile
  sources
  Office of Air and Radiation (OAR)
   1. Measure and DQR Metadata
   Goal Number and Title                          1 - Taking Action on Climate Change and Improving Air Quality
   Objective Number and Title                        2 - Improve Air Quality
   Sub-Objective Number and Title                    1 - Reduce Criteria Pollutants and Regional Haze
   Strategic Target Code and Title                   5 - By 2015, reduce emissions of direct particulate matter (PM)
   Managing Office                                 Office of Transportation and Air Quality
   Performance Measure Term Definitions
Mobile sources: Includes onroad cars/trucks, nonroad engines such as farms/construction, locomotives, commercial marine, and aircraft.

Particulate matter (PM-2.5): Solid material 2.5 microns or smaller as defined by the EPA National Ambient Air Quality Standard and
measurement methods.


 2. Data Definition and Source Reporting	
 2a. Original Data Source	
Estimates for on-road and off-road mobile source emissions are built from data fed into the relevant emissions models.

 Data for the models are from many sources, including Vehicle Miles Traveled (VMT) estimates by state (Federal Highway
 Administration), the mix of VMT by type of vehicle (Federal Highway Administration), temperature, gasoline properties, and the designs
 of Inspection/Maintenance (I/M) programs. Usage data for nonroad comes largely from fuel consumption information from DOE.
 2b. Source Data Collection
 Source Data Collection Methods: Emission tests for engines/vehicles come from EPA, other government agencies (including state/local
governments), academic institutions, and industry. The data come from actual emission tests measuring HC, CO, NOx, and PM emissions.
Usage surveys for vehicle miles traveled are obtained from DOT surveys, and fuel usage for nonroad vehicles/engines is obtained from a
variety of sources such as DOE.

 Geographical Extent of Source Data: National and state level

 Spatial Detail Covered By the Source Data: County level data
 2c. Source Data Reporting
Form/mechanism for receiving data and entering into EPA system: EPA develops and receives emission data on a g/mile or g/unit work
(or unit fuel consumed) basis.

Timing and frequency of reporting: The inputs to MOVES/MOBILE6 and NONROAD 2008 and other models are reviewed and
updated, sometimes on an annual basis for some parameters. Generally, Vehicle Miles Traveled (VMT), the mix of VMT by type of vehicle
(Federal Highway Administration (FHWA) types), temperature, gasoline properties, and the designs of Inspection/Maintenance (I/M)
programs are updated each year.

Emission factors for all mobile sources and activity estimates for non-road sources are revised at the time EPA's Office of Transportation
and Air Quality provides  new information.

Updates to the inputs to the models mean the emissions inventories will change.


3. Information Systems and Data  Quality Procedures	
3a. Information Systems
National Emissions Inventory Database. Obtained by modeling runs using MOBILE/MOVES, NONROAD, and other models.

Please see: http://www.epa.gov/ttn/chief/trends/ for a summary of national emission inventories and how the numbers are obtained in
general.

The emission inventory contains source test data as well as usage information compiled from other sources. Also, for consistency from
year to year and to provide a baseline over time, the emission inventories are updated for these performance measures only when it is
essential to do so. The source data (emissions and usage) are "transformed" into emission inventories.

The models and input undergo peer review receiving scientific input from a variety of sources including academic institutions and public
comments.
3b. Data Quality Procedures
The emissions inventories are reviewed by both internal and external parties, including the states, localities, and industries. EPA works with
all of these parties in these reviews. Also, EPA reviews the inventories by comparing them to others derived in earlier years to assure that
changes in inputs produce reasonable changes in the inventories themselves.
3c. Data Oversight
EPA emission inventories for the performance measure are reviewed by various OTAQ Center Directors in the Assessment and Standards
Division. The Center Directors are responsible for vehicle,  engine, fuel, and modeling data used in various EPA programs.

3d. Calculation Methodology
Explanation of the Calculations:

EPA uses models to estimate mobile source emissions, for both past and future years. The emission inventory estimate is detailed down to
the county level and with over 30 line items representing mobile sources.

The MOVES (Motor Vehicle Emission Simulator) model, which replaced the earlier MOBILE6 vehicle emission factor model, is a software
tool for predicting gram-per-mile emissions of hydrocarbons, carbon monoxide, oxides of nitrogen, carbon dioxide, particulate matter, and
toxics from cars, trucks, and motorcycles under various conditions. Inputs to the model include fleet composition, activity, temporal
information, and control program characteristics. For more information on the MOBILE6 model, please visit
http://www.epa.gov/otaq/m6.htm.

The NONROAD 2008 emission inventory model, which replaced earlier versions of NONROAD, is a software tool for predicting
emissions of hydrocarbons, carbon monoxide, oxides of nitrogen, particulate matter, and sulfur dioxide from small and large off-road
vehicles, equipment, and engines. Inputs to the model include fleet composition, activity, and temporal information. For more information
on the NONROAD model, please visit http://www.epa.gov/oms/nonrdmdl.htm.

Additional information:
To keep pace with new analysis needs, new modeling approaches, and new data, EPA is currently working on a new modeling system
termed the Multi-scale Motor Vehicles and Equipment Emission System (MOVES). This new system will estimate emissions for on road
and off road sources, cover a broad range of pollutants, and allow multiple scale analysis, from fine scale analysis to national inventory
estimation.  When fully implemented, MOVES will serve as the replacement for MOBILE6 and NONROAD. The new system will not
necessarily be a single piece of software, but instead will encompass the necessary tools, algorithms, underlying data and guidance
necessary for use in all official analyses associated with regulatory development, compliance with statutory requirements, and
national/regional inventory projections. Additional information is available on the Internet at http://www.epa.gov/otaq/ngm.htm

Unit of analysis: tons of emissions, vehicle miles traveled, and hours (or fuel) used


4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting	
Team Member, Planning and Budget Office, OTAQ	
4b. Data Limitations/Qualifications	
The limitations of the inventory estimates for mobile sources come from limitations in the modeled emission factors (based on emission
factor testing and models predicting overall fleet emission factors in g/mile)  and also in the estimated vehicle miles traveled for each
vehicle class (derived from Department of Transportation data).

For nonroad emissions, the estimates come from a model using equipment populations, emission factors per hour or unit of work, and an
estimate of usage.  This nonroad emissions model accounts for over 200 types of nonroad equipment. Any limitations in the input data will
carry over into limitations in the  emission inventory estimates.

Additional information about data integrity for the MOVES/MOBILE6 and NONROAD models is available on the Internet at
http://www.epa.gov/otaq/m6.htm and http://www.epa.gov/oms/nonrdmdl.htm, respectively.

When the method for estimating emissions changes significantly, older estimates of emissions in years prior to the most recent year may be
revised to be consistent with the new methodology when possible.

Methods for estimating emission inventories are frequently updated to reflect the most up-to-date inputs and assumptions. The emission
estimates that inform this performance measure were developed in 2002, covering both current and future year projections for on-road and
nonroad sources, and past estimates do not always keep pace with the changing inventories. The emission estimates have been updated
numerous times since then for rulemaking packages and will be updated for these performance measures.
4c. Third-Party Audits
All of the inputs for the models, the models themselves, and the resultant emission inventories are reviewed as appropriate by academic
experts and by state and local governments, which use some of this information for their State Implementation Plans to meet the National
Ambient Air Quality Standards.
 Record Last Updated: 02/13/2012 01:16:48 PM
-------
  Goal 1                                              Objective 3                                         Measure S01
  Measure Code : S01 - Remaining US Consumption of hydrochlorofluorocarbons
  (HCFCs), chemicals that deplete the Earth's protective ozone layer, measured in
  tons of Ozone Depleting Potential (ODP).
  Office of Air and Radiation (OAR)
   1. Measure and DQR Metadata	
   Goal Number and Title                          1 - Taking Action on Climate Change and Improving Air Quality
   Objective Number and Title                      3 - Restore the Ozone Layer
   Sub-Objective Number and Title                  1 - Reduce Consumption of Ozone-depleting Substances
   Strategic Target Code and Title                 1 - By 2015, U.S. reduce consumption of hydrochlorofluorocarbons (HCFCs), chemicals
   Managing Office                                Office of Atmospheric Programs
   Performance Measure Term Definitions
Remaining: The term "Remaining" is defined as that which remains, especially after something else has been removed.

US consumption: Class II controlled substances are compounds that have an ozone depletion potential (ODP) less than 0.2, and are all
hydrochlorofluorocarbons (HCFCs). HCFCs were developed as transitional substitutes for Class I substances and are subject to a later
phaseout schedule than Class I substances.

Although there are currently 34 controlled HCFCs, only a few are commonly used. The most widely used have been HCFC-22 (usually a
refrigerant), HCFC-141b (a solvent and foam-blowing agent), and HCFC-142b (a foam-blowing agent and component in refrigerant
blends).

As a Party to the Montreal Protocol, the U.S. must incrementally decrease HCFC consumption and production, culminating in a complete
HCFC phaseout in 2030. The major milestones that are upcoming for developed countries are a reduction in 2010 to at least 75 percent
below baseline HCFC levels and a reduction in 2015 to at least 90 percent below baseline.
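
For arithmetic only, here is a minimal Python sketch of what those stepdowns imply for an allowance cap. The baseline figure is invented
for illustration and is not the actual U.S. HCFC baseline:

# Hedged arithmetic sketch of the Montreal Protocol HCFC stepdowns.
# The baseline value is a hypothetical placeholder.

baseline_odp_tons = 10_000.0

cap_2010 = baseline_odp_tons * (1 - 0.75)   # at least 75 percent below baseline
cap_2015 = baseline_odp_tons * (1 - 0.90)   # at least 90 percent below baseline

print(f"2010 cap: {cap_2010:,.0f} ODP tons; 2015 cap: {cap_2015:,.0f} ODP tons")
# prints: 2010 cap: 2,500 ODP tons; 2015 cap: 1,000 ODP tons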

Section 605 of the Clean Air Act sets the U.S. phaseout targets  for Class II substances. In 1993, the EPA established the phaseout
framework and the "worst-first" approach that focused first on  HCFC-22, HCFC-141b, and HCFC-142b because these three HCFCs have
the highest ODPs of all HCFCs. To meet the required 2004 reduction, the EPA phased out HCFC-141b in 2003 and froze the production
and consumption of HCFC-22 and HCFC-142b. In 2009, EPA reduced the production and import of virgin HCFC-22 and HCFC-142b and
limited the use of those compounds to meet the Montreal Protocol's 2010 milestones.

EPA ensures that HCFC consumption in the U.S. is 75% below the U.S. baseline (as required under the Montreal Protocol) by issuing
allowances to producers and importers of HCFCs.  The "2010 HCFC Allocation Rule" allocated allowances for each year between 2010 and
2014. To meet the stepdown, the number of allowances for HCFC-22 and HCFC-142b was less than for the 2003-2009 control periods.
EPA also issued allowances for HCFC-123, HCFC-124, HCFC-225ca, and HCFC-225cb. The rules also limited the use of virgin HCFC-22
and HCFC-142b to existing refrigeration and air-conditioning equipment. The "Pre-Charged Appliances Rule" banned the sale or
distribution of air-conditioning and refrigeration products containing HCFC-22, HCFC-142b, or blends containing one or both of these
substances, beginning January 1, 2010.

 The "2010 HCFC Allocation Rule" was challenged in the U.S.  Court of Appeals for the D.C. Circuit in Arkema v EPA . In August, 2010,
the court decided against EPA. EPA interprets the Court's decision as vacating the portion of the rule that establishes
company-by-company production and consumption baselines and calendar-year allowances for HCFC-22 and HCFC-142b.  All  other
aspects of the rule are intact. On August 5, 2011, EPA issued an interim final rule that establishes new company-by-company HCFC-22 and
HCFC-142b baselines and allocates production and consumption allowances for 2011.

EPA is developing regulations that will issue allowances for the 2012-2014 control periods in response to the court's decision in Arkema v.
EPA.

Hydrochlorofluorocarbon (HCFC): a compound consisting of hydrogen, chlorine, fluorine, and carbon
The HCFCs are one class of chemicals being used to replace the chlorofluorocarbons (CFCs). They contain chlorine and thus deplete
stratospheric ozone, but to a much lesser extent than CFCs. HCFCs have ozone depletion potentials (ODPs) ranging from 0.01 to 0.1.

Class II Ozone-Depleting Substance (ODS): a chemical with an ozone-depletion potential of less than 0.2
Currently, all of the HCFCs are Class II substances, and the only Class II substances are HCFCs.

Ozone Depletion Potential (ODP): a number that refers to the  amount of ozone depletion caused by a substance
The ODP is the ratio of the impact on ozone of a chemical compared to the impact of a similar mass of CFC-11. Thus, the ODP of CFC-11
is defined to be 1.0. Other CFCs and HCFCs  have ODPs that range from 0.01 to 1.0.

Tons of Ozone Depleting Potential: metric tons of ODS weighted by their Ozone Depletion Potential (ODP), otherwise referred to as ODP
tons.

See http://www.epa.gov/ozone/desc.html for additional information on ODSs. See http://www.epa.gov/ozone/intpol/index.html for
additional information about the Montreal Protocol. See http://www.unmfs.org/ for more information about the Multilateral Fund.
 2. Data Definition and Source Reporting
 2a. Original Data Source
 US Companies Producing, Importing and Exporting ODS. Progress on restricting domestic exempted consumption of Class II HCFCs is
 tracked by monitoring industry reports of compliance with EPA's phase-out regulations. Data are provided by U.S. companies producing,
 importing, and exporting ODS.  Corporate data are typically submitted as quarterly reports.  Specific requirements, as outlined in the Clean
Air Act, are available on the Internet at: http://www.epa.gov/ozone/title6/index.html.

The International Trade Commission also provides monthly information on US production, imports, and exports.
2b. Source Data Collection
Source Data Collection Methods: § 82.24  Recordkeeping and reporting requirements for class II controlled substances.
(a) Recordkeeping and reporting. Any person who produces, imports, exports, transforms, or destroys class II controlled substances must
comply with the following recordkeeping and reporting requirements:
(1) Reports required by this section must be mailed to the Administrator within 30 days of the end of the applicable reporting period, unless
otherwise specified.
(2) Revisions of reports that are required by this section must be mailed to the Administrator within 180 days of the end of the applicable
reporting period, unless otherwise specified.
(3) Records and  copies of reports required by this section must be retained for three years.
(4) Quantities of class II controlled substances must be stated in terms of kilograms in reports required by this section.
(5) Reports and records  required by this section may be used for purposes of compliance determinations. These requirements are not
intended as a limitation on the use of other evidence admissible under the Federal Rules of Evidence. Failure to provide the reports,
petitions and records required by this section and to certify the accuracy of the information in the reports, petitions and records required by
this section, will be considered a violation of this subpart. False statements made in reports, petitions and records will be considered
violations of Section 113 of the Clean Air Act and under 18 U.S.C. 1001.
(b) Producers.  Persons ("producers") who produce class II controlled substances during a control period must comply with the following
recordkeeping and reporting requirements:
(1) Reporting Producers.  For each quarter, each producer of a class II controlled substance must provide the Administrator with a report
containing the following information:
(i) The quantity (in kilograms) of production of each class II controlled substance used in processes resulting in their transformation by the
producer and the quantity (in kilograms) intended for transformation by a second party;
(ii) The quantity (in kilograms) of production of each class II controlled substance used in processes resulting in their destruction by the
producer and the quantity (in kilograms) intended for destruction by a second party;
(iii) The expended allowances for each class II controlled substance;
(iv) The producer's total of expended and unexpended production allowances, consumption allowances, export production allowances, and
Article 5 allowances at the end of that quarter;
(v) The quantity (in kilograms) of class II controlled substances sold or transferred during the quarter to a person other than the producer
for use in processes resulting in their transformation or eventual destruction;
(vi) A list of the quantities and names of class II controlled substances, exported by the producer to a Party to the Protocol, that will be
transformed or destroyed and therefore were not produced expending production or consumption allowances;
(vii) For transformation in the U.S. or by a person of another Party, one copy of a transformation verification from the transformer for a
specific class II controlled substance and a list of additional quantities shipped to that same transformer for the quarter;
(viii) For destruction in the U.S. or by a person of another Party, one copy of a destruction verification as required in paragraph  (e) of this
section for a particular destroyer, destroying the same class II controlled substance, and a list of additional quantities shipped to that same
destroyer for the quarter;
(ix) In cases where the producer produced class II controlled substances using export production allowances, a list of U.S. entities that
purchased those class II controlled substances and exported them to a Party to the Protocol;
(x) In cases where the producer produced class II controlled substances using Article 5 allowances, a list of U.S. entities that purchased
those class II controlled substances and exported them to Article 5 countries; and
(xi) A list of the HCFC 141b-exemption allowance holders from whom orders were received and the quantity (in kilograms) of
HCFC-141b requested and produced.
(2) Recordkeeping—Producers.  Every producer of a class II controlled substance during a control period must maintain the following
records:
(i) Dated records of the quantity (in kilograms) of each class II controlled substance produced at each facility;
(ii) Dated records of the quantity (in kilograms) of class II controlled substances produced for use in processes that result in their
transformation or for use in processes that result in their destruction;
(iii) Dated records of the quantity (in kilograms) of class II controlled substances sold for use in processes that result in their transformation
or for use in processes that result in their destruction;
(iv) Dated records of the quantity (in kilograms) of class II controlled substances produced with export production allowances or Article 5
allowances;
(v) Copies of invoices or receipts documenting sale of class II controlled substances for use in processes that result in their transformation
or for use in processes that result in their destruction;
(vi) Dated records of the quantity (in kilograms) of each class II controlled substance used at each facility as feedstocks or destroyed in the
manufacture of a class II controlled substance or in the manufacture of any other substance, and any class II controlled substance
introduced into the production process of the same class II controlled substance at each facility;
(vii) Dated records of the quantity (in kilograms) of raw materials and feedstock chemicals used at each facility for the production of class
II controlled substances;
(viii) Dated records of the shipments of each class II controlled substance produced at each plant;
(ix) The quantity (in kilograms) of class II controlled substances, the date received, and names and addresses of the source of used
materials containing class II controlled substances which are recycled or reclaimed at each plant;
(x) Records of the date, the class II controlled substance, and the estimated quantity of any spill or release of a class  II controlled substance
that equals or exceeds 100 pounds;
(xi) Transformation verification in the case of transformation, or the destruction verification in the case of destruction as required in
paragraph (e) of this section showing that the purchaser or recipient of a class II controlled substance, in the U.S. or  in another country that
is a Party, certifies the intent to either transform or destroy the class II controlled substance, or sell the class II controlled substance for
transformation or destruction in cases when allowances were not expended;
(xii) Written verifications from a U.S. purchaser that the class II controlled substance was exported to a Party in accordance with the
requirements in this section, in cases where export production allowances were expended to produce the class II controlled substance;
(xiii) Written verifications from a U.S. purchaser that the class II controlled  substance was exported to an Article 5 country in cases where
Article 5 allowances were expended to produce the class II controlled substance;
(xiv) Written verifications from a U.S. purchaser that HCFC-141b was manufactured for the express purpose of meeting HCFC-141b
exemption needs in accordance with information submitted under §82.16(h), in cases where HCFC-141b exemption allowances were
expended to produce the HCFC-141b.
(3) For any person who fails to maintain the records required by this paragraph, or to submit the report required by this paragraph, the
Administrator may assume that the person has produced at full capacity during the period for which records were not kept, for purposes of
determining whether the person has violated the prohibitions at §82.15.
(c) Importers. Persons ("importers") who import class II controlled substances during a control period must comply with the following
recordkeeping and reporting requirements:
(1) Reporting Importers. For each quarter, an importer of a class II controlled substance (including importers of used class II controlled
substances) must submit to the Administrator a report containing the following information:
(i) Summaries of the records required in paragraphs (c)(2)(i) through (xvi) of this section for the previous quarter;
(ii) The total quantity (in kilograms) imported of each class II controlled substance for that quarter;
(iii) The commodity code for the class II controlled substances imported, which must be one of those listed in Appendix K to this subpart;
(iv) The quantity (in kilograms) of those class II controlled substances imported that are used class II controlled substances;
(v) The quantity (in kilograms) of class II controlled  substances imported for that quarter and totaled by chemical for the control period to
date;
(vi) For substances for which EPA has apportioned baseline production and consumption allowances, the importer's total sum of expended
and unexpended consumption allowances by chemical as of the end of that quarter;
(vii) The quantity (in kilograms) of class II controlled substances imported for use in processes resulting in their transformation or
destruction;
(viii) The quantity (in kilograms) of class II controlled substances sold or transferred during that quarter to each person for use in processes
resulting in their transformation or eventual destruction; and
(ix) Transformation verifications showing that the purchaser or recipient of imported class II controlled substances intends to transform
those substances or destruction verifications showing that the purchaser or recipient intends  to destroy the class II controlled substances (as
provided in paragraph (e) of this section).
(x) [Reserved]
(xi) A list of the HCFC 141b-exemption allowance holders from whom orders were received and the quantity (in kilograms) of
HCFC-141b requested and imported.
(2) Recordkeeping—Importers.  An importer of a class II controlled substance (including used class II controlled substances) must
maintain the following records:
(i) The quantity (in kilograms) of each class II controlled substance imported, either alone or in mixtures, including the percentage of each
mixture which consists of a class II controlled substance;
(ii) The quantity (in kilograms) of those class II controlled substances imported that are used and the information provided with the petition
where a petition is required under paragraph (c)(3) of this section;
(iii) The quantity (in kilograms) of class II controlled substances other than transhipments or used substances imported for use in processes
resulting in their transformation or destruction;
(iv) The quantity (in kilograms) of class II controlled substances other than transhipments or used substances imported and sold for use in
processes that result in their destruction or transformation;
(v) The date on which the class II controlled substances were imported;
(vi) The port of entry through which the class II controlled substances passed;
(vii) The country from which the imported class II controlled substances were imported;
(viii) The commodity code for the class II controlled substances shipped, which must be one of those listed in Appendix K to this subpart;
(ix) The importer number for the shipment;
(x) A copy of the bill of lading for the import;
(xi) The invoice for the import;
(xii) The quantity (in kilograms) of imports of used class II controlled substances;
(xiii) The U.S. Customs entry form;
(xiv) Dated records documenting the sale or transfer of class II controlled substances for use in processes resulting in their transformation
or destruction;
(xv) Copies of transformation verifications or destruction verifications indicating that the class II controlled substances will be transformed
or destroyed (as provided in paragraph (e) of this section).
(xvi) Written verifications from a U.S. purchaser that HCFC-141b was imported for the express purpose of meeting HCFC-141b exemption
needs in accordance with information submitted under §82.16(h), and that the quantity will not be resold, in cases where HCFC-141b
exemption allowances were expended to import the HCFC-141b.
(3) Petition to import used class II controlled substances and transhipment-Importers.  For each individual shipment over 5 pounds of a
used class II controlled substance as defined in §82.3 for which EPA has apportioned baseline production and consumption allowances, an
importer must submit directly to the Administrator, at least 40 working days before the shipment is to leave the foreign port of export, the
following information in a petition:
(i) The name and quantity (in kilograms) of the used class II controlled substance to be imported;
(ii) The name and address of the importer, the importer ID number, the contact person, and the phone and fax numbers;
(iii) Name, address, contact person, phone number and fax number of all previous source facilities from which the used  class II controlled
substance was recovered;
(iv) A detailed description of the previous use of the class II controlled substance at each source facility and a best estimate of when the
specific controlled  substance was put into the equipment at each source facility, and, when possible, documents indicating the date the
material was put into the equipment;
(v) A list of the name, make and model number of the equipment from which the material was recovered  at each source facility;
(vi) Name, address, contact person, phone number and fax number of the exporter and of all persons to whom the material was transferred
or sold after it was  recovered from the source facility;
(vii) The U.S. port  of entry for the import, the expected date of shipment and the vessel transporting the chemical. If at the time of
submitting a petition the importer does not know the U.S. port of entry, the expected date of shipment and the vessel transporting the
chemical, and the importer receives a non-objection notice for the individual  shipment in the petition, the importer is required to notify the
Administrator of this information prior to the actual U.S. Customs entry of the individual shipment;
(viii) A description of the intended use of the used class II controlled substance, and, when possible, the name, address,  contact person,
phone number and  fax number of the ultimate purchaser in the United States;
(ix) The name, address, contact person, phone number and fax number of the U.S. reclamation facility, where applicable;
(x) If someone at the source facility recovered the class II controlled substance from the  equipment, the name and phone and fax numbers
of that person;
(xi) If the imported class II controlled substance was reclaimed in a foreign Party, the name, address, contact person, phone number and fax
number of any or all foreign reclamation facility(ies) responsible for reclaiming the cited shipment;
(xii) An export license from the appropriate government agency in the country of export and, if recovered in another country, the export
license from the appropriate government agency in that country;
(xiii) If the imported used class II controlled substance is intended to be sold as a refrigerant in the U.S., the name and address of the U.S.
reclaimer who will bring the material to the standard required under subpart F of this part, if not already reclaimed to those specifications;
and
(xiv) A certification of accuracy of the information submitted in the petition.
(4) Review of petition  to import used class II controlled substances and transhipments—Importers.  Starting on the first working day
following receipt by the Administrator of a petition to import a used class II controlled substance, the Administrator will initiate a review of
the information submitted under paragraph (c)(3) of this section and take action within 40 working days to issue either an objection-notice
or a non-objection notice for the individual shipment to the person who submitted the petition to import the used class II controlled
substance.
(i) The Administrator  may issue an objection notice to a petition for the following reasons:
(A) If the Administrator determines that the information is insufficient, that is, if the petition lacks or appears to lack any of the information
required under paragraph (c)(3) of this section;
(B) If the Administrator determines that any portion of the petition contains false or misleading information, or the Administrator has
information from other U.S. or foreign government agencies indicating that the petition contains false or misleading information;
(C) If the transaction appears to be contrary to provisions of the Vienna Convention on Substances that Deplete the Ozone Layer, the
Montreal Protocol and Decisions by the Parties, or the non-compliance procedures outlined and instituted by the Implementation
Committee of the Montreal Protocol;
(D) If the appropriate  government agency in the exporting country has not agreed to issue an export license for the cited individual
shipment of used class II controlled substance;
(E) If reclamation capacity is installed or is being installed for that specific class II controlled substance in the country of recovery or
country of export and  the capacity is funded in full or in part through the Multilateral Fund.
(ii) Within ten (10) working days after receipt of the objection notice, the importer may re-petition the Administrator, only if the
Administrator indicated  "insufficient information" as the basis for the objection notice. If no appeal is taken by the tenth working day after
the date on the objection notice, the objection shall become final.  Only one re-petition will be accepted for any original petition received  by
EPA.
(iii) Any information contained in the re-petition which is inconsistent with the original petition must be identified and a description of the
reason for the inconsistency must accompany the re-petition.
(iv) In cases where the Administrator does not object to the petition based on the criteria listed in paragraph (c)(4)(i) of this section, the
Administrator will issue a non-objection notice.
(v) To pass the approved used class II controlled substances through U.S. Customs, the petition and the non-objection notice issued by EPA
must accompany the shipment through U.S. Customs.
(vi) If for some reason, following EPA's issuance of a non-objection notice, new information is brought to EPA's attention which shows
that the non-objection notice was issued based on false information, then EPA has the right to:
(A) Revoke the non-objection notice;
(B) Pursue all means to ensure that the class II controlled substance is not imported into the U.S.;  and
(C) Take appropriate enforcement actions.
(vii) Once the Administrator issues a non-objection notice, the person receiving the non-objection notice is permitted to import the
individual shipment of used class II controlled substance only within the same control period as the date stamped on the non-objection
notice.
(viii) A person receiving a non-objection notice from the Administrator for a petition to import used class II controlled substances must
maintain the following records:
(A) A copy of the petition;
(B) The EPA non-objection notice;
(C) The bill of lading for the import; and
(D) U.S. Customs entry documents for the import that must include one of the commodity codes from Appendix K to this subpart.
(5) Recordkeeping for transhipments—Importers. Any person who tranships a class II controlled substance must maintain records that
indicate:
(i) That the class II controlled substance shipment originated in a foreign country;
(ii) That the class II controlled substance shipment is destined for another foreign country; and
(iii) That the class II controlled substance shipment will  not enter interstate commerce within the U.S.
(d) Exporters. Persons ("exporters") who export class II controlled substances during a control period must comply with the following
reporting requirements:
(1) Reporting—Exporters.  For any exports of class II controlled substances not reported under §82.20 (additional consumption
allowances), or under paragraph (b)(2) of this section (reporting for producers of class II controlled substances), each exporter who
exported a class II controlled substance must submit to the Administrator the following information within 30 days after the end of each
quarter in which the unreported exports left the U.S.:
(i) The names and addresses of the exporter and the recipient of the exports;
(ii) The exporter's Employer Identification Number;
(iii) The type and quantity  (in kilograms) of each class II controlled substance exported and what percentage, if any of the class II
controlled substance is used;
(iv) The date  on which, and the port from which, the class II controlled substances  were exported from the U.S. or its territories;
(v) The country to which the class II controlled substances were exported;
(vi) The quantity (in kilograms) exported to each Article 5 country;
(vii) The commodity code  for the class II controlled substances shipped, which must be one of those listed in Appendix K to this subpart;
(viii) For persons reporting transformation or destruction, the invoice or sales agreement containing language similar to the transformation
verifications that the purchaser or recipient of imported class II controlled substances intends to transform those substances, or destruction
verifications showing that the purchaser or recipient intends to destroy the class II controlled substances (as provided in paragraph (e) of
this section).
(2) Reporting export production allowances—Exporters. In addition to the information required in paragraph (d)(1) of this section, any
exporter using export production allowances must also provide the following to the Administrator:
(i) The Employer Identification Number on the Shipper's Export Declaration Form or Employer Identification Number of the shipping
agent shown on the U.S. Customs Form 7525;
(ii) The exporting vessel on which the class II controlled substances were shipped; and
(iii) The quantity (in kilograms) exported to each Party.
(3) Reporting Article 5 allowances—Exporters. In addition to the information required in paragraph (d)(1) of this section, any exporter
using Article 5 allowances must also provide the following to the Administrator:
(i) The Employer Identification Number on the Shipper's Export Declaration Form or Employer Identification Number of the shipping
agent shown on the U.S. Customs Form 7525; and
(ii) The exporting vessel on which the class II controlled substances were shipped.
(4) Reporting used class II controlled substances—Exporters.  Any exporter of used class II controlled substances must indicate on the bill
of lading or invoice that the class II controlled substance is used, as defined in §82.3.
(e) Transformation and destruction.   Any person who transforms or destroys class II controlled substances must comply with the following
recordkeeping and reporting requirements:
(1) Recordkeeping—Transformation and destruction.  Any person who transforms or destroys class II controlled substances produced or
imported by another person must maintain the following:
(i) Copies of the invoices or receipts documenting the sale or transfer of the class II controlled substances to the person;
(ii) Records identifying the producer or importer of the class II controlled substances received by the person;
(iii) Dated records of inventories of class II controlled substances at each plant on the first day of each quarter;
(iv) Dated records of the quantity (in kilograms) of each class II controlled substance transformed or destroyed;
(v) In the case where class II controlled substances were purchased or transferred for transformation purposes, a copy of the person's
transformation verification as provided under paragraph (e)(3) of this section;
(vi) Dated records of the names, commercial  use, and quantities (in kilograms) of the resulting chemical(s) when the class II controlled
substances are transformed; and
(vii) Dated records of shipments to purchasers of the resulting chemical(s) when the class II controlled substances are transformed.
(viii) In the case where class II controlled substances were purchased or transferred for destruction purposes, a copy of the person's
destruction verification,  as provided under paragraph (e)(5) of this section.
(2) Reporting—Transformation and destruction. Any person who transforms or destroys class II controlled substances and who has
submitted a transformation verification ((paragraph (e)(3) of this section) or a destruction verification  (paragraph (e)(5) of this section) to
the producer or importer of the class II controlled substances,  must report the following:
(i) The names and quantities (in kilograms) of the class II controlled substances transformed for each control period within 45 days of the
end of such control period; and
(ii) The names and quantities (in kilograms) of the class II controlled  substances destroyed for each control period within 45 days of the
end of such control period.
(3) Reporting—Transformation.  Any person who purchases class II controlled substances for purposes of transformation must provide the
producer or importer with a transformation verification that the class II controlled substances are to be used in processes that result in their
transformation.
(i) The transformation verification shall include the following:
(A) Identity and  address of the person intending to transform the class II controlled substances;
(B) The quantity (in kilograms) of class II controlled substances intended for transformation;
(C) Identity of shipments by purchase order number(s), purchaser account number(s), by  location(s), or other means of identification;
(D) Period of time over which the person intends to transform the class II controlled substances; and
(E) Signature of the verifying person.
(ii) [Reserved]
(4) Reporting—Destruction.  Any person who destroys class II controlled substances shall provide EPA with a one-time report containing
the following information:
(i) The destruction unit's destruction efficiency;
(ii) The methods used to record the volume destroyed;
(iii) The methods used to determine destruction efficiency;
(iv) The name of other relevant federal or state regulations that may apply to the destruction process;
(v) Any changes to the information in paragraphs (e)(4)(i), (ii), and (iii) of this section must be reflected in a revision to be submitted to
EPA within 60 days of the change(s).
(5) Reporting—Destruction.  Any person who purchases or receives and subsequently destroys class II controlled substances that were
originally produced without expending allowances shall provide the producer or importer from whom it purchased or received the class II
controlled substances with a verification that the class II controlled substances will be used in processes that result in their destruction.
(i) The destruction verification shall include the following:
(A) Identity and address of the person intending to destroy class II controlled substances;
(B) Indication of whether those class II controlled substances will be completely destroyed, as defined in §82.3, or less than completely
destroyed, in which case the destruction efficiency at which such substances will be destroyed must be included;
(C) Period of time over which the person intends to destroy class II controlled substances; and
(D) Signature of the verifying person.
(ii) [Reserved]
(f) Heels-Recordkeeping and reporting. Any person who brings into the U.S. a rail car, tank truck, or ISO tank containing a heel, as
defined in §82.3, of class II controlled substances, must take the following actions:
(1) Indicate on the bill of lading or invoice that the class II controlled substance in the container is a  heel.
(2) Report within 30 days of the end of the control period the quantity (in kilograms) brought into the U.S. and certify:
(i) That the residual quantity (in kilograms) in each shipment is no more than 10 percent of the volume of the container;
(ii) That the residual quantity (in kilograms) in each shipment will either:
(A) Remain in the container and be included in a future shipment;
(B) Be recovered and transformed;
(C) Be recovered and destroyed; or
(D) Be recovered for a non-emissive use.
(3) Report on the final disposition of each shipment within 30 days of the end of the control period.
(g) HCFC 141b exemption allowances—Reporting and recordkeeping. (1)  Any person allocated HCFC-141b exemption allowances who
confers a quantity of the HCFC-141b exemption allowances to a producer or importer and places an order for the production or import of
HCFC-141b with a verification that the HCFC-141b will only be used for the exempted purpose and not be resold must submit semi-annual
reports, due 30 days after the end of the second and fourth quarters respectively, to the Administrator containing the following information:
(i) Total quantity (in kilograms) HCFC-141b received during the 6 month period; and
(ii) The identity of the supplier of HCFC-141b on a shipment-by-shipment basis during the 6 month period.
(2) Any person allocated HCFC-141b exemption allowances must keep records of letters to producers and importers conferring
unexpended HCFC-141b exemption allowances for the specified control period in the notice, orders for the production or import of
HCFC-141b under those letters and written verifications that the HCFC-141b was produced or imported for the express purpose of meeting
HCFC-141b exemption needs in accordance with information submitted under §82.16(h), and that the quantity will not be resold.
[68 FR 2848, Jan. 21, 2003, as amended at 71 FR 41172, July 20, 2006]
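
As one illustration of how a recordkeeping requirement of this kind could be checked mechanically, the Python sketch below validates a
heel certification against the 10 percent threshold in paragraph (f)(2)(i). The record layout and function names are assumptions made for
illustration; they are not part of the regulation or of any EPA system:

# Hedged sketch: checking a heel certification against the 10 percent
# threshold in paragraph (f)(2)(i). Field names are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class HeelShipment:
    substance: str              # e.g., "HCFC-22"
    container_volume_kg: float  # full container capacity, in kilograms
    residual_kg: float          # heel remaining in the container, in kilograms

def heel_within_limit(shipment):
    """Residual quantity must be no more than 10 percent of the container volume."""
    return shipment.residual_kg <= 0.10 * shipment.container_volume_kg

shipment = HeelShipment("HCFC-22", container_volume_kg=18_000, residual_kg=1_500)
print(heel_within_limit(shipment))  # True: 1,500 kg is below the 1,800 kg limit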

EPA QA requirements/guidance governing collection.  Reporting and record-keeping requirements are published in 40 CFR Part 82,
Subpart A, Sections 82.9 through 82.13. These sections of the Stratospheric Ozone Protection Rule specify the required data and
accompanying documentation that companies must submit or maintain on-site to demonstrate their compliance with the regulations.
2c. Source Data Reporting
Form/mechanism for receiving data and entering into EPA system: Data can be submitted on paper form or via EPA's Central Data
Exchange. Complete information on reporting options/format can be found at: http://www.epa.gov/ozone/record/index.html


Timing and frequency of reporting: Quarterly (EPA's regulations specify a quarterly reporting system for U.S. companies) and monthly
(for the International Trade Commission).

Quarterly Schedule for US Companies
Quarter 1: January 1 - March 31
Quarter 2: April 1 - June 30
Quarter 3: July 1 - September 30
Quarter 4: October 1 - December 31
3. Information Systems and Data Quality Procedures
3a.  Information Systems
The Allowance Tracking System (ATS) database is maintained by the Stratospheric Protection Division (SPD). ATS is used to compile and
analyze quarterly information from companies on U.S. production, imports, exports, transformations, and allowance trades of
ozone-depleting substances (ODS), as well as monthly information on domestic production, imports, and exports from the International
Trade Commission.

The Allowance Tracking System contains transformed data.

The Allowance Tracking System meets relevant EPA standards for information system integrity.
3b. Data Quality Procedures
The ATS is programmed to ensure consistency of the data elements reported by companies. The tracking system flags inconsistent data for
review and resolution by the tracking system manager. This information is then cross-checked with compliance data submitted by
reporting companies. SPD maintains a user's manual for the ATS that specifies the standard operating procedures for data entry and data
analysis.
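
A minimal sketch of this kind of consistency flagging, under an assumed record layout (the actual ATS schema and its checks are not
reproduced in this record): quarterly company totals are compared with the independent International Trade Commission figures, and
discrepancies beyond a tolerance are set aside for the tracking system manager:

# Hedged sketch of a cross-check between company-reported quarterly totals
# and independent International Trade Commission (ITC) figures. The record
# layout and the 5 percent tolerance are illustrative assumptions.

def flag_discrepancies(company_totals_kg, itc_totals_kg, tolerance=0.05):
    """Return substances whose reported totals diverge from ITC data beyond the tolerance."""
    flagged = []
    for substance, reported in company_totals_kg.items():
        independent = itc_totals_kg.get(substance)
        if independent and abs(reported - independent) / independent > tolerance:
            flagged.append((substance, reported, independent))
    return flagged

flags = flag_discrepancies(
    {"HCFC-22": 1_080_000, "HCFC-142b": 200_000},  # company reports (kg)
    {"HCFC-22": 1_000_000, "HCFC-142b": 201_000},  # ITC figures, same quarter (kg)
)
print(flags)  # [('HCFC-22', 1080000, 1000000)] would be routed for review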


The data are subject to an annual quality assurance review, coordinated by Office of Air and Radiation (OAR) staff who are separate from
the team normally responsible for data collection and maintenance.

Regional inspectors also perform inspections and audits on-site at the producers', importers', and exporters' facilities. These audits verify
the accuracy of compliance data submitted to EPA through examination of company records.

The ATS data are subject to a Quality Assurance Plan (Quality Assurance Plan, USEPA Office of Atmospheric Programs, July 2002).
3c. Data Oversight
Branch Chief, Stratospheric Program Implementation Branch, OAP, OAR
3d. Calculation Methodology
Explanation of Calculations: Data are aggregated across all U.S. companies for each individual ODS to analyze U.S. total consumption
and production.

Unit of analysis: Tons of ODP
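
For illustration, the sketch below aggregates invented company reports into ODP tons using the standard relationship consumption =
production + imports - exports, with quantities converted from kilograms (the reporting unit under section 82.24) to metric tons and
weighted by ODP. The company figures are invented and the ODP values shown are approximate:

# Hedged sketch: aggregating company reports into U.S. consumption in ODP tons.
# Company figures are invented; ODP values are approximate and for illustration.

ODP = {"HCFC-22": 0.055, "HCFC-141b": 0.11, "HCFC-142b": 0.065}

def consumption_odp_tons(reports):
    """Sum ODP-weighted consumption (production + imports - exports) in metric tons."""
    total = 0.0
    for r in reports:  # quantities reported in kilograms
        net_kg = r["production_kg"] + r["imports_kg"] - r["exports_kg"]
        total += (net_kg / 1000.0) * ODP[r["substance"]]
    return total

reports = [
    {"substance": "HCFC-22", "production_kg": 2.0e6, "imports_kg": 5.0e5, "exports_kg": 1.0e5},
    {"substance": "HCFC-142b", "production_kg": 3.0e5, "imports_kg": 0.0, "exports_kg": 5.0e4},
]
print(f"{consumption_odp_tons(reports):,.1f} ODP tons")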
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Branch Chief, Stratospheric Program Implementation Branch, OAP, OAR
4b. Data Limitations/Qualifications
None, since companies are required by the Clean Air Act to report data.
4c. Third-Party Audits
The U.S. General Accounting Office (GAO) completed a review of U.S. participation in five international environmental agreements, and
analyzed data submissions from the U.S. under the Montreal Protocol on Substances that Deplete the Ozone Layer. No deficiencies were
identified in its January 2003 report. The report may be found at the following website: http://www.gao.gov/new.items/d02960t.pdf
 Record Last Updated: 02/13/2012 01:16:48 PM
-------
 Goal 2                                         No Associated Objective                                   Measure SW1
 Measure Code : SW1 - Percentage of planned research products completed on time
 by the Safe and Sustainable Water Resources research program.
 Office of Research and Development (ORD)
   1. Measure and DQR Metadata
   Goal Number and Title                          2 - Protecting America's Waters
   Objective Number and Title
   Sub-Objective Number and Title
   Strategic Target Code and Title
   Managing Office                                Office of Program Accountability and Resource Management - Planning
   Performance Measure Term Definitions
A research product is "a deliverable that results from a specific research project or task. Research products may require translation or
synthesis before integration into an output ready for partner use."

 This secondary performance measure tracks the timely completion of research products.

Sustainability Research Strategy, available from: http://epa.gov/sciencematters/april2011/truenorth.htm

http://www.epa.gov/risk_assessment/health-risk.htm


 2. Data Definition and Source Reporting
 2a. Original Data Source
 EPA and its partners confirm the schedule for completing research outputs and products that are transformed or synthesized into outputs.
 ORD tracks progress toward delivering the outputs; clients are notified of progress.  Scheduled milestones are compared to actual progress
 on a quarterly basis. At the end of the  fiscal year, outputs are either classified as "met" or "not met" to determine the overall percentage of
 planned products that have been met by the research program. The actual product completion date is self-reported.
 2b. Source Data Collection

 Each output is assigned to a Lab or Center representative before the start of the fiscal year.  This individual provides quarterly status
 updates via ORD's Resource Management System. Status reports are reviewed by senior management, including the Lab or Center
 Director and National Program Director. Overall status data is generated and reviewed by ORD's Office of Program Accountability and
 Resource Management.
 2c. Source Data Reporting
Quarterly status updates are provided via ORD's Resource Management System.
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
Internal database or internal tracking system, such as the Resource Management System (RMS).
3b. Data Quality Procedures
EPA and its partners confirm the schedule for completing research outputs and products that are transformed or synthesized into outputs.
ORD tracks progress toward delivering the outputs; clients are notified of progress. Scheduled milestones are compared to actual progress
on a quarterly basis. At the end of the fiscal year, outputs are either classified as "met" or "not met" to determine the overall percentage of
planned products that have been met by the program.
3c. Data Oversight

The National Program Director oversees the source data reporting, specifically, the process of establishing agreement with program
stakeholders and senior ORD managers on the list and content of the planned products, and subsequent progress, completion, and delivery
of these products.
3d. Calculation Methodology
At the end of the fiscal year, outputs are classified as either "met" or "not met". An overall percentage of planned products met by the
program is reported.
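
A minimal sketch of this year-end roll-up, assuming each product carries a simple met/not-met status (the actual RMS data model is not
shown in this record):

# Hedged sketch of the year-end roll-up: percentage of planned products met.
# Product names and statuses are invented for illustration.

statuses = {"product-1": "met", "product-2": "met", "product-3": "not met"}

met = sum(1 for s in statuses.values() if s == "met")
percent_met = 100.0 * met / len(statuses)
print(f"{percent_met:.0f}% of planned products met")  # 67% for this example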
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The Office of Program Accountability and Resource Management is responsible for reporting program progress in meeting the target of
completing 100% of planned products.

4b. Data Limitations/Qualifications
This measure does not directly capture the quality or impact of the research products.
4c. Third-Party Audits

Not applicable
 Record Last Updated: 02/13/2012 01:16:50 PM
-------
  Goal 2                                               Objective 1                                          Measure dw2
  Measure Code :  dw2 - Percent of person months during which community water
  systems provide drinking water that meets all applicable health-based standards.
  Office of Water (OW)
   1. Measure and DQR Metadata
   Goal Number and Title                           2 - Protecting America's Waters
   Objective Number and Title                      1 - Protect Human Health
   Sub-Objective Number and Title                  1 - Water Safe to Drink
   Strategic Target Code and Title                 1 - By 2015, provide drinking water that meets applicable health-based drinking standards for communities
   Managing Office                                 Office of Ground Water and Drinking Water
   Performance Measure Term Definitions
Community water systems —The U.S. Environmental Protection Agency (EPA) defines a community water system (CWS) as a public water system
that serves at least 15 service connections used by year-round residents or regularly serves at least 25 year-round residents. CWSs provide water to more
than 280 million persons in the United States. They are a tremendously diverse group. CWSs range from very small, privately owned systems whose
primary business is not supplying drinking water (e.g., mobile home parks) to very large publicly owned systems that serve millions of customers.
2006 Community Water System Survey Volume I: Overview
http://water.epa.gov/aboutow/ogwdw/upload/cwssreportvolumeI2006.pdf

Person months - All persons served by CWSs times 12 months (3,525.1 million for FY2011). This measure is calculated by multiplying
the number of months in the most recent four-quarter period in which health-based violations overlap by the retail population served.

Health-based standards - exceedances of a maximum contaminant level (MCL) and violations of a treatment technique requirement (i.e.,
failure to provide effective treatment).
 2. Data Definition and Source Reporting
 2a. Original Data Source
 Data are provided by agencies with primacy (primary enforcement authority) for the Public Water System Supervision (PWSS) program.
 These agencies are states, EPA (for non-delegated states or territories), and the Navajo Nation, the only tribe with primacy. Primacy
 agencies collect the data from the regulated water systems, determine compliance, and report a subset of the data to EPA (a subset of the
 inventory data and summary violations).
 2b. Source Data Collection
 State certified laboratories report contaminant occurrence to states that, in turn, determine exceedances of maximum contaminant levels or
 non-compliance with treatment techniques and report these violations to EPA.

Under the drinking water regulations, water systems must use approved analytical methods for testing for contaminants.
2c. Source Data Reporting
Public Water System Supervision (PWSS) Regulation-Specific Reporting Requirements Guidance. Available on the Internet at
http://www.epa.gov/safewater/regs.html
System, user, and reporting requirements documents can be found on the EPA web site, http://www.epa.gov/safewater/.

States may choose to use the electronic Data Verification (eDV) tool to help improve data quality.


3. Information Systems and Data Quality Procedures
3a. Information Systems
SDWIS/STATE is a software information system, jointly designed by states and EPA, to support states as they implement the drinking
water program. It is an optional database application available for use by states and EPA regions to support implementation of their
drinking water programs.
U.S. EPA, Office of Ground Water and Drinking Water. Data and Databases. Drinking Water Data & Databases - SDWIS/STATE, July
2002. Information available on the Internet: http://www.epa.gov/safewater/sdwis_st/current.html

SDWIS/FED User and System Guidance Manuals (including data entry instructions, the On-line Data Element Dictionary (a database
application), the Error Code Data Base (ECDB, a database application), users guides, release notes, etc.). Available on the Internet at
http://www.epa.gov/safewater/sdwisfed/sdwis.htm

System and user documents are accessed via the database link http://www.epa.gov/safewater/databases.html, and specific rule reporting
requirements documents are accessed via the regulations, guidance, and policy documents link http://www.epa.gov/safewater/regs.html.

Documentation is also available at the Association of State Drinking Water Administrators web site at www.ASDWA.org.

SDWIS/FED does not have a Quality Assurance Project Plan. The SDWIS/FED equivalent is the Data Reliability Action Plan (DRAP)
[2006 Drinking Water Data Reliability Analysis and Action Plan, EPA-816-R-07-010, March 2008]. The DRAP contains the processes,
procedures, and major activities to be employed and undertaken for assuring the data in SDWIS meet required data quality standards. This
plan has three major components: assurance, assessment, and control.

Office of Water Quality Management Plan, available at http://www.epa.gov/water/info.html
3b. Data Quality Procedures
The Office of Ground Water and Drinking Water is modifying its approach to data quality review based on the recommendations of the
Data Quality Workgroup and on the Drinking Water Strategy for monitoring data.

There are quality assurance manuals for states and Regions, which provide standard operating procedures for conducting routine
assessments of the quality of the data, including timely corrective action(s).

Reporting requirements can be found on the EPA web site, http://www.epa.gov/safewater/.
SDWIS/FED has edit checks built into the software to reject erroneous data.
To reduce reporting and database errors, EPA offers: 1) training to states on data entry, data retrieval, compliance determination, reporting
requirements, and error correction; 2) user and system documentation produced with each software release and maintained on EPA's web
site; 3) specific error correction and reconciliation support through a troubleshooter's guide; 4) a system-generated summary with detailed
reports documenting the results of each data submission; 5) an error code database for states to use when they have questions on how to
enter or correct data; and 6) a user support hotline available 5 days a week.
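
As a hedged illustration of what one such software edit check might look like (the actual SDWIS/FED checks and error codes are not
reproduced here), the sketch below rejects a violation record with an unknown system ID or an impossible date range:

# Hedged sketch of an edit check that rejects erroneous violation records.
# Field names, IDs, and error codes are illustrative assumptions.

from datetime import date

KNOWN_SYSTEM_IDS = {"NJ0102001", "CA1910067"}  # hypothetical inventory of system IDs

def validate_violation(record):
    """Return a list of error messages; an empty list means the record is accepted."""
    errors = []
    if record["pws_id"] not in KNOWN_SYSTEM_IDS:
        errors.append("E001: unknown public water system ID")
    if record["period_end"] < record["period_begin"]:
        errors.append("E002: violation period ends before it begins")
    return errors

record = {"pws_id": "NJ0102001",
          "period_begin": date(2011, 4, 1),
          "period_end": date(2011, 3, 1)}
print(validate_violation(record))  # ['E002: ...'] -- the record would be rejected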
3c. Data Oversight

The Infrastructure Branch Chief is responsible for overseeing source data reporting.
The Associate Director of Drinking Water Protection is responsible for overseeing information systems utilized in producing performance
results.
3d. Calculation Methodology
Person months - All persons served by CWSs times 12 months (3,525.1 million for FY2011). This measure is calculated by multiplying
the number of months in the most recent four-quarter period in which health-based violations overlap by the retail population served.
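
To make the arithmetic concrete, here is a hedged Python sketch with invented violation figures; the population total is chosen only so that
total person months match the 3,525.1 million cited above:

# Hedged sketch of the person-months calculation. Violation figures are
# invented; the population is chosen to match the FY2011 total cited above.

total_population_served = 293.76e6                  # persons served by CWSs
total_person_months = total_population_served * 12  # ~3,525.1 million

# For each system in violation: months in violation times retail population served.
violation_person_months = sum(months * population for months, population in [
    (3, 1.2e6),   # hypothetical system A: 3 months in violation
    (12, 4.0e5),  # hypothetical system B: in violation all year
])

percent_meeting = 100.0 * (1 - violation_person_months / total_person_months)
print(f"{percent_meeting:.1f}% of person months met all health-based standards")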

SDWIS contains basic water system information, population served, and detailed records of violations of the Safe Drinking Water Act and
the statute's implementing health-based drinking water regulations.

SDWIS/FED data On-line Data Element Dictionary-a database application Available on the Internet at
http://www.epa.gov/safewater/sdwisfed/sdwis.htm

Additional information:  Several improvements are underway.

First, EPA will continue to work with states to implement the DRAP, which has  already improved the completeness, accuracy, timeliness,
and consistency of the data in SDWIS/FED through: 1) training courses for specific compliance determination and reporting requirements,
2) state-specific technical assistance, 3) targeted data audits conducted each year to better understand challenges with specific rules, and 4)
assistance to regions and states in the identification and reconciliation of missing,  incomplete, or conflicting data.

Second, more states will use SDWIS/STATE (as of January 2011, 55 states, tribes, and territories were using it). SDWIS/STATE is an
optional database application, jointly designed by states and EPA, that is available to states and EPA Regions to support implementation of
their drinking water programs. (U.S. EPA, Office of Ground Water and Drinking Water. Data and Databases. Drinking Water Data &
Databases - SDWIS/STATE, July 2002. Available on the Internet: http://www.epa.gov/safewater/sdwis_st/current.html)

Third, in 2006 EPA modified SDWIS/FED to (1) simplify the database, (2) minimize the data entry options that had made the software
complex, (3) enforce Agency data standards, and (4) ease the flow of data to EPA through a secure data exchange environment
incorporating modern technologies, all of which will improve the accuracy of the data. Data are stored in a data warehouse system that is
optimized for analysis, data retrieval, and data integration from other data sources. The warehouse has improved the program's ability to
use information efficiently to support decision-making and to manage the program effectively.

EPA has also begun a multi-year effort to develop the next generation information system to replace SDWIS/State.  In addition to reducing
the total cost of ownership to EPA, a high priority goal of this effort is to support improved data quality through the evaluation of all public
water system monitoring data.


4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting

The Deputy Director for the Office of Groundwater and Drinking Water and the Evaluation and Accountability Team Leader for the Office
of Water are responsible for coordinating the reporting of all measures for the Office of Water.

4b. Data Limitations/Qualifications
Recent state data verification and other quality assurance analyses indicate that the most significant data quality problem is under-reporting
by the states of monitoring and health-based standards violations and inventory characteristics. The most significant under-reporting
occurs in monitoring violations. Although monitoring violations are not themselves in the health-based violation category covered by this
performance measure, failures to monitor could mask treatment technique and MCL violations. Such under-reporting of violations limits
EPA's ability to: 1) accurately portray the percent of people affected by health-based violations, 2) target enforcement oversight, 3) target
program assistance to primacy agencies, and 4) provide information to the public on the safety of their drinking water facilities.
4c. Third-Party Audits

N/A
 Record Last Updated: 02/13/2012 01:16:46 PM

  Goal 2                                              Objective 1                                           Measure E
  Measure Code : E - Percent of the population in Indian country served by
  community water systems that receive drinking water that meets all applicable
  health-based drinking water standards
  Office of Water (OW)
   1. Measure and DQR Metadata	
   Goal Number and Title                         2 - Protecting America's Waters
   Objective Number and Title                      1 - Protect Human Health
   Sub-Objective Number and Title                  1 - Water Safe to Drink
   Strategic Target Code and Title                   2 - By 2015, drinking water that meets health-based drinking water standards for Indian country
   Managing Office                                Office of Groundwater and Drinking Water
   Performance Measure Term Definitions
The definition of Indian country used by the US Department of Justice can be found at this web link:
http://www.justice.gov/usao/eousa/foia_reading_room/usam/title9/crm00677.htm


Community water systems —The U.S. Environmental Protection Agency (EPA) defines a community water system (CWS) as a public water system
that serves at least 15 service connections used by year-round residents or regularly serves at least 25 year-round residents. In FY 2011,
737 CWSs in Indian country regulated by EPA and the Navajo Nation provided water to more than 918,000 persons.

Health-based drinking water standards— exceedances of a maximum contaminant level (MCL) and violations of a treatment technique


 2. Data Definition and Source Reporting	
 2a. Original Data Source	
 EPA, except for community water systems serving the Navajo Nation, where the tribe has primacy responsibility for implementing the Safe
 Drinking Water Act.
 2b. Source  Data Collection	
 The EPA Office of Ground Water and Drinking Water (Headquarters) calculates this measure using data reported in the Safe Drinking
 Water Information System-Federal (SDWIS-FED) and provides the results to EPA Regions and the Navajo Nation.

 This measure includes federally-regulated contaminants of the following violation types: Maximum Contaminant Level, Maximum
 Residual Disinfectant Level, and Treatment Technique violations. It includes any violations from currently open and closed community
water systems (CWSs) that overlap any part of the most recent four quarters.
2c. Source Data Reporting
Public Water System Supervision (PWSS) Regulation-Specific Reporting Requirements Guidance. Available on the Internet at
http://www.epa.gov/safewater/regs.html
System, user, and reporting requirements documents can be found on the EPA web site, http://www.epa.gov/safewater/.


3.  Information  Systems and  Data Quality Procedures	
3a.  Information Systems
SDWIS/STATE is a software information system jointly designed by states and EPA; it is an optional database application available for
use by states and EPA Regions to support implementation of their drinking water programs. EPA Region 9 uses an Access database
system (DIME) to collect and report on tribal community water systems in Region 9.

SDWIS/FED User and System Guidance Manuals (including data entry instructions, the On-line Data Element Dictionary (a database
application), the Error Code Data Base (ECDB, a database application), users guides, release notes, etc.) are available on the Internet at
http://www.epa.gov/safewater/sdwisfed/sdwis.htm

System and user documents are accessed via the database link http://www.epa.gov/safewater/databases.html, and specific rule reporting
requirements documents are accessed via the regulations, guidance, and policy documents link http://www.epa.gov/safewater/regs.html.

SDWIS/FED does not have a Quality Assurance Project Plan.  The SDWIS/FED equivalent is the Data Reliability Action Plan (DRAP) [2006
Drinking Water Data Reliability Analysis and Action Plan, EPA-816-R-07-010, March 2008]. The DRAP describes the processes,
procedures, and major activities for assuring that the data in SDWIS meet required data quality standards.  The plan has three major
components: assurance, assessment, and control.

Office of Water Quality Management Plan, available at http://www.epa.gov/water/info.html
3b.  Data Quality Procedures
The Office of Ground Water and Drinking Water is modifying its approach to data quality review based on the recommendations of the
Data Quality Workgroup and on the Drinking Water Strategy for monitoring data.

There are quality assurance manuals for states and Regions, which provide standard operating procedures for conducting routine
assessments of the quality of the data, including timely corrective action(s).

Reporting requirements can be found on the EPA web site, http://www.epa.gov/safewater/.
SDWIS/FED has edit checks built into the software to reject erroneous data.

To reduce reporting and database errors, EPA offers:
1) training to states on data entry, data retrieval, compliance determination, reporting requirements, and error correction; 2) user and system
documentation produced with each software release and maintained on EPA's web site; 3) specific error correction and reconciliation
support through a troubleshooter's guide; 4) a system-generated summary with detailed reports documenting the results of each data
submission; 5) an error code database for states to use when they have questions on how to enter or correct data; and 6) a user support
hotline available 5 days a week.
3c. Data Oversight	

 The Drinking Water Protection Division Director oversees the source data reporting and the information systems producing the
performance result.
3d. Calculation Methodology
Headquarters calculates this measure using data reported in SDWIS/FED (see section 2b). SDWIS/STATE, a software information system
jointly designed by states and EPA to support states as they implement the drinking water program, is an optional database application
available for use by states and EPA Regions.
U.S. EPA, Office of Ground Water and Drinking Water. Data and Databases. Drinking Water Data & Databases - SDWIS/STATE, July
2002. Available on the Internet: http://www.epa.gov/safewater/sdwis_st/current.html
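As a companion to the description in section 2b, the sketch below shows one way the percent-of-population result could be derived. It is a
minimal illustration with hypothetical records and field names, not the SDWIS-FED schema.

HEALTH_BASED = {"MCL", "MRDL", "TT"}  # violation types counted by this measure

# Hypothetical CWS records for Indian country: retail population served and
# the violation types open at any point in the most recent four quarters.
systems = [
    {"pwsid": "NN0000001", "population": 2_400, "violations": {"TT"}},
    {"pwsid": "NN0000002", "population": 850, "violations": set()},
    {"pwsid": "NN0000003", "population": 5_100, "violations": {"MON"}},  # monitoring only
]

total_pop = sum(s["population"] for s in systems)
compliant_pop = sum(s["population"] for s in systems if not (s["violations"] & HEALTH_BASED))
print(f"{100 * compliant_pop / total_pop:.1f}% of the population served met all standards")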

SDWIS/FED User and System Guidance Manuals (including data entry instructions, the On-line Data Element Dictionary (a database
application), the Error Code Data Base (ECDB, a database application), users guides, release notes, etc.) are available on the Internet at
http://www.epa.gov/safewater/sdwisfed/sdwis.htm

System and user documents are accessed via the database link http://www.epa.gov/safewater/databases.html, and specific rule reporting
requirements documents are accessed via the regulations, guidance, and policy documents link http://www.epa.gov/safewater/regs.html.

Documentation is also available at the Association of State Drinking Water Administrators web site at www.ASDWA.org.

SDWIS/FED does not have a Quality Assurance Project Plan.  The SDWIS/FED equivalent is the Data Reliability Action Plan (DRAP) [2006
Drinking Water Data Reliability Analysis and Action Plan, EPA-816-R-07-010, March 2008]. The DRAP describes the processes,
procedures, and major activities for assuring that the data in SDWIS meet required data quality standards. The plan has three major
components: assurance, assessment, and control.

Office of Water Quality Management Plan, available at http://www.epa.gov/water/info.html	


4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting	

The Evaluation and Accountability Team Leader is responsible for overseeing the final reporting for the Office of Water.

4b. Data Limitations/Qualifications
Recent state and EPA Regional data verification and other quality assurance analyses indicate that the most significant data quality problem
is under-reporting by the states of monitoring and health-based standards violations and inventory characteristics. The most significant
under-reporting occurs in monitoring violations. Although monitoring violations are not themselves in the health-based violation category
covered by this performance measure, failures to monitor could mask treatment technique and MCL violations. Such under-reporting of
violations limits EPA's ability to: 1) accurately portray the percent of people affected by health-based violations, 2) target enforcement
oversight, 3) target program assistance to primacy agencies, and 4) provide information to the public on the safety of their drinking water
facilities.
4c. Third-Party Audits

N/A
 Record Last Updated: 02/13/2012 01:17:01 PM

  Goal 2                                              Objective 2                                         Measure bps
  Measure Code : bps - Number of TMDLs that are established or approved by EPA
  [Total TMDL] on a schedule consistent with national policy (cumulative). [A TMDL
  is a technical plan for reducing pollutants in order to attain water quality standards.
  The terms "approved" and  "established" refer to the completion and approval of
  the TMDL itself.]
  Office of Water (OW)
   1. Measure and DQR Metadata
   Goal Number and Title                          2 - Protecting America's Waters
   Objective Number and Title                     2 - Protect and Restore Watersheds and Aquatic Ecosystems
   Sub-Objective Number and Title                 1 - Improve Water Quality on a Watershed Basis
   Strategic Target Code and Title                1 - Attain water quality standards for all pollutants and impairments in more than 3,360 water bodies identified in 2002 as not attaining standards
   Managing Office                                Office of Wetlands, Oceans and Watersheds
   Performance Measure Term Definitions
   Performance Measure Term Definitions
TMDL: A Total Maximum Daily Load (TMDL) is a calculation of the maximum amount of a pollutant that a waterbody can receive and
still safely meet water quality standards. A TMDL is a technical plan for reducing pollutants in order to attain water quality standards. For
the purposes of this measure, each individual pollutant for which an allocation has been established/approved is counted as a TMDL.  The
development of TMDLs for an impaired waterbody is a critical step toward meeting water restoration goals.
TMDLs focus on clearly defined environmental goals and establish a pollutant budget, which is then implemented via permit requirements
or a wide variety of state, local, and federal programs (which may be regulatory, non-regulatory, or incentive-based, depending on the
program), as well as voluntary action by citizens.

TMDLs established/approved: The terms "approved" and "established" refer to the completion and approval of the TMDL itself. While
the majority of TMDLs are developed by states, territories, or authorized tribes, EPA in some instances may establish a TMDL if:
•  EPA disapproves TMDLs submitted by states, territories, or authorized tribes,
•  States, territories, or authorized tribes do not submit TMDLs in a timely manner,
•  EPA is required to do so pursuant to litigation settlements or judicial orders, or
•  States ask EPA to establish TMDLs for particular water bodies.

Schedule consistent with national policy: National policy states that TMDLs are typically established and approved within 8 to 13 years
of the water having been listed as impaired under Clean Water Act Section 303(d).  The "state pace" is the number of TMDLs needing to be
completed in a given state in a given fiscal year (these TMDLs may eventually be either developed by the state and approved by EPA, or
established by EPA). State pace is based on state litigation or other schedules or straight-line rates that ensure that national policy is met.
Regions collaborate with States to set targets for the number of TMDLs projected to be completed in a given fiscal year. EPA policy has
been that targets should be within 80 to 100% of the pace.
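A worked example may make the target-setting rule concrete. The sketch below, with invented numbers, checks whether a fiscal-year
target falls within the 80 to 100% band of the state pace.

def target_within_policy(target: int, state_pace: int) -> bool:
    """Return True if a fiscal-year TMDL target is within 80-100% of the state pace."""
    return 0.8 * state_pace <= target <= state_pace

print(target_within_policy(target=45, state_pace=50))  # True: 90% of pace
print(target_within_policy(target=30, state_pace=50))  # False: 60% of pace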

Cumulative trend information:

Background:
 • EPA and States have developed more than 49,000 TMDLs through FY 2011.
 • Projecting state TMDL production numbers several months in advance continues to be a challenge as resource constraints and technical
   and legal challenges still exist.  There has also been a notable shift toward the development of more difficult TMDLs  that  take more
   time and resources.
 • As TMDLs and other watershed-related activities are developed and implemented, waterbodies that were once impaired will meet water
   quality standards.  Thus these TMDL measures are closely tied to the program assessment measures WQ-SP10.N11 and WQ-SP-11,
   "Number of waterbody segments identified by States in 2002 as not attaining standards, where water quality standards are  now fully
   attained," and "remove the  specific causes of waterbody impairment identified by states in 2002."
 • The number of TMDLs needed to address outstanding causes of impairment changes with each 303(d) list cycle; therefore, a baseline as
   such is not appropriate for these measures.
 • For more information, please visit http://www.epa.gov/owow/tmdl/


 2. Data  Definition and Source  Reporting	
 2a. Original Data Source

 State-submitted and EPA-approved TMDLs or EPA-established TMDLs	
 2b. Source Data Collection
 State-submitted and EPA-approved TMDLs and EPA-established TMDLs are publicly reviewed during their development. Electronic and
 hard copies of state-submitted and EPA-approved TMDLs are made available by states and often linked to EPA Web sites. The Watershed
 Assessment, Tracking, and Environmental Results system allows search for TMDL documents at
 http://www.epa.gov/waters/tmdl/tmdl_document_search.html.
 Explanation:
 Office of Water Quality Management Plan. EPA requires that organizations prepare a document called a QMP that: documents  the
 organization's quality policy; describes its quality system; and identifies the environmental programs to which the quality system applies
 (e.g., those programs involved in the collection or use of environmental data).	
 2c. Source Data Reporting

 Relevant information from each TMDL is entered into the Assessment and Total Maximum Daily Load (TMDL) Tracking And
 ImplementatioN System (ATTAINS) data entry system and made available to the public via the web reports. See
 http://www.epa.gov/waters/ir.	


3. Information Systems and Data Quality Procedures	
3a. Information Systems
 The Assessment and Total Maximum Daily Load (TMDL) Tracking And ImplementatioN System (ATTAINS) is the database which
captures water quality information related to this measure. ATTAINS is an integrated system that documents and manages the connections
between state assessment and listing decisions reported under sections 305(b) and 303(d) (i.e., integrated reporting) and completed TMDLs.
This system holds information about assessment decisions and restoration actions across reporting cycles and over time until water quality
standards are attained. Annual TMDL totals by state, fiscal year, and pollutant are available at
http://iaspub.epa.gov/waters10/attains_nation_cy.control?p_report_type=T#APRTMDLS, and TMDL document searches can be conducted
at http://www.epa.gov/waters/tmdl/tmdl_document_search.html.  More information about ATTAINS can be found at
http://www.epa.gov/waters/data/prog.html and http://www.epa.gov/waters/ir/about_integrated.html.


The Watershed Assessment, Tracking, and Environmental Results System (WATERS) is used to provide water program information and
display it spatially using a geographic  information system (National Hydrography Dataset (NHD)) integrated with several of EPA's
existing databases.  These databases include the STOrage and RETrieval (STORET) database, the Assessment TMDL Tracking and
ImplementatioN System (ATTAINS),  the Water Quality Standards Database (WQSDB), and the Grants Tracking and Reporting System
(GRTS).  This water quality information was previously available only from several independent and unconnected databases. General
information about WATERS is available at: http://www.epa.gov/waters/, a system architecture diagram is available at:
http://www.epa.gov/waters/about/arch.html, and information about WATERS geographic data is available at:
http://www.epa.gov/waters/about/geography.html.
3b. Data Quality Procedures
QA/QC of data is provided by EPA Regional staff and through cross-checks of ATTAINS  information regarding impaired water listings,
consistent with the Office of Water Quality Management Plan (QMP). EPA requires that organizations prepare a document called a QMP
that:  documents the organization's quality policy; describes its  quality system; and identifies the environmental programs to which the
quality system applies (e.g., those programs involved in the collection or use of environmental data).	
3c. Data Oversight

 The Assessment and Watershed Protection Division Director is responsible for overseeing the source data reporting and information
systems.	
3d. Calculation Methodology
Additional information:  Internal reviews of data quality revealed some inconsistencies in the methodology of data entry between EPA
Regional Offices. In 2005 and 2006, EPA convened meetings of NTTS users to discuss how to improve the database. As a result, data
field definitions were clarified, the users' group was reinstituted, several training sessions were scheduled, and an ATTAINS redesign made
the necessary database upgrades.  One of the issues raised was the methodology used to count TMDLs. The previous methodology
generated a TMDL "count" based on the causes of impairment removed from the 303(d) impaired waters list as well as the TMDL
pollutant. EPA proposed to change the counting methodology to directly reflect only the pollutants given allocations in TMDLs.

A recent EPA Office of Inspector General review concurred with this recommendation. The proposed change was vetted during the
TMDL Program's annual meeting in March 2007 and implemented in August 2007, resulting in a cumulative net reduction of 1,577
TMDLs.
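The sketch below illustrates the counting convention adopted in 2007: one TMDL is counted for each pollutant given an allocation, rather
than for each impairment cause removed from the 303(d) list. The records and field names are hypothetical.

# Hypothetical TMDL documents; each pollutant given an allocation counts as one TMDL.
tmdl_documents = [
    {"waterbody": "Segment A", "pollutants_with_allocations": ["phosphorus", "sediment"]},
    {"waterbody": "Segment B", "pollutants_with_allocations": ["mercury"]},
]

tmdl_count = sum(len(d["pollutants_with_allocations"]) for d in tmdl_documents)
print(tmdl_count)  # 3 under the pollutant-allocation methodology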

Guidance:
Detailed measure reporting guidance can be found under the water quality sub-objective (WQ-8a) at
http://water.epa.gov/resource_performance/planning/FY-2012-NWPG-Measure-Definitions-Water-Quality.cfm
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The Headquarters point of contact for this measure works with Regions to address any questions and to ensure the TMDL information is correctly
entered into and made available to the public in ATTAINS.

The Branch Chief for the Watershed Branch (WB) is responsible for tracking and reporting on this measure.

4b. Data Limitations/Qualifications
To meet the increasing need for readily accessible CWA information, EPA continues to improve the database and oversee quality review of
existing data.  Data quality has been improving and will continue to improve as existing data entry requirements and procedures are being
re-evaluated and communicated with data entry practitioners.
4c. Third-Party Audits
USEPA, Office of the Inspector General.  2007.  Total Maximum Daily Load Program Needs Better Data and Measures to Demonstrate
Environmental Results.   Available at http://www.epa.gov/oig/reports/2007/20070919-2007-P-00036.pdf

USEPA, Office of the Inspector General.  2005.  Sustained Commitment Needed to Further  Advance the Watershed Approach.  Available
at http://www.epa.gov/oig/reports/2005/20050921-2005-P-00025.pdf

National Research Council, Committee to Assess the Scientific Basis of the Total Maximum Daily Load Approach to Water Pollution
Reduction. 2001.  Assessing the TMDL Approach to Water Quality Management.   Washington, DC: National Academy Press.
http://www.nap.edu/openbook.php?isbn=0309075793
 Record Last Updated: 02/13/2012 01:16:48 PM

  Goal 2                                               Objective 2                                            Measure L
  Measure Code :  L - Number of waterbody segments identified by States in 2002 as
  not attaining standards, where water quality standards are now fully attained
  (cumulative).
  Office of Water (OW)
   1. Measure and DQR Metadata
   Goal Number and Title                          2 - Protecting America's Waters
   Objective Number and Title                     2 - Protect and Restore Watersheds and Aquatic Ecosystems
   Sub-Objective Number and Title                 1 - Improve Water Quality on a Watershed Basis
   Strategic Target Code and Title                1 - Attain water quality standards for all pollutants and impairments in more than 3,360 water bodies identified in 2002 as not attaining standards
   Managing Office                                Office of Wetlands, Oceans and Watersheds
   Performance Measure Term Definitions
   Performance Measure Term Definitions
Waterbody segments:  A geographically defined portion of navigable waters, waters of the contiguous zone, and ocean waters under the
jurisdiction of the United States, including segments of rivers, streams, lakes, wetlands, coastal waters and ocean waters.

Identified by States in 2002 as not attaining standards:  In 2002, an estimated 39,503 water bodies were identified by states or EPA as
not meeting water quality standards.  These water bodies and water body segments were identified in state-submitted section 303(d) lists,
section 305(b) reports, and Integrated Reports, for the 2002 reporting cycle. (See EPA's guidance for such reporting under "303(d) Listing
of Impaired Waters Guidance" at http://www.epa.gov/owow/tmdl/guidance.html.) Impairments identified after 2002 are not considered in
counting waters under this measure; such impairments may be considered when revising this measure for future updates of the Strategic
Plan.

The universe for this measure, the estimated 39,503 water bodies identified by states or EPA as not meeting water quality standards in 2002,
is sometimes referred to as the "fixed base" or "SP-10 baseline." The universe includes all waters in categories 5, 4a, 4b, and 4c in 2002. Of
these waters, 1,703 are impaired by multiple pollutants including mercury, and 6,501 are impaired by mercury alone.

States: All 50 states.

Water quality standards are now fully attained:  Attaining water quality standards means that the water body is no longer impaired for
any of the causes identified in 2002, as reflected in subsequent state-submitted assessments and EPA-approved 303(d) lists. Impairment
refers to an "impairment cause" in state- or EPA-reported data, stored in ATTAINS (Assessment Total Maximum Daily Load (TMDL)
Tracking and Implementation System) or its predecessors NTTS (National TMDL Tracking System) or ADB (Assessment Database). (Any
water body listed as impaired in these data bases must have an impairment cause entered.) There are several reasons why EPA or states may
determine that specific waterbodies listed as impaired in 2002, the baseline year, are no longer impaired in the current reporting year. For
example, water quality might improve due to EPA or state actions to reduce point and nonpoint source discharges of pollutants. In other
cases, a state or EPA might conduct more robust monitoring studies and use these data to complete more accurate assessments of water
quality conditions. In some cases, a state might modify its water quality standards, in accordance with EPA's regulations, to update scientific
criteria or to better reflect the highest attainable conditions for its waters. Each of these examples represents a case where an impaired water
may no longer exceed water quality standards. Any such removals of waterbody impairments will be recorded based on reports from states
scheduled every two years through 2012.

Background:
• This is a cumulative measure, and it was first tracked in FY 2007.  The FY 2007 target was 1,166; actual results were 1,409. The FY 2008
target was 1,550; actual results were 2,165. The FY 2009 target for this measure was 2,270; the actual result was 2,505.  The FY 2010
target for this measure was 2,809; the actual result was 2,909.  The FY 2011 target for this measure was 3,073; the actual results were 3,119.
 2. Data Definition and Source  Reporting
 2a. Original Data Source
 Regional EPA staff, who review and approve states' 303(d) lists.
 2b. Source Data Collection
 Approval and Review of 303(d) lists by regional EPA staff:  EPA reviews and approves state determinations that waterbody segments have
 fully attained standards. The primary data source is state 303(d) lists of their impaired waterbodies needing development of TMDLs, and
 required submittals of monitoring information pursuant to section 305(b) of the Clean Water Act. These lists/reports are submitted each
 biennial reporting cycle. EPA regional staffs interact with the states during the process of approval of the lists to ensure the integrity of the
 data, consistent with the Office of Water Quality Management Plan (QMP), which governs EPA review and approval.

 State 303(d) submissions:
 States submit 303(d) lists of impaired waterbodies needing development of TMDLs and monitoring information pursuant to section 305(b)
 of the Clean Water Act. States prepare lists/reports using actual water quality monitoring data, probability-based monitoring information,
 and other existing and readily available information and knowledge the state has, in order to make comprehensive determinations
 addressing the total extent of the state's waterbody impairments.  States exercise considerable discretion in using monitoring data and other
 available information to make  decisions about which waters meet their designated uses in accordance with state water quality standards.

 States employ various analytical methods of data collection, compilation, and reporting including:
 1) Direct water samples of chemical, physical, and biological parameters;

2) Predictive models of water quality standards attainment;
3) Probabilistic models of pollutant sources; and
4) Compilation of data from volunteer groups, academic interests and others. EPA-supported models include BASINS, QUAL2E,
AQUATOX, and CORMIX. (Descriptions of these models and instructions for their use can be found at
http://www.epa.gov/waterscience/models/.)

Most states have provided this information in Integrated Reports, pursuant to EPA guidance.  An Integrated Report is a biennial state
submittal that includes the state's findings on the status of all its assessed waters (as required under section 305(b) of the Clean Water Act),
a listing of its impaired waters and the causes of impairment, and the status of actions being taken to restore impaired waters (as required
under section 303(d)).

QA/QC of data provided by states pursuant to individual state 303(d) lists (under CWA Section 303(d)) and/or Integrated 305(b)/303(d)
Reports is dependent on individual state procedures. EPA enhanced two existing data management tools (STORET and the National
Assessment Database) so that they include documentation of data quality information.

EPA released the Water Quality Exchange (WQX) which provides data exchange capability to any organization that generates data of
documented quality and would like to contribute that data  to the national STORET data warehouse so that their data may be used in
combination with other sources of data to track improvements in individual watersheds. Currently data providers must transmit data and
required documentation through their own Exchange Network node. EPA rolled out a web data entry tool called WQXweb for users who
have not invested in the node technology.


2c. Source Data Reporting
Once EPA approves a state's 303(d) list, the information is entered into EPA's Assessment, TMDL Tracking, and Implementation System
(ATTAINS). After approving a state's 303(d) list, EPA reviews waterbodies in the 2002 universe to determine progress in achieving annual
commitments for this measure (coded as SP-10). A waterbody may  be counted under this measure when it attains water quality standards
for all impairments identified in the 2002 reporting cycle,  as reflected in subsequent Integrated Reports.  Impairments that are identified in
later Integrated Reports are not considered for this measure.

Waters that are delisted for the following reasons can be counted toward meeting this measure:

Delisting Reason in ATTAINS                                                          Can Removal of Impairment Cause Be Used in Reporting Under SP-10?
8. Applicable WQS attained; due to restoration activities                            YES
9. Applicable WQS attained; due to change in WQS                                     YES
10. Applicable WQS attained; according to new assessment method.                     YES
11. Applicable WQS attained; threatened water no longer threatened.                  YES
12. Applicable WQS attained; reason for recovery unspecified.                        YES
13. Applicable WQS attained; original basis for listing was incorrect.               YES
14. Data and/or information lacking to determine water quality status; original     YES
    basis for listing was incorrect.

Results for this performance measure are then entered by the EPA Regional Offices into EPA's Annual Commitment System (ACS).
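The counting rule above can be illustrated with a short sketch: a waterbody counts toward SP-10 only when every impairment cause
identified in 2002 has been removed for one of the qualifying delisting reasons (8 through 14). The data structure and records below are
hypothetical.

QUALIFYING_REASONS = {8, 9, 10, 11, 12, 13, 14}  # delisting reasons usable for SP-10

# Hypothetical 2002 baseline: waterbody -> list of (impairment cause,
# delisting reason, or None if the cause is still listed).
baseline_2002 = {
    "Segment A": [("pathogens", 8), ("sediment", 12)],
    "Segment B": [("mercury", None)],
}

sp10_count = sum(
    1
    for causes in baseline_2002.values()
    if all(reason in QUALIFYING_REASONS for _, reason in causes)
)
print(sp10_count)  # 1: only Segment A has all its 2002 causes removed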
Guidance Documents
•       The Office of Water has been working with states to improve the guidance under which 303(d) lists are prepared. In 2005, EPA
issued listing guidance entitled Guidance for 2006 Assessment, Listing, and Reporting Requirements Pursuant to Sections 303(d), 305(b),
and 314 of the Clean Water Act.  This document provided a comprehensive compilation of relevant guidance EPA had issued to date
regarding the Integrated Report. It included some specific changes from the 2004 guidance. For example, the 2006 Integrated Report
Guidance provided greater clarity on the content and format of those components of the Integrated Report that are recommended and
required under Clean Water Act sections 303(d), 305(b), and 314. The guidance also gave additional clarity and flexibility on reporting
alternatives to TMDLs for attaining water quality standards (e.g., utilization of reporting Category 4b). Available at:
http://www.epa.gov/owow/tmdl/2006IRG.
•       In 2008, USEPA's Office of Water published Information Concerning 2008 Clean Water Act Sections 303(d), 305(b), and 314
Integrated Reporting and Listing Decisions.  Available at http://www.epa.gov/owow/tmdl/2008_ir_memorandum.html.
•       In May 2009, EPA released Information Concerning 2010 Clean Water Act Sections 303(d), 305(b), and 314 Integrated Reporting
and Listing Decisions.  Available at www.epa.gov/owow/tmdl/guidance/final52009.pdf
•   EPA issued a 2010 Integrated Report clarification memo (released May 5, 2009, available at
http://www.epa.gov/owow/tmdl/guidance/final52009.html), which includes suggestions for the use of the rotating basin approach and
Category 3, circumstances and expectations for "partial approval/further review pending" determinations, and using and reporting on
Statewide Statistical Survey Data in ATTAINS and the National Water Quality Inventory Report to Congress.
•   The Consolidated Assessment and Listing Methodology - Toward a Compendium of Best Practices (released on the Web July 31,
2002, at www.epa.gov/owow/monitoring/calm.html) is intended to facilitate increased consistency in monitoring program design and the data
and decision criteria used to support water quality assessments.
The Office of Water (OW) and EPA's Regional Offices have developed the Elements of a State Water Monitoring and Assessment
Program (March 2008). This guidance describes ten elements that each state water quality monitoring program should contain and directs
states to develop monitoring strategies that propose time-frames for implementing all ten elements.
 • Reporting guidelines for this measure can be found under the water quality sub-objective (SP-10 code) at:
   http://water.epa.gov/resource_performance/planning/FY-2012-NWPG-Measure-Definitions-Water-Quality.cfm#Measure%20Code_%20WQ_SP10_N11
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
The Assessment and Total Maximum Daily Load (TMDL) Tracking And ImplementatioN System (ATTAINS) is the database which
captures water quality information related to this measure. ATTAINS is an integrated system that documents and manages the connections
between state assessment and listing decisions reported under sections 305(b) and 303(d) (i.e., integrated reporting) and completed TMDLs.
This system holds information about assessment decisions and restoration actions across reporting cycles and over time until water quality
standards are attained.  Annual TMDL totals by state, fiscal year, and pollutant are available at
http://iaspub.epa.gov/waters10/attains_nation_cy.control?p_report_type=T#APRTMDLS, and TMDL document searches can be conducted
at http://www.epa.gov/waters/tmdl/tmdl_document_search.html.  More information about ATTAINS can be found at
http://www.epa.gov/waters/data/prog.html and http://www.epa.gov/waters/ir/about_integrated.html.


The Watershed Assessment, Tracking, and Environmental  Results System (WATERS) is used to provide water program information and
display it  spatially using a geographic  information system (National Hydrography  Dataset (NHD)) integrated with several of EPA's
existing databases.  These databases include the  STOrage and RETrieval (STORET) database, the Assessment TMDL Tracking and
ImplementatioN System (ATTAINS), the Water Quality Standards Database  (WQSDB), and the Grants Tracking and Reporting System
(GRTS).  This water quality information was previously available only from several independent and unconnected databases.  General
information about WATERS is available at: http://www.epa.gov/waters/, a system architecture diagram is available at:
http://www.epa.gov/waters/about/arch.html, and information about WATERS geographic data is available at:
http://www.epa.gov/waters/about/geography.html.
3b. Data Quality Procedures
Water Management Divisions in EPA Regional Offices have responsibility for oversight, review, and quality assurance of the performance
data reported to EPA by the original data source, which is the individual states.
3c. Data Oversight
(1) Source Data Reporting: Water Management Divisions in the EPA Regional Offices. (2) Information Systems Oversight: System
Manager for ATTAINS; System Manager for WATERS	
3d. Calculation Methodology
While ATTAINS is the repository for 303(d) lists and 305(b) reports, it is not yet used for tracking performance and success for this
measure. EPA is continuing to work to address discrepancies between Regional data records and ATTAINS.


USEPA, 2008, EPA's 2008 Report on the Environment (Final Report).  http://cfpub.epa.gov/ncea/cfm/recordisplay.cfm?deid=190806

USEPA. 2003. Draft Report on the Environment 2003. EPA 260-R-02-006. Available at
http://nepis.epa.gov/Exe/ZyNET.exe/500001GN.TXT?ZyActionD=ZyDocument&Client=EPA&Index=2000+Thru+2005&File=D%3A%5
CZYFILES%5CINDEX+DATA%5C00THRU05%5CTXT%5C00000006%5C500001GN.TXT&User=anonymous&Password=anonymous
&ImageQuality=r85g16%2Fr85g16%2Fx150y150g16%2Fi500&Display=hpfrw&Back=ZyActionS&MaximumPages=5&Query=fname%
3D%22500001GN.TXT%22

USEPA, Office of the Chief Financial Officer. 2003.  2003-2008 Strategic Plan: Direction for the Future . This and other past  Strategic
plans can be found at:  http://epa.gov/planandbudget/archive.html.
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting	

The Deputy Director of the Assessment and Watershed Protection Division is responsible for overseeing final reporting of this measure.

4b. Data Limitations/Qualifications	

Delays are often encountered in state 303(d) lists and 305(b) submissions, and in EPA's approval of the 303(d) portion of these biennial
submissions. EPA encourages states to effectively assess their waters and make all necessary efforts to ensure the timely submittal of
required Clean Water Act Section 303(d) impaired waters lists. EPA will continue to work with states to facilitate accurate, comprehensive,
and georeferenced data submissions. Also, EPA is heightening efforts to ensure expeditious review of the 303(d) list submissions with
national consistency. Timely submittal and EPA review of integrated reports is important to demonstrate state and EPA success in
accomplishing Strategic Plan goals for water quality.

Data may not precisely represent the extent  of impaired waters because states do not employ a monitoring design that monitors all their
waters. States, territories and tribes collect data and information on only a portion of their waterbodies. States do not use a consistent suite
of water quality indicators to assess attainment of water quality standards. For example, indicators of aquatic life use support range from
biological community assessments to levels  of dissolved oxygen to concentrations of toxic pollutants. These variations in state practices
limit how the CWA Sections 305(b) reports  and the 303(d) lists provided by states can be used to describe water quality at the national
level. There are also differences among sampling techniques and standards.

State assessments of water quality may include uncertainties associated with derived or modeled data. Differences in monitoring designs
among and within states prevent the agency from aggregating water quality assessments at the national level with known statistical
confidence. States, territories, and authorized tribes monitor to identify problems, and lag times between data collection and
reporting can vary by state.

Additionally, states exercise considerable discretion in using monitoring data and other available information to make decisions about
which waters meet their designated uses in accordance with state water quality standards. EPA then aggregates these various state decisions
to generate national performance measures.

Impact of Supplemental Funding.  In FY 2010 and under the FY 2011 continuing resolution, the program that this measure supports
received funding from the American Recovery and Reinvestment Act (ARRA). Results from that funding will be reflected in this measure,
because they cannot easily be separated from results related to other EPA funding.
4c. Third-Party Audits
Independent reports have cited the ways in which weaknesses in monitoring and reporting of monitoring data undermine EPA's ability to
depict the condition of the Nation's waters and to support scientifically sound water program decisions. The most recent reports include the
following:
•       USEPA, Office of the Inspector General. 2009. EPA Needs to Accelerate Adoption of Numeric Nutrient Water Quality
Standards.   Available at www.epa.gov/oig/reports/2009/20090826-09-P-0223.pdf
•       USEPA, Office of the Inspector General. 2007. Total Maximum Daily Load Program Needs Better Data and Measures to
Demonstrate Environmental Results.  Available at http://www.epa.gov/oig/reports/2007/20070919-2007-P-00036.pdf.
•       Government Accountability Office. 2003.  Water Quality: Improved EPA Guidance and Support Can Help States Develop
Standards That Better Target Cleanup Efforts . GAO-03-308. Washington, DC.  www.gao.gov/new.items/d03308.pdf
•       Government Accountability Office. 2002.  Water Quality: Inconsistent State Approaches Complicate Nation's Efforts to Identify
its Most Polluted Waters. GAO-02-186. Washington, DC. www.epa.gov/waters/doc/gaofeb02.pdf
•       Government Accountability Office. 2000.  Water Quality: Key EPA and State Decisions Limited by Inconsistent and Incomplete
Data . GAO-RCED-00-54. Washington, DC.  www.gao.gov/products/RCED-00-54

In response to these evaluations, EPA has been working with states and other stakeholders to improve: 1) data coverage, so that state
reports reflect the condition of all waters of the state; 2)  data consistency to facilitate comparison and aggregation of state data to the
national level; and 3) documentation so that data limitations and discrepancies are fully understood by data users. EPA has taken several
steps in an effort to make these improvements:

First, EPA enhanced two existing data management tools (STORET and the National Assessment Database) so that they include
documentation of data quality information.

Second, EPA has developed a GIS tool called WATERS that integrates many databases including  STORET, ATTAINS, and a water quality
standards database. These integrated databases facilitate comparison and understanding of differences among state standards, monitoring
activities, and assessment results.

Third, EPA and states have developed guidance. The 2006 Integrated Report Guidance (released August 3, 2005 at
http://www.epa.gov/owow/tmdl/2006IRG) provides comprehensive direction to states on fulfilling reporting requirements of Clean Water
Act sections 305(b) and 303(d). EPA also issued a 2010 Integrated Report clarification memo (released May 5, 2009, available at
http://www.epa.gov/owow/tmdl/guidance/final52009.html), which includes suggestions for the use of the rotating basin approach and
Category 3, circumstances and expectations for "partial approval/further review pending" determinations, and using and reporting on
Statewide Statistical Survey Data in ATTAINS and the National Water Quality Inventory Report to Congress.

Also, the Consolidated Assessment and Listing Methodology - Toward a Compendium of Best Practices (released on the Web July 31,
2002, at www.epa.gov/owow/monitoring/calm.html) is intended to facilitate increased consistency in monitoring program design and the data
and decision criteria used to support water quality assessments.

Fourth, the Office of Water (OW) and EPA's Regional Offices have developed the Elements of a State Water Monitoring and Assessment
Program (March 2008). This guidance describes ten elements that each  state water quality monitoring program should contain and directs
states to develop monitoring strategies that propose time-frames for implementing all ten elements.  (USEPA, Office of Water. 2003.
Elements of a State Water Monitoring and Assessment Program. EPA 841-B-03-003. Washington, DC. Available at
www.epa.gov/owow/monitoring/elements/.)
 Record Last Updated: 02/13/2012 01:16:48 PM

  Goal 3                                         No Associated Objective                                   Measure HC1
  Measure Code : HC1 - Percentage of planned research products completed on time
  by the Safe and Healthy Communities research program.
  Office of Research and Development (ORD)
   1. Measure and DQR Metadata
   Goal Number and Title                         3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title
   Sub-Objective Number and Title	
   Strategic Target Code and Title	
   Managing Office                                Office of Program Accountability and Resource Management- Planning
   Performance Measure Term Definitions
A research product is "a deliverable that results from a specific research project or task. Research products may require translation or
synthesis before integration into an output ready for partner use."

 This secondary performance measure tracks the timely completion of research products.

Sustainability Research Strategy, available from: http://epa.gov/sciencematters/april2011/truenorth.htm

http://www.epa.gov/risk_assessment/health-risk.htm


 2. Data Definition and Source Reporting	
 2a. Original Data Source
 EPA and its partners confirm the schedule for completing research outputs and products that are transformed or synthesized into outputs.
 ORD tracks progress toward delivering the outputs; clients are notified of progress. Scheduled milestones are compared to actual progress
 on a quarterly basis. At the end of the fiscal year, outputs are either classified as "met" or "not met" to determine the overall percentage of
 planned products that have been met by the research program. The actual product completion date is self-reported.
 2b. Source Data Collection

 Each output is assigned to a Lab or Center representative before the start of the fiscal year.  This individual provides quarterly status
 updates via ORD's Resource Management System. Status reports are reviewed by senior management, including the Lab or Center
 Director and National Program Director. Overall status data is generated and reviewed by ORD's Office of Program Accountability and
 Resource Management.
 2c. Source Data Reporting
Quarterly status updates are provided via ORD's Resource Management System.
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
Internal database or internal tracking system such as the Resources Management System (RMS).
3b. Data Quality Procedures
EPA and its partners confirm the schedule for completing research outputs and products that are transformed or synthesized into outputs.
ORD tracks progress toward delivering the outputs; clients are notified of progress. Scheduled milestones are compared to actual progress
on a quarterly basis. At the end of the fiscal year, outputs are either classified as "met" or "not met" to determine the overall percentage of
planned products that have been met by the program.	
3c. Data Oversight	

The National Program Director oversees the source data reporting, specifically, the process of establishing agreement with program
stakeholders and senior ORD managers on the list and content of the planned products, and subsequent progress, completion, and delivery
of these products.
3d. Calculation Methodology
At the end of the fiscal year, outputs are either classified as "met" or "not met". An overall percentage of planned products met by the
program is reported.	
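The end-of-year roll-up is a straightforward proportion; the sketch below, with hypothetical statuses, illustrates the calculation.

statuses = ["met", "met", "not met", "met"]  # hypothetical end-of-year classifications

percent_met = 100 * statuses.count("met") / len(statuses)
print(f"{percent_met:.0f}% of planned products completed on time")  # 75%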
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The Office of Program Accountability and Resource Management is responsible for reporting program progress toward its target of
completing 100% of planned program products.

4b. Data Limitations/Qualifications
This measure does not directly capture the quality or impact of the research products.
4c. Third-Party Audits

Not applicable
 Record Last Updated: 02/13/2012 01:16:50 PM

  Goal 3                                               Objective 1                                          Measure B29
  Measure Code :  B29 - Brownfield properties assessed.
  Office of Solid Waste and Emergency Response (OSWER)
   1. Measure and DQR Metadata	
   Goal Number and Title                          3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title                      1 - Promote Sustainable and Livable Communities
   Sub-Objective Number and Title                   2 - Assess and Cleanup Brownfields
   Strategic Target Code and Title                    1 - By 2015, conduct environmental assessments at 20,600 (cumulative) brownfield properties
   Managing Office                                 Brownfields
   Performance Measure Term Definitions
 Properties Assessed — Number of properties that have been environmentally assessed for the first time using EPA Brownfields funding.

A property will be counted for this measure if the property has not previously been counted for this annual performance measure as a result
of other assessments completed with regular EPA Brownfields funding.

A "property" is defined as a contiguous piece of land under unitary ownership.  A property may contain several smaller components, parcels
or areas.

"Assessments" can consist of a Phase I assessment, Phase II assessment, and/or supplemental assessments. Assessments are deemed
complete when the reports for those assessments are deemed complete.

A Phase I assessment report is final when an environmental professional or state official has signed and dated the report as required in the
final rule (see 40 CFR 312.21(c)).

For Phase II, the report is final when an environmental professional or state official has prepared an environmental assessment report that
has been accepted by the grant recipient.
For a supplemental assessment, the report is considered final when it has been accepted by the cooperative agreement recipient.

For additional information: http://www.epa.gov/brownfields/index.html


 2. Data Definition  and Source Reporting	
 2a. Original Data Source

Assessments are funded either through cooperative agreements, or through EPA contracts (for Targeted Brownfields Assessments (TBAs)).
Cooperative agreement recipients (or sub-recipients) and contractors submit performance data to EPA in quarterly reports, and property
profile reports. On a limited basis EPA personnel are allowed to update or supplement information when a cooperative agreement has been
closed and outcomes have been reported to EPA.	
2b. Source Data Collection

Field sampling is utilized during the assessment process to determine cleanup needs and to develop assessment reports. Formal completion
of assessment reports is tabulated for this measure. Data collection is ongoing as projects are implemented.  Reporting instructions indicate
that accomplishments are to be recorded as they occur.

Assessment Pathways - Assessments meeting this definition can be completed using funds from an Assessment Award, a Targeted
Brownfields Assessment (TBA), or activities funded by 128(a) awards.


Geographic Detail:  As of FY12, ACRES leverages a Google Maps application within the system to assign geocoordinates based on address
information. Any deviation from these coordinates requires a manual override by the reporting party.


All the Brownfields cooperative agreements have a QA term and condition.  Project-level QA documents (i.e. QAPPs) are a minimum
requirement for EPA funding of Brownfields activities which include environmental data collection.  The program prepares and provides
the QA term and condition to the regional  offices and requires them to include it in the cooperative agreements.  The QA term and
condition for Brownfields Assessment cooperative agreements reads as follows:

"B. Quality Assurance (QA) Requirement.  1. When environmental samples are collected as part of the brownfields assessment, the
CAR shall comply with 40 CFR Part 31.45 requirements to develop and implement quality assurance practices sufficient to produce data
adequate to meet project objectives and to minimize data loss. State law may impose additional QA requirements. "

EPA contractors conducting Targeted Brownfields Assessments should develop site-specific Quality Assurance Project Plans (QAPPs) for
environmental assessment activities or a site-specific QAPP addendum if a Generic QAPP has already been approved for assessment
activities. The EPA requires all environmental monitoring and measurement efforts be conducted in accordance with approved QAPPs.
The purpose of the QAPP is to document the project planning process, enhance the credibility of sampling results, produce data of known
quality, and potentially save time and money by gathering data that meets the needs of the project and intended use of the data. The QAPP
is a formal document describing in comprehensive detail the necessary QA/QC and other technical activities that must be conducted to
ensure the results of the work performed will satisfy performance criteria and can be used for their intended  purposes.  All QA/QC
procedures  shall be in accordance with applicable professional technical standards, EPA requirements, government regulations and
guidelines,  and specific project goals and requirements.

OSWER has available the following guidance: "Quality Assurance Guidance for Conducting Brownfields Assessments." EPA
540-R-98-038. 1998.
Quality Assurance Guidance for Conducting Brownfields Assessments 1998.pdf
2c. Source Data Reporting
Cooperative agreement recipients (or sub-recipients) and contractors submit performance data to EPA in quarterly reports, and property
profile reports. A Property Profile Form (PPF) collects information (environmental, historical, physical) from a property-specific
investigation funded under the Brownfields Program.

Cooperative agreement recipients have three submission options: complete and submit the Property Profile Form (PPF) in the online format
connected to the Assessment, Cleanup and Redevelopment Exchange System (ACRES) database; fill out a PPF version in Microsoft Excel
format and submit it via e-mail or regular mail to the EPA Regional Representative; or, for multiple properties (more than ten), fill out a
multi-property Excel spreadsheet and submit it via e-mail or regular mail to the EPA Regional Representative. Any paper forms are
entered into ACRES by an EPA contractor.

The Property Profile Form is an approved OMB form - OMB No. 2050-0192. Online forms available to registered users here:
http://www.epa.gov/brownfields/pubs/index.html.
EPA contractors conducting TBAs provide the assessment report to the EPA Region, which in turn enters the data into ACRES.  In some
cases, the contractor will also provide a filled-out PPF.

In accordance with the Terms and Conditions of the Brownfields Cooperative Agreements, all Brownfields cooperative agreement
recipients (CARs) must report accomplishments to EPA on a quarterly basis. Quarterly reports are due 30 days from the end of the federal
fiscal quarter.
 2009_multiple_ppf_template_external.xls  2009_property_profile_form_instructions.pdf


 2009_property_profile_form.xls	
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
Assessment, Cleanup, and Redevelopment Exchange System (ACRES). This database is the master database of all data supporting
OBLR measures. Recipients and EPA report directly into ACRES. It includes source data and transformed data (e.g., data aggregated into
Regional totals ). http://www.epa.gov/brownfields/pubs/acres/index.htm provides more information about this database.

ACRES quality is assured by adherence to a security plan and quality management plan:
— Security plan.   The latest version of the security plan for ACRES is dated 11/2009. Contact Ryan Smith in OSWER for a copy of the
security plan.

— Quality Management Plan.  ACRES operates under its own Quality Management Plan (Data Quality Management Plan for the
Assessment, Cleanup, and Redevelopment Exchange System, Version 1.02), which is updated annually and was most recently updated as
of 2/2010. Contact Ryan Smith for the most recent copy of the QMP.

OSWER Performance Assessment Tool (PAT).  This tool serves as the primary external servicing resource for organizing and reporting
OSWER's performance data; it collects information from OSWER program systems and conforms it for uniform reporting and data
provisioning. PAT captures data from CERCLIS; replicates business logic used by CERCLIS for calculating measures; delivers that data
to EPA staff and managers via a business intelligence dashboard interface for analytic and reporting use; and transmits data to BAS. No
system specifications document is currently available for PAT, but one will be provided when available. Contact Lisa Jenkins in
OSWER regarding questions about PAT.

PAT operates under the OSWER Quality Management Plan (QMP), attached.
OSWER QMP printed 2010-03-23.pdf

PAT has a security certification confirming that a separate security policy is not necessary because no sensitive data are handled and PAT
is built upon the Oracle-based business intelligence system. PAT's security certification indicates that it follows all security guidelines for
EPA's Oracle Portal and that PAT (1) is not defined as a "Major Application" according to NIST Special Publication 800-18, Guide for
Developing Security Plans for Information Technology Systems, section 2.3.1; (2) does not store, process, or transmit information whose
sensitivity is assessed as high considering the requirements for availability, integrity, and confidentiality according to NIST Special
Publication 800-18, section 3.7.2; and (3) is not covered by EPA Order 2100.2A1, Information Technology Capital Planning and
Investment Control (CPIC).  The security certification, attached, was submitted on 9/11/2008.
 PAT SecurityCertificationNIST.doc

Budget Automation System (BAS). BAS is the final repository of the performance values.
3b. Data Quality Procedures
Data reported by cooperative award agreement recipients are reviewed by EPA Regional grant managers for accuracy, to verify activities
and accomplishments, and to ensure appropriate interpretation of performance measure definitions.


Step 1. Performance measure data are entered into ACRES by recipients and/or an EPA HQ contractor (for data submitted by recipients in
an alternate format, such as hard copy). For each cooperative agreement recipient, all data entered are signed off by the EPA Regional
Representative (Regional Project Officer) identified in the terms and conditions of the cooperative agreement. For contractors, the EPA
Regional COR/WAM signs off on the data.

Step 2. Each Region conducts Regional level review of data from the ACRES system. Rejected data must be edited by the original data
source. Approved data proceed to Step 3.

Step 3. HQ conducts national-level review (EPA HQ contractors) of data approved by regions. Rejected data must be edited by the region
(Step 2). Approved data are stored in ACRES.

Step 4. Each quarter, OSWER Performance Assessment Tool (PAT) database pulls the approved data (performance measure) from
ACRES.

Step 5. Headquarters approves PAT results, and PAT pushes results into ACS/Measures Central.

Step 6. ACS/Measures Central aggregates Regional data into a national total.  The OBLR reporting lead reviews and confirms the result.
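
Steps 1 through 6 amount to a tiered review pipeline in which a record advances only on sign-off and a rejection returns it to the prior level for correction. The following schematic sketch models that flow; the state names and record structure are illustrative only and are not the actual ACRES/PAT data model.

    # Schematic model of the Steps 1-6 review flow. A record advances on
    # approval (entry -> Regional review -> HQ review -> ACRES -> PAT ->
    # ACS/Measures Central) and moves back one level on rejection.
    STEPS = ["entered", "regional_review", "national_review",
             "stored_in_acres", "pulled_by_pat", "aggregated_in_acs"]

    def advance(record, approved):
        """Move a record forward on approval; send it back for edits on rejection."""
        i = STEPS.index(record["state"])
        if approved:
            record["state"] = STEPS[min(i + 1, len(STEPS) - 1)]
        else:
            record["state"] = STEPS[max(i - 1, 0)]
        return record

    rec = {"id": "property-001", "state": "entered"}
    for decision in (True, True, True, True, True):   # approved at every level
        rec = advance(rec, decision)
    print(rec["state"])                               # aggregated_in_acs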

3c. Data Oversight	

Headquarters-level oversight is provided by the EPA Contract Officer Technical Representative (COTR).

There is a Regional Project Officer assigned to each cooperative agreement. That Regional Project Officer is responsible for reviewing for
completeness and correctness all data provided by cooperative agreement recipients and data related to Targeted Brownfields Assessment
(TBA) contracts; their data is reviewed at the Headquarters level.  A list of Regional Project Officers is maintained by the Regional
Brownfields Coordinator in each region.


Each region also has a data manager (some Regions have a SEE Employee as their data manager).  The data manager's responsibility is to
disseminate information about ACRES system updates and accomplishment updates, and this person serves as the regional point of contact
for data-related issues.
3d. Calculation Methodology
"Number of Brownfields properties assessed" is an aggregate of properties assessed using funding from Assessment Grants, Regional TEA
funds, and State and Tribal 128 Voluntary Response Program funding.

The unit of measure is "Properties"	
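
Because a property is counted only the first time it is assessed (see the definition in section 1), the aggregation is effectively a count of unique properties across the funding pathways. A minimal sketch, with hypothetical property IDs:

    # Count each property once, the first time it is assessed, regardless
    # of funding pathway. Property IDs below are hypothetical.
    assessments = [
        ("prop-001", "Assessment Grant"),
        ("prop-002", "Regional TBA"),
        ("prop-001", "128(a) VRP"),        # already counted; ignored
        ("prop-003", "Assessment Grant"),
    ]

    counted = {prop_id for prop_id, _pathway in assessments}
    print(len(counted))                    # 3 properties assessed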
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting

The ACRES Project Manager is responsible for reporting accomplishments and program results recorded via ACRES.

4b. Data Limitations/Qualifications

There are some known limitations related to much of the data being recipient-reported.  Regional Project Officers review data to minimize
errors (as described above), but some known quality issues remain. Most pertinent to this measure, outcome data are sometimes not
reported by recipients when EPA funding expires before the work is complete (for instance, if EPA funding is only part of the funding used
for an assessment or cleanup).

Given the reporting cycle and the data entry/QA period, there is typically a lag of several months before reported data appear in ACRES.
4c. Third-Party Audits
No external reviews.
 Record Last Updated: 02/13/2012 01:16:47 PM
-------
  Goal 3                                               Objective 1                                          Measure B33
  Measure Code :  B33 - Acres of Brownfields properties made ready for reuse.
  Office of Solid Waste and Emergency Response (OSWER)
   1. Measure and DQR Metadata	
   Goal Number and Title                          3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title                      1 - Promote Sustainable and Livable Communities
   Sub-Objective Number and Title                   2 - Assess and Cleanup Brownfields
   Strategic Target Code and Title                    2 - By 2015, make an additional 17,800 acres of brownfield properties ready for reuse
   Managing Office                                  Brownfields
   Performance Measure Term Definitions
Acres Made Ready for Reuse - Acres associated with properties benefiting from EPA Brownfields funding that have been assessed and
determined not to require cleanup, or where cleanup has been completed and institutional controls are in place, if required, as reported by
cooperative agreement recipients.

This typically occurs when one of the following conditions applies:

1. A clean or no further action letter (or its equivalent) has been issued by the state or tribe under its voluntary response program (or its
equivalent) for cleanup activities at the property; or

2. The cooperative agreement recipient or property owner, upon the recommendation of an
environmental professional, has determined and documented that on-property work is finished. Ongoing operation and maintenance
activities or monitoring may continue after a cleanup completion designation has been made.

Note: a property can be counted under this measure if an assessment is completed and results in a determination of no further cleanup being
required.

A "property" is defined as a contiguous piece of land under unitary ownership.  A property may contain several smaller components, parcels
or areas.

For additional  information: http://www.epa.gov/brownfields/index.html


 2. Data Definition and Source Reporting	
 2a. Original Data Source

Assessments and Cleanups are funded either through cooperative agreements, or through EPA contracts (for Targeted Brownfields
Assessments (TBAs)). Cooperative agreement recipients (or sub-recipients) and contractors submit performance data to EPA in quarterly
reports, and property profile reports. On a limited basis EPA personnel are allowed to update or supplement information when a
cooperative agreement has been closed and outcomes have been reported to EPA.	
2b. Source Data Collection

 Data collection may involve tabulation of records and review of field surveys that identify acreage. The program does not require or recommend a
specific land surveying protocol for determining acreage. Data collection is ongoing as projects are implemented.  Reporting instructions
indicate that accomplishments are to be recorded as they occur.

Acres Made Ready for Reuse can be achieved by conducting assessment and/or cleanup activities via an Assessment, Revolving Loan Fund or
Cleanup (ARC) award, a Targeted Brownfields Assessment (TBA), or 128(a) funding used for site-specific activities.

Conditions for counting "Acres Made Ready for Reuse" above and beyond the completion of the funded activity (summarized in the sketch after this list):

Under assessment activities:
-If neither cleanup nor Institutional Controls (ICs) are required, then the acres are ready for reuse.
-If ICs are required and they are in place, but cleanup is not required, then the acres are ready for reuse.
-If cleanup is required and later conducted (where EPA funds the assessment activity but does not fund cleanup), then the property associated with the
original assessment is considered ready for reuse.

Under cleanup activities:
-If cleanup is required and completed and ICs are not required, then the acres are ready for reuse.
-If cleanup is required and completed and ICs are required and they are in place, then the acres are ready for reuse.
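
These conditions collapse into a single predicate: acres are ready for reuse once any required cleanup is complete and any required ICs are in place. A minimal sketch of that logic (the flag names are illustrative, not ACRES field names):

    # Apply the assessment/cleanup conditions listed above.
    def ready_for_reuse(cleanup_required, cleanup_completed,
                        ics_required, ics_in_place):
        if cleanup_required and not cleanup_completed:
            return False                   # cleanup still outstanding
        if ics_required and not ics_in_place:
            return False                   # required controls not yet in place
        return True                        # assessed clean, or fully addressed

    print(ready_for_reuse(False, False, False, False))  # True: no cleanup or ICs needed
    print(ready_for_reuse(True, True, True, False))     # False: required ICs missing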

Geographic Detail: As of FY12, ACRES leverages a Google Maps application within the system to assign geocoordinates based on address
information. Any deviation from these coordinates requires a manual override by the reporting party.

All the Brownfields cooperative agreements have a QA term and condition. Project-level QA documents (i.e. QAPPs) are a minimum requirement for
EPA funding of Brownfields activities which include environmental data collection. The program prepares and provides the QA term and condition to
the regional offices and requires them to include it in the cooperative agreements.  The QA term and condition for Brownfields Assessment cooperative
agreements reads as follows:

"B. Quality Assurance (QA) Requirement.  1. When environmental samples are collected as part of the brownfields assessment, the  CAR shall
comply with 40 CFR Part 31.45 requirements to develop and implement quality assurance practices sufficient to produce data adequate to meet
project objectives and to minimize data loss. State law may impose additional QA requirements. "

EPA contractors conducting Targeted Brownfields Assessments should develop site-specific Quality Assurance Project Plans (QAPPs) for environmental
assessment activities or a site-specific QAPP addendum if a Generic QAPP has already been approved for assessment activities.  The  EPA requires all
environmental monitoring and measurement efforts be conducted in accordance with approved QAPPs.  The purpose of the QAPP is to document the
project planning process, enhance the credibility of sampling results, produce data of known quality, and potentially save time and money by gathering
data that meets the needs of the project and intended use of the data. The QAPP is a formal document describing in comprehensive detail the necessary
QA/QC and other technical activities that must be conducted to ensure the results of the work performed will satisfy performance criteria and can be
used for their intended purposes. All QA/QC procedures shall be in accordance with applicable professional technical standards, EPA requirements,
government regulations and guidelines, and specific project goals and requirements.

OSWER has available the following guidance: "Quality Assurance Guidance for Conducting Brownfields Assessments." EPA 540-R-98-038.  1998.
Quality Assurance Guidance for Conducting Brownfields Assessments 1998.pdf
2c. Source Data Reporting
Cooperative agreement recipients (or sub-recipients) and contractors submit performance data to EPA in quarterly reports, and property
profile reports.  A Property Profile Form (PPF) collects information (environmental, historical, physical) from a property-specific
investigation funded under the Brownfields Program.

Cooperative agreement recipients have three submission options: complete and submit the Property Profile Form (PPF) in the online format
connected to the Assessment, Cleanup and Redevelopment Exchange System (ACRES) database; fill out a PPF version in Microsoft Excel
format and submit it via e-mail or regular mail to the EPA Regional Representative; or, for multiple properties (more than ten), fill out a
multi-property Excel spreadsheet and submit it via e-mail or regular mail to the EPA Regional Representative. Any paper forms are
entered into ACRES by an EPA contractor.

The Property Profile Form is an approved OMB form - OMB No. 2050-0192. Online forms available to registered users here:
http://www.epa.gov/brownfields/pubs/index.html.
EPA contractors conducting TBAs provide the assessment report to the EPA Region, which in turn enters the data into ACRES. In some
cases, the contractor will also provide a filled-out PPF.

In accordance with the Terms and Conditions of the Brownfields Cooperative Agreements, all Brownfields cooperative agreement
recipients (CARs) must report accomplishments to EPA on a quarterly basis. Quarterly reports are due 30 days from the end of the federal
fiscal quarter.
 2009_multiple_ppf_template_external.xls 2009_property_profile_form_instructions.pdf


 2009_property_profile_form.xls	
3.  Information Systems and Data Quality Procedures
3a.  Information Systems

Assessment, Cleanup, and Redevelopment Exchange System (ACRES). This database is the master database of all data supporting
OBLR measures. Recipients and EPA report directly into ACRES. It includes source data and transformed data (e.g., data aggregated into
Regional totals ). http://www.epa.gov/brownfields/pubs/acres/index.htm provides more information about this database.


ACRES quality is assured by adherence to a security plan and quality management plan:
— Security plan.  The latest version of the Security Plan for ACRES is dated 11/2009.

— Quality Management Plan.  ACRES operates under its own Quality Management Plan (Data Quality Management Plan for the
Assessment, Cleanup, and Redevelopment Exchange System, Version 1.02), which is updated annually and was most recently updated as
of 2/2010. Contact Ryan Smith for the most recent copy of the QMP.

OSWER Performance Assessment Tool (PAT).  This tool serves as the primary external servicing resource for organizing and reporting
OSWER's performance data; it collects information from OSWER program systems and conforms it for uniform reporting and data
provisioning.  PAT captures data from CERCLIS; replicates business logic used by CERCLIS for calculating measures; delivers that data
to EPA staff and managers via a business intelligence dashboard interface for analytic and reporting use; and transmits data to BAS. No
system specifications document is currently available for PAT, but one will be provided when available. Contact Lisa Jenkins in
OSWER regarding questions about PAT.

PAT operates under the OSWER Quality Management Plan (QMP),  attached.
OSWER QMP printed 2010-03-23.pdf

PAT has a security certification confirming that a separate security policy is not necessary because no sensitive data are handled and PAT
is built upon the Oracle-based business intelligence system.  PAT's security certification indicates that it follows all security guidelines for
EPA's Oracle Portal and that PAT (1) is not defined as a "Major Application" according to NIST Special Publication 800-18, Guide for
Developing Security Plans for Information Technology Systems, section 2.3.1; (2) does not store, process, or transmit information whose
sensitivity is assessed as high considering the requirements for availability, integrity, and confidentiality according to NIST Special
Publication 800-18, section 3.7.2; and (3) is not covered by EPA Order 2100.2A1, Information Technology Capital Planning and
Investment Control (CPIC).  The security certification, attached, was submitted on 9/11/2008.
PAT SecurityCertificationNIST.doc
Budget Automation System (BAS). BAS is the final repository of the performance values.
3b. Data Quality Procedures
Data reported by cooperative award agreement recipients are reviewed by EPA Regional grant managers for accuracy, to verify activities
and accomplishments, and to ensure appropriate interpretation of performance measure definitions.

Step 1. Performance measure data are entered into ACRES by recipients and/or an EPA HQ contractor (for data submitted by recipients in
an alternate format, such as hard copy). For each cooperative agreement recipient, all data entered are signed off by the EPA Regional
Representative (Regional Project Officer) identified in the terms and conditions of the cooperative agreement. For contractors, the EPA
Regional COR/WAM signs off on the data.

Step 2. Each Region conducts Regional level review of data from the ACRES system. Rejected data must be edited by the original data
source. Approved data proceed to Step 3.

Step 3. HQ conducts national-level review (EPA HQ contractors) of data approved by regions. Rejected data must be edited by the region
(Step 2). Approved data are stored in ACRES.

Step 4. Each quarter, OSWER Performance Assessment Tool (PAT) database pulls the approved data (performance measure) from
ACRES.

Step 5. Headquarters approves PAT results, and PAT pushes results into ACS/Measures Central.

Step 6. ACS/Measures Central aggregates Regional data into a national total.  The OBLR reporting lead reviews and confirms the result.

3c. Data Oversight
Headquarters-level oversight is provided by the EPA Contract Officer Technical Representative (COTR).

There is a Regional Project Officer assigned to each cooperative agreement.  That Regional Project Officer is responsible for reviewing for
completeness and correctness all data provided by cooperative agreement recipients and data related to Targeted Brownfields Assessment
(TBA) contracts; their data is reviewed at the Headquarters level. A list of Regional Project Officers is maintained by the Regional
Brownfields Coordinator in each region.


Each region also has a data manager (some Regions have a SEE Employee as their data manager). The data manager's responsibility is to
disseminate information about ACRES system updates and accomplishment updates, and this person serves as the regional point of contact
for data-related issues.
3d. Calculation Methodology
"Acres of Brownfields property made ready for reuse" is an aggregate of "acreage assessed that does not require cleanup" and "acreage
cleaned up as reported by Assessment Grantees, Regional Targeted Brownfields Assessments, Cleanup Grantees, RLF Grantees, and State
and Tribal 128 Voluntary Response Program Grantees for which any required institutional controls are in place."

The unit of measure is acres.
4.  Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The ACRES Project Manager is responsible for reporting accomplishments and program results recorded via ACRES.

4b. Data Limitations/Qualifications

There are some known limitations related to much of the data being recipient-reported. Regional Project Officers review data to minimize
errors (as described above), but some known quality issues remain. Most pertinent to this measure, outcome data are sometimes not
reported by recipients when EPA funding expires before the work is complete (for instance, if EPA funding is only part of the funding used
for an assessment or cleanup).

Given the reporting cycle and the data entry/QA period, there is typically a lag of several months before reported data appear in ACRES.
4c. Third-Party Audits
No external reviews.
 Record Last Updated: 02/13/2012 01:16:47 PM
-------
  Goal 3                                               Objective 1                                          Measure CH2
  Measure Code :  CH2 - Number of risk management plan audits and inspections
  conducted.
  Office of Solid Waste and Emergency Response (OSWER)
   1. Measure and DQR Metadata
   Goal Number and Title                          3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title                      1 - Promote Sustainable and Livable Communities
   Sub-Objective Number and Title                   3 - Reduce Chemical Risks at Facilities and in Communities
   Strategic Target Code and Title                    1 - By 2015, continue to maintain the Risk Management Plan (RMP) prevention program
   Managing Office                                 The Office of Emergency Management (OEM)
   Performance Measure Term Definitions
 Risk Management Plans: Risk Management Plans are documents that are submitted by facilities that store chemicals over a certain
threshold quantity. These plans are submitted every five years and document chemical processes, accident history, emergency contact
information, etc.

Inspections: An inspection is considered "conducted" when the EPA region completes the Inspection Conclusion Data Sheet (ICDS) and
enters the information into the Integrated Compliance Information System (ICIS).  This is not always the end of the process, however; for
example, in an ongoing enforcement case, more information or a second site visit might be needed.

Audit: Audits are similar to inspections but do not proceed to enforcement.

Background:  The subobjective's goal is to reduce chemical risks at facilities and in communities. Under the authority of section 112(r) of
the Clean Air Act, the Chemical Accident Prevention Provisions require facilities that produce, handle, process, distribute, or store certain
chemicals to develop a Risk Management Program, prepare a Risk Management Plan (RMP), and submit the Plan to EPA. The purpose of
this performance measure is to ensure that facilities required to have risk management plans do indeed have them and that the plans are
available in case of an incident.

OSWER's Office of Emergency Management implements the Risk Management Program under Clean Air Act section 112(r). Facilities are
required to prepare Risk Management Plans (RMPs) and submit them to EPA.  In turn, EPA Headquarters (HQ) provides appropriate data to
each  Region and delegated state so that they have the RMP data for their geographical area. EPA regions and delegated states conduct
inspections.
2.  Data Definition and Source  Reporting
2a. Original Data Source
Data come from one of two sources:

1) EPA Regions. For most states, EPA regions are the implementing authorities that conduct and record inspections.

2) States: Nine states have received delegation to operate the RMP program. These delegated States report audit numbers to the
appropriate EPA Regional office so it can maintain composite information on RMP audits.
2b. Source Data Collection
EPA personnel travel to facilities to conduct inspections, using the Risk Management Plans that the facilities have submitted as the basis
for the inspection. EPA inspects approximately 5 percent of the entire RMP facility universe annually.
2c. Source Data Reporting

EPA regional staff complete inspections and record information on the ICDS form. Inspections are recorded in the ICIS system as they are
completed.  EPA headquarters monitors progress of the data collection regularly and reports on the data at mid year and at the end of the
fiscal year.	


3. Information Systems and Data Quality Procedures	
3a. Information Systems

The EPA Annual Commitment System (ACS) is the database for the number of risk management plan (RMP) audits.  The Integrated
Compliance Information System (ICIS) is used for tracking RMP inspection activities. The Risk Management Plan (RMP) database is used
to collect RMP information from regulated facilities and provides essential background information for inspectors.
3b. Data Quality Procedures
Facilities submit RMP data via an online system, with extensive validation and quality control measures applied during and after
submission to EPA. Regions review RMP data, and compare with information obtained during inspections.  Inspection data are collected
from states by EPA's Regional offices, and reviewed at the time of Regional data entry. Inspection data are regularly compared to similar
data from the past to identify potential errors. Inspection data quality is evaluated by both Regional and Headquarters' personnel.  Regions
enter data into the Agency's Annual Commitment System, and HQ prepares an annual report.	
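
One way to picture the comparison against past data is a simple deviation check on the counts each Region reports. The sketch below flags large year-to-year swings for follow-up; the 25% threshold and the counts are assumptions for illustration, not an EPA rule.

    # Flag regional inspection counts that move sharply versus prior data,
    # as a prompt for manual review. Threshold and counts are hypothetical.
    def flag_outliers(current, previous, threshold=0.25):
        for region, count in current.items():
            prior = previous.get(region)
            if prior and abs(count - prior) / prior > threshold:
                yield region, prior, count

    prev = {"Region 1": 40, "Region 2": 55}
    curr = {"Region 1": 41, "Region 2": 20}   # Region 2 drop looks suspect
    for region, before, after in flag_outliers(curr, prev):
        print(f"{region}: {before} -> {after} (check for entry error)")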
3c. Data Oversight	

Data oversight is provided by Regional Chemical Emergency Preparedness and Prevention managers, who are responsible for overseeing
the inspections and data entry at the Regional level. Headquarters staff perform QA/QC on the data entered by the Regions and report the
data out.
3d. Calculation Methodology
Regional and National targets for the number of RMP inspections are set based on the FTE and program funding available to the Regions,
and our understanding of the resources required to conduct RMP inspections. In prior years, our experience has shown that Regional
offices can inspect approximately 5% of the universe of RMP facilities with available resources. However, this percentage is strongly
dependent on the size and complexity of facilities inspected. EPA experience indicates that the field portion of RMP facility inspections
alone can require anywhere from a single person for one day or less at a simple, single-process facility up to a team of 6-8 inspectors for
1-2 weeks or more at a large chemical plant or refinery. In recent years, EPA has shifted its inspection focus to high-risk RMP facilities by
requiring regional offices to conduct a certain percentage of RMP inspections at these facilities.  As high-risk facilities generally require the
most inspection resources, the agency has reduced the overall RMP inspection target in order to devote additional resources toward
high-risk facility inspections. EPA has established criteria for identifying high-risk RMP facilities and provides a list of these facilities at
the beginning of each fiscal year to each Regional  office.  For FY 2013, the overall national RMP inspection target has been reduced from
approximately 5% to 4%, while the percentage of high-risk facility inspections has been raised from approximately 25% to 30%.	
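
As a worked example of that target arithmetic (the facility count is hypothetical; only the 4% overall and 30% high-risk figures come from the text above):

    # FY 2013 target arithmetic from the paragraph above. The universe
    # size is a hypothetical illustration, not the actual RMP count.
    rmp_universe = 12_000                             # hypothetical facility universe
    overall_target = round(rmp_universe * 0.04)       # 4% of the universe -> 480
    high_risk_target = round(overall_target * 0.30)   # 30% of inspections -> 144
    print(overall_target, high_risk_target)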


4.  Reporting and Oversight	
4a. Oversight and Timing of Results Reporting
Results reporting is overseen by OEM personnel who work on the Chemical Emergency Preparedness and Prevention programs, providing either
technical expertise or program evaluation.
4b. Data Limitations/Qualifications
ICIS data quality is dependent on completeness and accuracy of the data provided by  state programs and the EPA Regional offices.

Data are count data and not open to interpretation.

RMP data quality is enhanced by system validation, but accuracy is dependent on what each facility submits in its Risk Management Plan.

4c. Third-Party Audits

There are no third party audits for the RMP measure.
 Record Last Updated: 02/13/2012 01:17:00 PM
-------
  Goal 3                                                Objective 2                                          Measure HWO
  Measure Code : HWO - Number of hazardous waste facilities with new or updated
  controls.
  Office of Solid Waste and Emergency Response (OSWER)
   1. Measure and DQR Metadata
   Goal Number and Title                          3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title                      2 - Preserve Land
   Sub-Objective Number and Title                   2 - Minimize Releases of Hazardous Waste and Petroleum Products
   Strategic Target Code and Title                    1 - By 2015, prevent releases at 500 hazardous waste management facilities with initial approved controls
   Managing Office                                 Office of Resource Conservation and Recovery
   Performance Measure Term Definitions
Definition of "Hazardous Waste Facilities": This universe is composed of facilities that were subject to permits as of 10-1-1997 and
subsequent years. EPA plans to update the list of units that need "updated controls" after the end of each Strategic Plan cycle.  Those
facilities that need updated controls are a smaller set within the larger permitting universe tracked for strategic and annual goals associated
with the Government Performance and Results Act (GPRA).

Definition of "New or Updated Controls":

Facilities under control is an outcome-based measure, as permits or similar mechanisms are not issued until facilities have met standards or
permit conditions that are based on human health or environmental standards. Examples include sites cleaned up to a protective level;
groundwater releases controlled so that no further attenuation is occurring; remaining waste safely removed or capped (isolated); and
long-term controls in place to protect people and the environment at the site if any contamination remains. An updated control, such as a
permit renewal, indicates that the facility has upgraded its operations to ensure continued safe operation, minimizing the potential for
releases and accidents.
 2. Data Definition and Source Reporting
 2a. Original Data Source
 States and EPA's Regional offices generate the data.
 2b. Source Data Collection
 Facility data: The authorized states have ownership of their data and EPA has to rely on them to make changes. The data that determine if
 a facility has met its permit requirements are prioritized in update efforts. States and EPA's Regional offices manage data quality related to
 timeliness and accuracy.
2c. Source Data Reporting

Data can be entered directly into RCRAInfo, although some states use a different approach and then "translate" the information into
RCRAInfo. Supporting documentation and reference materials are maintained in Regional and state files. Users log on at the following
URL: https://rcrainfo.epa.gov/rcrainfo/logon.jsp.
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
RCRAInfo, the national database which supports EPA's RCRA program, contains information on entities (generically referred to as
"handlers") engaged in hazardous waste generation and management activities regulated under the portion of the Resource Conservation
and Recovery Act (RCRA) that provides for regulation of hazardous waste.

RCRAInfo has several different modules, and allows for tracking of information on the regulated universe of RCRA hazardous waste
handlers, such as facility status, regulated activities, and compliance history. The system also captures detailed data on the generation of
hazardous waste by large quantity generators and on waste management practices from treatment, storage, and disposal facilities.
RCRAInfo is web accessible, providing a convenient user interface for Federal, state, and local managers and encouraging development of
in-house expertise at controlled cost; states also have the option to use commercial off-the-shelf software to develop reports from database
tables.

RCRAInfo is currently  at Version 5 (V5), which was released in March 2010. V5  expanded on V4's capabilities and made updates to the
Handler module to support two new rules that went into effect in 2009.

Access to RCRAInfo is open only to EPA Headquarters, Regional, and authorized state personnel. It is not available to the general public
because the  system contains enforcement sensitive data. The general public is referred to EPA's Envirofacts Data Warehouse to obtain
information on RCRA-regulated hazardous waste sites. This non-sensitive information is supplied from RCRAInfo to Envirofacts.
3b. Data Quality Procedures
Within RCRAInfo, the application software contains structural controls that promote the correct entry of the high-priority national
components.

In December 2008, EPA made a significant update to RCRAInfo (Version 4) to address many data quality concerns related to the
Permitting module, upon which this measure is based.  This update added components that help the user identify errors in the system
(Example: data gap report).  RCRAInfo is currently at  Version 5, which was released in March 2010.  Version 5 made a number of updates
to the Handler module that did not have a direct impact on this measure. However, EPA Headquarters has placed added emphasis on data
quality and runs monthly reports to identify potential data errors, and then works with the States and Regions to correct those errors.

RCRAInfo documentation, which is available to all RCRAInfo users on-line at https://rcrainfo.epa.gov/, provides guidance to facilitate the
generation and interpretation of data.

U.S. Environmental Protection Agency.  Office of Resource Conservation and Recovery.  RCRAInfo website with documentation and data:
http://www.epa.gov/enviro/html/rcris/index.html (accessed January 10, 2012).
3c. Data Oversight
The Information Collection and Analysis Branch (ICAB) maintains a list of the Headquarters, Regional and delegated state/territory users and controls
access to the system.  Branch members ensure data collection is on track, conduct QA reports, and work with Regional and state partners to resolve
issues as they are discovered.

3d. Calculation Methodology
Determination of whether or not the facility has approved controls in place is based primarily on the legal and operating status codes for
each unit.

Accomplishment of updated controls is based on the permit expiration date code and other related codes.

The baseline is composed of facilities that can have multiple units. These units may  consolidate, split or undergo other activities that cause
the number of units to change. There may be occasions where minor baseline modifications are needed. The larger permitting
universe is carried over from one EPA strategic planning cycle to the next (starting with facilities subject to permits as of 10-1-1997) with
minor changes made with each subsequent strategic planning cycle (e.g., facilities referred to Superfund are removed, or facilities never
regulated are removed; facilities that applied for a permit within the last strategic cycle are added). EPA updates the list of units that need
"updated controls" after the end of each strategic planning cycle.  Those facilities that need updated controls are a smaller set within the
larger permitting universe.

The complete data dictionary is available at: http://www.epa.gov/enviro/html/rcris/rcris_table.html

The unit of analysis for this measure is "facilities."	
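
Schematically, the determination rolls unit-level status codes up to a facility-level answer, as in the sketch below. The code values and the rule that every unit must qualify are assumptions for illustration; RCRAInfo's actual legal and operating status codes are defined in the data dictionary cited above.

    # Hypothetical roll-up of unit status codes to a facility determination.
    APPROVED_CODES = {"PI", "CL"}          # illustrative "controls in place" codes

    def facility_has_controls(unit_status_codes):
        """True if every unit at the facility carries an approved-control code."""
        return all(code in APPROVED_CODES for code in unit_status_codes)

    print(facility_has_controls(["PI", "CL"]))  # True
    print(facility_has_controls(["PI", "XX"]))  # False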
4.  Reporting and  Oversight
4a. Oversight and Timing of Results Reporting
Program Implementation and Information Division (PIID) data analysts are responsible for the reporting.

4b. Data Limitations/Qualifications	

Even with the increasing emphasis on data quality, problems remain with the number of facilities in the baseline (roughly 10,000 units; a
facility can have more than one unit) and their supporting information, particularly for older inactive facilities. EPA Headquarters works
with the EPA Regional offices to resolve these issues.

Basic site data may become out-of-date because RCRA does not mandate the notification of all information changes. Nevertheless, EPA
tracks the facilities by their ID numbers and those should not change even during ownership changes (RCRA Subtitle C EPA Identification
Number, Site Status, and Site Tracking Guidance , March 21, 2005).
4c. Third-Party Audits

The 1995 U.S. Government Accountability Office (GAO) report Hazardous Waste: Benefits of EPA's Information System Are Limited
(AIMD-95-167, August 22, 1995, http://www.gao.gov/archive/1995/ai95167.pdf, accessed January 10, 2012) on EPA's Hazardous Waste
Information System reviewed whether national RCRA information systems support EPA and the states in managing their hazardous waste
programs. Those recommendations coincided with ongoing internal efforts to improve the definitions of data collected, and ensure that data
collected provide critical information and minimize the burden on states. RCRAInfo, the current national database, has evolved in part as a
response to this report. The "Permitting and Corrective Action Program Area Analysis" was the primary vehicle for the improvements
made in the December 2008 release (V4).

EPA OIG report:

U.S. Environmental Protection Agency.  "Permitting and Corrective Action Program Area Analysis". WIN/INFORMED Executive
Steering Committee, July 28, 2005.	
 Record Last Updated: 02/13/2012 01:16:48 PM
-------
  Goal 3                                               Objective 3                                          Measure 112
  Measure Code :  112 - Number of LUST cleanups completed that meet risk-based
  standards for human exposure and groundwater migration.
  Office of Solid Waste and Emergency Response (OSWER)
   1. Measure and DQR Metadata
   Goal Number and Title                          3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title                      3 - Restore Land
   Sub-Objective Number and Title                   3 - Cleanup Contaminated Land
   Strategic Target Code and Title                    5 - Through 2015, reduce the backlog of LUST cleanups
   Managing Office                                 Office of Underground Storage Tanks (OUST)
   Performance Measure Term Definitions
Cleanups Completed - The number of cleanups completed is the cumulative number of confirmed releases where cleanup has been initiated
and where the state has determined that no further actions are currently necessary to protect human health and the environment. This
number includes sites where post-closure monitoring is under way, as long as site-specific (e.g., risk-based) cleanup goals have been met.
Site characterization, monitoring plans, and site-specific cleanup goals must be established, and cleanup goals must be attained, for sites
being remediated by natural attenuation to be counted in this category.  Clarification: "Cleanups Completed" is a cumulative category;
sites should never be deleted from this category. It is no longer necessary to report separately cleanups completed that are state lead with
state money and cleanups completed that are responsible party lead. It is, however, still necessary to report the number of cleanups
completed that are state lead with Trust Fund money. A "no further action" determination made by the state that satisfies the "cleanups
initiated" measure above also satisfies this "cleanups completed" measure. This determination allows a confirmed release that does not
require further action to meet the definition of both an initiated and a completed cleanup.

For the complete definition, see EPA OUST, UST And LUST Performance Measures Definitions, January 18, 2008,
http://www.epa.gov/OUST/cat/PMDefinitions.pdf, which is referenced in the Guidance To Regions For Implementing The LUST
Provision Of The American Recovery And Reinvestment Act Of 2009, EPA-510-R-09-003, June 2009,
http://www.epa.gov/oust/eparecovery/lustproguide.pdf, p. 7-8.
See also: EPA.  Environmental Protection Agency Recovery Act Program Plan: Underground Storage Tanks.  May  15, 2009.
http://www.epa.gov/recovery/plans/oust.pdf

Risk-based standards for human exposure and groundwater migration.

Reference:  Semi-annual Report of UST Performance Measures, End Of Mid Fiscal Year 2011 - as of March 31, 2011, dated May 2011:
http://www.epa.gov/OUST/cat/ca_11_12.pdf

2.  Data Definition and Source Reporting	
2a. Original Data Source

The original data sources are the states, DC, and territories that sign Leaking Underground Storage Tank (LUST) agreements with EPA.
These entities can delegate reporting to sub-recipients (such as local governments).

Each EPA regional office manages work that occurs within regional boundaries.

For more information:
1.      US EPA Office of Underground Storage Tanks. "Guidance to Regions for Implementing the LUST Provision of the American
Recovery and Reinvestment Act of 2009." EPA-510-R-09-003.  June 2009. http://www.epa.gov/oust/eparecovery/lustproguide.pdf
2.      US EPA Office Of Underground Storage Tanks.  "Supplemental Guidance on Recovery Act Recipient Reporting (Section 1512) of
the American Recovery and Reinvestment Act of 2009." Memo from Carolyn Hoskinson to Regional UST Managers. October 2, 2009.
http://www.epa.gov/oust/eparecovery/OUST_1512_Memo_100209.pdf
2b. Source Data Collection

Determination of cleanup completion requires consideration of environmental data, such as field sampling, which can vary by project. The
overall measure requires tabulation of the number of LUST cleanups completed.

Spatial Detail: Geographic granularity can vary. Sub-recipient data submissions (when delegated) may be as detailed as the site level, for
which granularity is defined in latitude and longitude. Other data are entered by recipients for the entire state/territory (excluding
sub-recipient data). Granularity for work in Indian Country is the Regional level.

Spatial Coverage: National

For cooperative agreements: Regional offices include QA Terms and Conditions in their states' assistance agreements. CAs must be
current and specify: QA roles and responsibilities for EPA and grantee recipients; and quality requirements, including responsibilities for
final review and approval. Default quality requirements include: organization-level QA documentation (i.e., QMPs) for state agencies and
primary contractors; and project-level QAPPs for each CA.  In accordance with EPA's Uniform Administrative Requirements for Grants
and Cooperative Agreements, 40 CFR Part 31.45, states must develop and implement quality assurance practices. The regulation requires
developing and implementing quality assurance practices that will "produce data of quality adequate to meet project objectives and to
minimize loss of data to out of control conditions or malfunctions"; see OSWER Directive 9650.10A:
www.epa.gov/oust/directiv/d965010a.htm#sec11

For contracts:  EPA Regions determine which quality requirements are applicable. Contracts must be current and specify:  QA roles and
responsibilities for EPA and national LUST  contractors; and quality requirements including responsibilities for final review and approval.
Default quality requirements include: organization-level QA documentation (i.e. QMP) for the  primary contractors; and project-level
QAPPs for each Tribal LUST remedial Work Assignment. Sample EPA contract language: "the Contractor shall comply with the
higher-level quality standard selected below: Specifications and Guidelines for Quality Systems for Environmental Data Collection and
Environmental Technology Programs (ANSI/ASQC E4, 1994). As authorized by FAR 52.246-11, the higher-level quality standard
ANSI/ASQC E4 is tailored as follows:  The solicitation and contract require the offerors/contractor to demonstrate conformance to
ANSI/ASQC E4 by submitting the quality documentation described below. [Specifically,...] ... The Contractor shall not commence actual
field work until the Government has approved the quality documentation (i.e., QAPP)."

Note: Regions keep copies of individual QAPPs associated with cooperative agreements and contracts. Each EPA regional office manages
its own state and tribal assistance agreements.
2c. Source Data Reporting
Site assessments and cleanup status are recorded as milestones are achieved, in accordance with each site's schedule and each recipient's
procedures.  Contractors and other recipients individually maintain records for reporting accomplishments into LUST4. Their data systems vary.

States, DC and territories submit location-, funding-, and progress-related data directly into LUST4.

LUST4 also allows for bulk (batch) uploading by states/territories that already have the location & measures-related data captured in a data
system or have the technical expertise to create flat files through another method in exactly the format and layout specified.  This batch
uploading is not supported by OUST; data providers not comfortable with this approach are encouraged to use the interactive online
features of the Locations Subsystem and Measures Subsystem.  Access to the LUST4 Locations and Measures Subsystems is available
online via the EPA portal at http://portal.epa.gov under the My Communities/Underground Storage Tank menu page.
3.  Information Systems and  Data Quality Procedures	
3a.  Information Systems
LUST4. This database is the master database of all LUST program-related data, including but not limited to data supporting Recovery Act measures.
Recipients and EPA report data for activity and measures directly into LUST4.  LUST4 includes both source data and transformed data (e.g., data
aggregated into Regional totals).

The program's Oracle web-based system, LUST4, is accessed through EPA's portal.

OSWER Performance Assessment Tool (PAT).  This tool serves as the primary external servicing resource for organizing and reporting OSWER's
performance data. PAT collects information from OSWER program systems, and conforms it for uniform reporting and data provisioning. PAT captures
data from LUST4; replicates business logic used by LUST4 for calculating measures; can deliver that data to EPA staff and managers via a business
intelligence dashboard interface for analytic and reporting use; enables the LUST point of contact to document status and provide an explanation for each
measure; and transmits data to the Budget Automation System.

Budget Automation System (BAS). BAS is the final repository of the performance values.
                                                             189                                      Back to Table of Contents

-------
 Goal 3                                                     Objective 3                                                  Measure 112
3b. Data Quality Procedures	|
EPA's regional grants project officers and regional program managers provide first-level data quality reviews and oversight of their
recipients' program performance measure results.

OUST uses a combination of automated validation and manual QA/QC review.

QA/QC REVIEW BY REGIONS. EPA/OUST oversees the use of the QA/QC checklist, which is incorporated into the LUST4 Oracle
web-based system. Regions complete the QA/QC checklist, sign it electronically and submit it to EPA/OUST for review, comment and
approval of each record.
NOTE: This QA/QC checklist was last updated 10/1/2009 and is accessed through the user interface of LUST4.

Regional QA/QC Evaluation Checklist -
Note: Checklist is to be completed by the Regional reviewer and will appear "shaded" to others.
1. Previous Totals Column
— Verify the previous total number is correct by comparing it to the total from the last reporting period. If there is a discrepancy, report the information
in the "Correction To Previous Data" column. Please add comments in the "Comments" column for any corrections that are made to the applicable
performance measure.
2. Actions This Reporting Period
For each performance measure, if this "Reported" number deviates by more than 10% from the last period's number or appears otherwise questionable
(a sketch of this deviation screen follows the checklist), complete the following actions:
— Review the state's explanation, if available.
— If necessary, contact the state to obtain the corrected numbers and/or obtain a sufficient explanation, and include the explanation in the "Comments"
section for the applicable performance measure.
3. Corrections to Previous Data Column
If any corrections have been listed, verify that an explanation for the correction is provided in the "Comments" column, and complete the following
actions:
— Verify and discuss the correction with the state if the correction is >10% or if the correction appears questionable (e.g., database conversions, database
cleanup efforts to resolve misclassified data, duplicative records, etc.)
— Verify whether the corrections are anticipated to be a one-time event or to occur over multiple years
— Evaluate whether the corrections will impact other performance measures (e.g., if the number of cleanups completed is adjusted downward by a
correction, does this also result in a commensurate downward adjustment of cleanups initiated?) Include any additional comments in the "Comments"
column as necessary.
4. Totals (Cumulative, if applicable)
— Verify accuracy of all cumulative totals
— Include any additional comments in the "Comments" column as necessary
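For illustration, the deviation screen in checklist item 2 can be expressed as a simple rule. The following is a minimal sketch in Python, not part of LUST4; the function name and the example figures are hypothetical.

    def needs_review(reported, last_period, threshold=0.10):
        """Flag a reported value deviating by more than 10% from last period."""
        if last_period == 0:
            return reported != 0  # any movement from a zero baseline merits a look
        return abs(reported - last_period) / last_period > threshold

    # Hypothetical example: 95 cleanups reported against 80 last period is flagged.
    print(needs_review(95, 80))  # True (18.75% increase)
    print(needs_review(82, 80))  # False (2.5% increase)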


AUTOMATED VALIDATION.  For instance, upon data entry of any new location, location information is verified automatically by the Facility
Registry System (FRS). Location information without latitude and longitude is also geocoded automatically. When entering measure information, the
system does not allow a measure value less than the applicable count of locations that are relevant to that measure (the count of location records is
automatically generated by the system); a measure value greater than the applicable count of locations requires provision of an explanatory comment.
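As a reading aid, the entry rule just described can be sketched as follows (illustrative Python only; LUST4 is an Oracle web application, and the function and argument names here are hypothetical):

    def validate_measure(value, location_count, comment=None):
        """Apply the entry rule described above (illustrative only)."""
        if value < location_count:
            raise ValueError("Measure value may not be less than the count of "
                             "relevant location records (%d)." % location_count)
        if value > location_count and not comment:
            raise ValueError("An explanatory comment is required when the measure "
                             "value exceeds the location count.")
        return True

    validate_measure(12, 12)  # accepted: value equals the system-generated count
    validate_measure(15, 12, comment="Three legacy sites lack location records.")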
                                                               190                                       Back to Table of Contents

-------
 Goal 3                                                     Objective 3                                                 Measure 112

EPA/OUST provides second-level data quality reviews of all data.

LUST4. LUST4 operates under OSWER's QMP, including the security policy specified in that QMP. LUST4 does not have any
stand-alone certifications related to the EPA security policy or the Systems Life Cycle Management policy.  The LUST4 system is built
upon Oracle Business Intelligence tools provided by the EPA Business Intelligence Analytics Center, which ensures that a stand-alone
security certification is not necessary.

PAT. PAT operates under the OSWER Quality Management Plan (QMP).  PAT has a security certification confirming that a security
policy is not necessary because no sensitive data are handled and PAT is built upon the Oracle-based business intelligence system. PAT's
security certification indicates that it follows all security guidelines for EPA's Oracle Portal and that PAT (1) is not defined as a "Major
Application" according to NIST Special Publication 800-18, Guide for Developing Security Plans for Information Technology Systems,
section 2.3.1; (2) does not store, process, or transmit information whose degree of sensitivity is assessed as high, considering the
requirements for availability, integrity, and confidentiality according to NIST Special Publication 800-18, section 3.7.2; and (3) is not
covered by EPA Order 2100.2A1, Information Technology Capital Planning and Investment Control (CPIC).

Data Flow:

Step 1. Performance measure and location data are entered into LUST4 by recipients (or sub-recipients, if delegated) or by Regions (for EPA
contractors). Upon entry of any new location, location information is verified automatically by the Facility Registry System (FRS). Location
information without latitude and longitude is also geocoded automatically. (FRS data are used solely for data entry QA/QC, not to assist in calculating
results.)

Step 2. Each Region conducts Regional level review of data from the LUST4 system. Rejected data must be edited by the original data source.
Approved data proceed to Step 3.

Step 3. Headquarters staff perform National Program Review, using data from the LUST4 system. Rejected data must be reviewed by the
region and, if needed, pushed back to the state for editing (Step 2).

Step 4. PAT pulls data from LUST4. Headquarters staff compare PAT results to LUST4 results. If PAT does not match LUST4, there was an
error with the upload and the data are reloaded. Headquarters staff enter into PAT the ACS status information of "Indicator" for each measure and, if
desired, an explanation. (Note: PAT allows for programs to identify a status other than "Indicator." When programs select a status of "no status,"
"data not available," or "target not met," PAT requires that an explanation be provided. LUST program policy is to resolve all reporting issues prior
to ACS reporting, so "Indicator" is the only status chosen and explanations for that status are optional.)

Step 5. Headquarters approves PAT results,  and PAT pushes results into BAS/Measures Central.

Step 6. Measures Central aggregates Regional data into a national total.  OUST reporting lead reviews and certifies results.	
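To illustrate the reconciliation in Step 4, the sketch below (illustrative Python; the actual comparison is performed by Headquarters staff against PAT and LUST4 reports, and the Region totals shown are hypothetical) compares per-Region totals and signals a reload on any mismatch:

    # Hypothetical per-Region totals pulled from each system.
    lust4_totals = {"Region 1": 412, "Region 2": 530}
    pat_totals   = {"Region 1": 412, "Region 2": 529}

    # A mismatch means the upload failed and the data should be reloaded.
    if pat_totals != lust4_totals:
        print("Mismatch detected: reload PAT from LUST4 before ACS reporting.")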
3c. Data Oversight
                                                              191                                       Back to Table of Contents

-------
 Goal 3                                                    Objective 3                                                 Measure 112
An EPA Headquarters primary contact maintains a list of the HQ (OUST and OEI), Regional and state/territory primary and backup users; a record of
changes to the list is also maintained.  The primary HQ contact ensures that Regional reporting is on track, conducts QA on LUST performance
measures, ensures QA issues are resolved and/or documented, and oversees final reporting to BAS.

Regional Program Managers are ultimately responsible for regional-level data. They conduct their review based upon a national QA/QC checklist.	
3d. Calculation Methodology                                                                                                        |
The cumulative number of confirmed releases where cleanup has been initiated and where the state or Region (for the tribes) has determined
that no further actions are currently necessary to protect human health and the environment includes sites where post-closure monitoring is
not necessary, as long as site-specific (e.g., risk-based) cleanup goals have been met.  For sites being remediated by natural attenuation to be
counted in this category, site characterization, monitoring plans, and site-specific cleanup goals must be established, and cleanup goals must
be attained. (See http://www.epa.gov/OUST/cat/PMDefinitions.pdf.)


The unit of analysis is the site cleanup.
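A minimal tabulation consistent with this definition might look like the following (illustrative Python; the record fields are hypothetical, and the authoritative logic resides in LUST4):

    # Hypothetical confirmed-release records.
    records = [
        {"site": "A", "cleanup_initiated": True,  "no_further_action": True},
        {"site": "B", "cleanup_initiated": True,  "no_further_action": False},
        {"site": "C", "cleanup_initiated": False, "no_further_action": False},
    ]

    # Count releases where cleanup was initiated and the state or Region has
    # determined that no further action is currently necessary.
    completed = sum(1 for r in records
                    if r["cleanup_initiated"] and r["no_further_action"])
    print(completed)  # 1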
4.  Reporting and  Oversight
4a. Oversight and Timing of Results Reporting
Semiannual, by the Deputy Office Director, who is responsible for final review to ensure the LUST4 System Manager has completed review
and the numbers are accurate.
4b. Data Limitations/Qualifications
Data quality depends on the accuracy and completeness of state records.
4c. Third-Party Audits

Not applicable.
 Record Last Updated: 02/13/2012 01:16:48 PM
                                                             192                                       Back to Table of Contents

-------
  Goal 3                                                Objective 3                                           Measure 151
  Measure Code : 151 - Number of Superfund sites with human  exposures under
  control.
  Office of Solid Waste and Emergency Response (OSWER)
   1.  Measure and DQR Metadata
   Goal Number and Title
3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title
                                                3 - Restore Land
   Sub-Objective Number and Title
3 - Cleanup Contaminated Land
   Strategic Target Code and Title
2 - By 2015, increase the number of Superfund final and deleted NPL sites and RCRA facilities
   Managing Office
  OSRTI
   Performance Measure Term Definitions
 Definition of Site: "Sites" refers only to National Priorities List (NPL) sites. (See below for definition of NPL.)  The term "site" itself is
not explicitly defined under Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) or by the Superfund
program; instead "site" is defined indirectly in CERCLA's definition of "facility," as follows: "The term 'facility' means (A) any building,
structure, installation, equipment, pipe or pipeline (including any pipe into a sewer or publicly owned treatment works), well, pit, pond,
lagoon, impoundment, ditch, landfill, storage container, motor vehicle, rolling stock, or aircraft, or (B) any site or area where a hazardous
substance has been deposited, stored, disposed of, or placed, or otherwise come to be located; but does not include any consumer product in
consumer use or any vessel."  (CERCLA, Title I, Section 101, (9)).

Superfund Alternative Approach (SAA) sites: The program collects and enters into CERCLIS, human exposure determinations at SAA
sites, but does not target or report official results at this time.

Definition of National Priorities List (NPL): Sites are listed on the National Priorities List (NPL) upon completion of Hazard Ranking
System (HRS) screening, public solicitation of comments about the proposed site, and final placement of the site on the NPL after all
comments  have been addressed. The NPL primarily serves as an information and management tool. It is a part of the Superfund cleanup
process and is updated periodically.   Section 105(a)(8)(B) of CERCLA as amended, requires that the statutory criteria provided by the HRS
be used to  prepare a list of national priorities among the known releases or threatened releases of hazardous substances, pollutants, or
contaminants throughout the United States. This list, which is Appendix B of the National Contingency Plan, is the NPL. Visit the HRS
Toolbox (http://www.epa.gov/superfund/sites/npl/hrsres/index.htm) page for guidance documents that are used to determine if a site is a
candidate for inclusion on the NPL.  [Source: Superfund website, http://www.epa.gov/superfund/sites/npl/npl_hrs.htm]

(Also see Appendix B of the most recent Superfund Program Implementation Manual (SPIM), which is updated each fiscal year and
contains definitions and documentation/coding guidance for Superfund measures.  The most current SPIM can be found here:
http://epa.gov/superfund/policy/guidance.htm.)
                                                          193                                   Back to Table of Contents

-------
  Goal 3                                                  Objective 3                                            Measure 151

Definition of "Current Human Exposure under Control" (HEUC): - Sites are assigned to this category when assessments for human
exposures indicate there are no unacceptable human exposure pathways and the Region has determined the site is under control for current
conditions site wide.

The human exposure status at a site is reviewed annually by the 10th working day in October, or at any time site conditions change.
CERCLIS is to be updated within 10 days of any change in status.

The HEUC documents, for Proposed, Final, and Deleted NPL sites and SAA settlement sites, the progress achieved towards providing
long-term human health protection by measuring the incremental progress achieved in controlling unacceptable human exposures at a site.
This is also a Government Performance and Results Act (GPRA) performance measure.

Controlling unacceptable human exposures can occur in three ways:
— Reducing the level of contamination. For purposes of this policy, "contamination"
generally refers to media containing contaminants in concentrations above appropriate protective risk-based levels associated with complete
exposure pathways to the point where the exposure is no longer "unacceptable;"  and/or
— Preventing human receptors from contacting contaminants in-place; and/or
— Controlling human receptor activity  patterns (e.g., by reducing the potential frequency or duration of exposure).

Five categories have been created to describe the level of human health protection achieved at a site:
— Insufficient data to determine human exposure control status;
— Current human exposures not under  control;
— Current human exposures under control;
— Current human exposures under control and protective remedy or remedies in place; and
— Current human exposures under control, and long-term human health protection achieved.

Definition of Accomplishment of "HEUC":
The criteria for determining the Site-Wide Human Exposure status at a site are found in the Superfund Environmental Indicators Guidance,
"Human Exposures Revisions," March 2008 (http://www.epa.gov/superfund/accomp/ei/pdfs/final_ei_guidance_march_2008.pdf).  [Source:
SPIM Appendix B]

(See Appendix B of the most recent SPIM, which is updated each fiscal year and contains definitions and documentation/coding  guidance
for Superfund measures.  The most current SPIM can be found here: http://epa.gov/superfund/policy/guidance.htm.)

The Superfund Program's performance measures are used to demonstrate program progress and reflect major site cleanup milestones from
start (remedial assessment completion) to finish (number of sites ready for anticipated use sitewide).  Each measure marks a significant step
in ensuring human health and environment protection at Superfund sites.

                                                            194                                     Back to Table of Contents

-------
  Goal 3                                                 Objective 3                                            Measure 151
References:

 U.S. Environmental Protection Agency, EPA Performance and Accountability Reports, http://www.epa.gov/ocfo/par/index.htm.

U.S. Environmental Protection Agency, Superfund Accomplishment and Performance Measures,
http://www.epa.gov/superfund/accomplishments.htm.

U.S. Environmental Protection Agency, Federal Facilities Restoration and Reuse Office - Performance measures,
http://www.epa.gov/fedfac/documents/measures.htm.

U.S. Environmental Protection Agency, Office of Inspector General, Information Technology - Comprehensive Environmental Response,
Compensation, and Liability Information System (CERCLIS) Data Quality, No. 2002-P-00016, http://www.epa.gov/oigearth/eroom.htm.

U.S. Government Accountability Office, "Superfund: Information on the Status of Sites" (GAO/RCED-98-241),
http://www.gao.gov/archive/1998/rc98241.pdf

U.S. Environmental Protection Agency, Office of Superfund Remediation and Technology Innovation, Superfund Program Implementation
Manuals (SPIM) , http://www.epa.gov/superfund/policy/guidance.htm.

U.S. Environmental Protection Agency, Office of Solid Waste and Emergency Response, "OSWER Quality Management Plan",
http://www.epa.gov/swerffrr/pdf/oswer_qmp.pdf

U.S. Environmental Protection Agency, Office of Environmental Information, EPA System Life Cycle Management Policy Agency
Directive 2100.5, http://www.epa.gOv/irmpoli8/ciopolicy/2100.5.pdf.

U.S. Environmental Protection Agency, Office of Environmental Information, EPA's Information Quality Guidelines,
http://www.epa.gov/quality/informationguidelines.
 2. Data Definition and Source Reporting	
 2a. Original Data Source

 Original data sources vary, and multiple data sources can be used for each site. Typical data sources are EPA personnel, contractors
 (directly to EPA or indirectly, through the interagency agreement recipient or cooperative agreement recipient), U.S. Army Corps of
 Engineers (interagency agreement recipient), and states/tribes/other political subdivisions (cooperative agreement recipients). EPA also
 collects data via pre-final inspections at sites.

                                                            195                                     Back to Table of Contents

-------
 Goal 3                                                  Objective 3                                             Measure 151
(See item Performance Measure Term Definitions in Tab 1, for more information.  Also, detailed information on requirements for source
data and completion procedures can be found on the following Superfund website:
http://www.epa.gov/superfund/programs/npl_hrs/closeout/index.htm)
2b. Source Data Collection                                                                                                      |

Collection typically involves some combination of environmental data collection, estimation and/or tabulation of records/activities.
Documents such as risk assessments, Record of Decisions (RODs), Action Memoranda, Pollution Reports (POLREPS), Remedial Action
(RA Reports), Close-out Reports, Five-year Reviews, NPL Deletion/Partial Deletion Notices are known reliable sources of data and often
provide the information necessary for making an HEUC evaluation with reasonable certainty.

Each EPA Region has an information management coordinator (IMC) who oversees reporting.

The Human Exposure Environmental Indicator data were collected beginning in FY 2002.

The collection methods and guidance for determining HEUC status are found in the Superfund Environmental Indicators Guidance,
"Human Exposures Revisions," March 2008 (http://www.epa.gov/superfund/accomp/ei/ei.htm).

(See item Performance Measure Term Definitions, for more information and references.)

Source data collection frequency: No set interval; varies by site.

Spatial Extent: National

Spatial detail:  Site, defined in database by latitude/longitude pair. In cases in which projects work on a smaller part of a site, geography
may be defined at a finer grain — the project-level.

2c. Source Data Reporting                                                                                                      |

The reporting format for the source data EPA uses to make decisions varies. In many cases, EPA reviews site-specific secondary data or existing
EPA-prepared reports. Documents such as risk assessments, RODs, Action Memoranda, POLREPS, RA Reports, Close-out Reports,
Five-year Reviews, and NPL Deletion/Partial Deletion Notices are known reliable sources of data and often provide the information
necessary for making an HEUC evaluation with reasonable certainty.

EPA's Regional offices and Headquarters enter data into CERCLIS on a rolling basis.

The human exposure status at a site is reviewed annually by the 10th working day in October, or at any time site conditions change.
CERCLIS is to be updated within 10 days of any change in status.

The instrument for determining the Site-Wide Human Exposure status at a site is found in the Superfund Environmental Indicators
Guidance, "Human Exposures Revisions," March 2008.  Determinations are made by regional staff and management and entered directly into
                                                           196                                     Back to Table of Contents

-------
 Goal 3                                                 Objective 3                                           Measure 151
CERCLIS.
http://www.epa.gov/superfund/accomp/ei/ei.htm

See Appendix B of the most recent SPIM, which is updated each fiscal year and contains definitions and documentation/coding guidance
for Superfund measures. The most current SPIM can be found here: http://epa.gov/superfund/policy/guidance.htm.
3. Information Systems and Data Quality Procedures
3a.  Information Systems
The HEUC determination is made directly in CERCLIS once it is determined that the site is Under Control and has been approved as such
by appropriate regional personnel.

CERCLIS database - The CERCLIS database is used by the Agency to track, store, and report Superfund site information (e.g., NPL
sites and non-NPL Superfund sites).

(For more information about CERCLIS, see Appendix E of the most recent SPIM, which is updated each fiscal year and contains
definitions and documentation/coding guidance for Superfund measures. The most current SPIM can be found here:
http://epa.gov/superfund/policy/guidance.htm.)

CERCLIS operation and further development are taking place under the following administrative control quality assurance procedures: 1)
Office of Environmental Information Interim Agency Life Cycle Management Policy Agency Directive; 2) the Office of Solid Waste  and
Emergency Response (OSWER) Quality Management Plan (QMP);  3) EPA IT  standards; 4) Quality Assurance Requirements  in all
contract vehicles under which CERCLIS is being developed and maintained; and 5) EPA IT security policies.  In addition, specific controls
are in place for system design, data conversion and data capture, as well as CERCLIS outputs.
 19-0585 (CERCLIS QAPP) 2005-0410.doc

CERCLIS adherence to the security policy has been audited.  Audit findings are attached to this record.
 CERCLIS July 5 2005 scan High Medium Response.xls; OSWER QMP printed 2010-03-23.pdf

OSWER Performance Assessment Tool (PAT).  This tool serves as the primary external servicing resource for organizing and reporting
OSWER's performance data. It collects information from OSWER program systems and conforms it for uniform reporting and data
provisioning.  PAT captures data from CERCLIS; replicates business logic used by CERCLIS for calculating measures; delivers that data
                                                          197                                    Back to Table of Contents

-------
 Goal 3                                                  Objective 3                                             Measure 151
to EPA staff and managers via a business intelligence dashboard interface for analytic and reporting use; and transmits data to the Budget
Automated System (BAS). No system specifications document is currently available for PAT; one will be provided when available.
For this measure, PAT transmits Regional-level data to BAS.

PAT operates under the OSWER QMP. PAT has a security certification confirming that a security policy is not necessary because no
sensitive data are handled and PAT is built upon the Oracle-based business intelligence system. PAT's security certification indicates that
it follows all security guidelines for EPA's Oracle Portal and that PAT (1) is not defined as a "Major Application" according to NIST
Special Publication 800-18, Guide for Developing Security Plans for Information Technology Systems, section 2.3.1; (2) does not store,
process, or transmit information whose degree of sensitivity is assessed as high, considering the requirements for availability, integrity,
and confidentiality according to NIST Special Publication 800-18, section 3.7.2; and (3) is not covered by EPA Order 2100.2A1,
Information Technology Capital Planning and Investment Control (CPIC).

EPA Headquarters is now scoping the requirements for an integrated Superfund Document Management System (SDMS)-CERCLIS
system, called the Superfund Enterprise Management System (SEMS).  Development work on SEMS began in FY 2007 and will
continue through FY 2013.

SEMS represents further re-engineering of the national reporting systems to include additional elements of EPA's Enterprise Architecture.
SEMS will provide a common platform for major Superfund systems and future IT development. It will be constructed in part using EPA
IT enterprise architecture principles and components. SEMS will provide a Superfund Program user gateway to various IT systems and
information collections.

3b. Data Quality Procedures                                                                                                    |
The regional SOPs for HEUC data entry, along with review and instructions/guidance for determining the Site-Wide Human Exposure
status at a site, are found in the Superfund Environmental Indicators Guidance, "Human Exposures Revisions," March 2008
(http://www.epa.gov/superfund/accomp/ei/ei.htm).

A list of all Headquarters-level data sponsors is provided in Exhibit  E.2 in SPIM Appendix E, Information Systems. The most current SPIM
can be found here:
http://www.epa.gov/superfund/policy/guidance.htm

CERCLIS: To ensure data accuracy and control, the following administrative  controls are in place: 1) Superfund Program Implementation
Manual  (SPIM), the program management manual that details what data must be reported; 2) Report Specifications, which are published
for each report detailing how reported data  are calculated; 3) Coding Guide, which contains technical instructions to data users including
Regional IMCs, program personnel, data owners, and data entry personnel;  4) Quick Reference Guides (QRG), which are available in the
CERCLIS Documents  Database and provide  detailed instructions on data entry for nearly every module  in CERCLIS; 5) Superfund
Comprehensive Accomplishment (SCAP) Reports within CERCLIS, which serve as  a means to track, budget, plan, and evaluate progress
towards meeting Superfund targets and measures; 6) a historical lockout feature in CERCLIS, so that past fiscal year data can be
                                                           198                                     Back to Table of Contents

-------
 Goal 3                                                  Objective 3                                             Measure 151
changed only by approved and designated personnel, with changes logged to a Change Log report; 7) the OSWER QMP; and 8) Regional Data
Entry Control Plans.

EPA Headquarters has developed data quality audit reports and Standard Operating Procedures, which address timeliness, completeness,
and accuracy, and has provided these reports to the Regions. In addition, as required by the Office of Management and Budget (OMB),
CERCLIS audit logs are reviewed monthly. The system was also re-engineered to bring CERCLIS into alignment with the Agency's
mandated Enterprise Architecture.  The first steps in this effort involved the migration of all 10 Regional and the Headquarters databases
into a single national database at the National Computing Center in Research Triangle Park (RTP) and the migration of SDMS to RTP to
improve efficiency and storage capacity. During this process SDMS was linked to CERCLIS, which enabled users to easily transition
between programmatic accomplishments as reported in CERCLIS and the actual documents that define and describe the accomplishments.


Regional Data Entry  Control Plans. Regions have established and published Data Entry Control Plans, which are a key component of
CERCLIS verification/validation procedures. The control plans include: (1) regional policies and procedures for entering data into
CERCLIS, (2) a review process to ensure that all Superfund accomplishments are supported by source documentation, (3) delegation  of
authorities for approval of data input into CERCLIS,  and (4) procedures to ensure that reported accomplishments meet accomplishment
definitions. In addition, regions document in their control plans the roles and responsibilities of key regional employees responsible for
CERCLIS data (e.g., regional project manager, information management coordinator, supervisor, etc.), and the processes to assure that
CERCLIS data are current, complete, consistent, and accurate. Regions may undertake centralized or decentralized approaches to data
management. These plans are collected annually for review by OSRTI/IMB (Information Management Branch).  [Source: SPIM FY11,
III.J and Appendix E, http://www.epa.gov/superfund/action/process/spim10/pdfs/appe.pdf]

Copies of the 2010 Regional Data Entry Control Plans are provided with this DQR. Current and past year plans are available by contacting
the Chief, Information  Management Branch, Office of Superfund Remediation and Technology Innovation.

Regions are expected to prepare Data Entry Control Plans consistent with the SPIM and the Headquarters guidance: "CERCLIS Data Entry
Control Plan Guidance," June 2009.
 2009_draft_CERCLIS_DECP_guidance_6-5-09_.pdf

Superfund Program Implementation Manual (SPIM). The SPIM should be the first source referred to for additional questions related
to program data and reporting. The SPIM is a planning document that defines program management priorities, procedures, and practices
for the Superfund program (including response, enforcement, and Federal facilities). The SPIM provides the link between the GPRA,
EPA's Strategic Plan, and the Superfund program's internal processes for setting priorities, meeting program goals, and tracking
performance. It establishes the process to track overall program progress through program targets and measures.

The SPIM provides standardized and common definitions for the Superfund program, and it is part of EPA's internal control structure. As
required by the Comptroller General of the United States, through generally accepted accounting  principles (GAAP) and auditing
                                                           199                                     Back to Table of Contents

-------
 Goal 3                                                  Objective 3                                            Measure 151
standards, this document defines program scope and schedule in relation to budget, and is used for audits and inspections by the
Government Accountability Office (GAO) and the Office of the Inspector General (OIG). The SPIM is developed on an annual basis.
Revisions to the SPIM are issued during the annual cycle as needed.

The SPIM contains three chapters and a number of appendices. Chapter 1 provides a brief summary of the Superfund program and
summarizes key program priorities and initiatives. Chapter 2 describes the budget process and financial management requirements. Chapter
3 describes program planning and reporting requirements and processes. Appendices A through I highlight program priorities and
initiatives and provide detailed programmatic information, including Annual Targets for GPRA performance measures, and targets for
Programmatic Measures. [Source: SPIM 2011, Chapter I]

The most current version of the SPIM can be found at: http://epa.gov/superfund/policy/guidance.htm


Data Flow:

Step 1.  Original data sources provide information.

Step 2.  EPA Region reviews and determines HEUC status at the site and adjusts CERCLIS records as needed.

Step 3.  Headquarters' OSRTI data sponsor reviews and approves/disapproves written justifications for Regional determinations of "Not
Under Control" and "Insufficient Data," using data from CERCLIS. Data sponsor works with Regional staff to ensure that disapproved
justifications comport with Superfund Program guidance.

Step 4.  OSWER's PAT pulls data from CERCLIS. Headquarters staff compare PAT results to CERCLIS results. If PAT does not match
CERCLIS, there was an error with the upload and the data are reloaded. Headquarters staff enter into PAT the Annual Commitment
System (ACS) status information for each measure and, if necessary, a status explanation.

Step 5.  Headquarters approves PAT results, and PAT pushes results into BAS.

Step 6.  BAS aggregates Regional data into a national total. OSRTI reporting lead reviews and certifies results.	
3c. Data Oversight                                                                                                           |
The Superfund program has a "data sponsorship" approach to database oversight. Headquarters staff and managers take an active role in
improving the quality of data stored in CERCLIS by acting as data sponsors.

Data sponsorship promotes consistency and communication across the Superfund program. Headquarters data sponsors communicate and
gain consensus from data owners on data collection and reporting processes. Data sponsors ensure that the data they need to monitor
performance and compliance with program requirements is captured and stored properly in CERCLIS. To meet this goal, headquarters data
sponsors identify their data needs, develop data field definitions, and distribute guidance requiring submittal of these data. Data owners are
                                                           200                                     Back to Table of Contents

-------
 Goal 3                                                   Objective 3                                             Measure 151
normally site managers that need the data in support of site work. Data owners follow the guidance they receive from data sponsors, as they
acquire and submit data. Headquarters data sponsors assist data owners in maintaining and improving the quality of Superfund program
data. These data are available for data evaluation and reporting. Data sponsorship helps promote consistency in both national and regional
reporting. In addition, data sponsorship provides a tool to improve data quality through program evaluation and adjustments in guidance to
correct weaknesses detected. Data sponsors may conduct audits to determine if there are systematic data problems (e.g., incorrect use of
codes, data gaps, etc.).  A list of all Headquarters-level data sponsors is provided in Exhibit E.2 in SPIM Appendix E, Information Systems.
The most current SPIM can be found here: http://epa.gov/superfund/policy/guidance.htm [source example for process: Region 2 SOP, page
53, entry for Data-Entry Training/Oversight]

Specific roles and responsibilities of data sponsors:
— Identify data needs;
— Oversee the process of entering data into the system;
— Determine the adequacy of data for reporting purposes;
— Conduct focus studies of data entered (A focus study is where a data sponsor identifies a potential or existing data issue to a data owner
(see below), IMC, or other responsible person to determine if a data quality problem  exists, and to solve the problem, if applicable. (IMC
responsibilities discussed below.) Focus studies can be informal via electronic messages.);
— Provide definitions for data elements;
— Promote consistency across the Superfund program;
— Initiate changes in CERCLIS as the program changes;
— Provide guidance requiring submittal of these data;
— Support the development of requirements for electronic data submission; and
— Ensure there is "objective" evidence to support the accomplishment data entered in CERCLIS, by identifying data requirements and
checking compliance through periodic reviews of a random CERCLIS data sample.  [Source: SPIM 2010, III.E and E.A.5]

The primary responsibilities of data owners are (1) to enter and maintain data in CERCLIS and (2) assume responsibility for complete,
current, consistent, and accurate data. The data owners for specific data are clearly identified in the system audit tables. Regions annually
update region-specific Data Entry Control Plans (DECP).  Among other things, Regional data entry control plans identify which Data
Sponsors/Data Owners are responsible for different aspects of data entry. (See item 3b., Data Quality Procedures for more information on
Data Entry Control Plans.)

Information Management Coordinators (IMC).  In each Region, the IMC is a senior position that serves as the regional lead for all
Superfund program and CERCLIS systems management activities. The following lead responsibilities for regional program planning and
management rest with the IMC:
— Coordinate program planning, budget development, and reporting activities;
— Ensure regional planning and accomplishments are complete, current,  and consistent, and accurately reflected in CERCLIS by working
with data sponsors and data owners;
— Provide liaison to HQ on SCAP process and program evaluation issues;
                                                            201                                     Back to Table of Contents

-------
 Goal 3                                                 Objective 3                                            Measure 151
— Coordinate regional evaluations by headquarters;
— Ensure that the quality of CERCLIS data are such that accomplishments and planning data can be accurately retrieved from the system;
and
— Ensure there is "objective" evidence to support accomplishment data entered in CERCLIS. (Objective Evidence Rule: "All transactions
must be supported by objective evidence, that is, documentation that a third party could examine and arrive at the same conclusion.")
[Source: SPIM 2010, III.E]

The Information Management Officer (IMO) & Director, Information Management and Data Quality Staff, OSWER, is the lead point of
contact for information about the data from CERCLIS.

The Project Manager for CERCLIS oversees and is the approving authority for quality-related CERCLIS processes, and is closely
supported by a Contract Task Manager.  (See the CERCLIS QAPP,  attached, for more information.) The lead point of contact for
information about the data from CERCLIS is the Director, Information Management and Data Quality Staff,  Office of Solid Waste and
Emergency Response.

PAT Data Entry

The Annual Commitment System (ACS) Coordinator in OSRTI ensures that CERCLIS data for this measure  are correctly loaded into PAT.
The ACS Coordinator then works with the data sponsor to review uploaded data, edit records as appropriate,  and then push data to
ACS—part of the Office of Chief Financial Officer's (OCFO) BAS. PAT is maintained by OSWER's System Manager who ensures  that the
PAT system operates correctly, based on business logic agreed to by OSRTI.	
3d. Calculation Methodology                                                                                                  |
The performance measure is a specific variable entered into CERCLIS following specific coding guidance and corresponding supporting
site-specific documentation.

The unit of measure is the number of sites.  The calculation includes only NPL sites.
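As a worked illustration of the count (illustrative Python; CERCLIS field names are hypothetical, and treating the two higher protection categories as also "under control" is an assumption of this sketch):

    # The three "under control" categories from the five-category scale above.
    UNDER_CONTROL = {
        "Current human exposures under control",
        "Current human exposures under control and protective remedy or remedies in place",
        "Current human exposures under control, and long-term human health protection achieved",
    }

    sites = [
        {"name": "Site 1", "npl": True,  "status": "Current human exposures under control"},
        {"name": "Site 2", "npl": True,  "status": "Current human exposures not under control"},
        {"name": "Site 3", "npl": False, "status": "Current human exposures under control"},
    ]

    # Only NPL sites count toward the measure.
    result = sum(1 for s in sites if s["npl"] and s["status"] in UNDER_CONTROL)
    print(result)  # 1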

References:

Superfund Data Element Dictionary (DED). The Superfund DED is available online at:
http://www.epa.gov/superfund/sites/ded/index.htm. The DED provides definitions and descriptions of elements, tables and codes from the
CERCLIS database used by the Superfund program. It also provides additional technical information for each entry, such  as data type, field
length and primary table. Using the DED, you can look up terms by table name or element name, or search the entire dictionary by
keyword.

Other additional references that may be useful:

Coding Guide.  The Superfund Coding Guide contains technical instructions to data users including Regional IMCs, program personnel,
                                                          202                                     Back to Table of Contents

-------
 Goal 3                                                  Objective 3                                             Measure 151
data owners, and data entry personnel.  The Remedial component of the Coding Guide is attached to this record.

 Coding Guide 2009.pdf

Quick Reference Guides (QRG).  Superfund Quick Reference Guides are available in the CERCLIS Documents Database and provide
detailed instructions on data entry for nearly every module in CERCLIS. Sample QRGs are available for entering data related to Remedial
Action Starts.
 Example QRG RA Start.doc

Site Status and Description document: this is a QRG for CERCLIS users, for filling in information related to site status and description.
 Site Status and Description.doc
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Data Sponsor for HEUC, Annual Commitment System coordinator, and National Program Office (NPO) management.
Progress reporting is done periodically as a check, while official numbers are reported annually.
4b. Data Limitations/Qualifications                                                                                              |
Users of HEUC data should recognize that HEUC status is reviewed at least annually, on a schedule that varies based upon site
characteristics.  This status review can result in a change in status with regard to this measure, with a site moving from HEUC status to
non-HEUC status.	
4c. Third-Party Audits                                                                                                         |
Three audits, two by the Office of Inspector General (OIG) and one by the Government Accountability Office (GAO), assessed the validity
of the data in CERCLIS.  The OIG audit report, Superfund Construction Completion Reporting (No. E1SGF7_05_0102_8100030), dated
December 30, 1997, concluded that the Agency "has good management controls to ensure accuracy of the information that is reported," and
"Congress and the public can rely upon the information EPA provides regarding construction completions."  The GAO report, Superfund:
Information on the Status of Sites (GAO/RCED-98-241), dated August 28, 1998, estimated that the cleanup status of National Priorities List
(NPL) sites reported by CERCLIS as of September 30, 1997, is accurate for 95 percent of the sites. Another OIG audit, Information
                                                           203                                     Back to Table of Contents

-------
 Goal 3                                                  Objective 3                                             Measure 151
Technology - Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) Data Quality  (Report
No. 2002-P-00016), dated September 30, 2002, evaluated the accuracy, completeness, timeliness, and consistency of the data entered into
CERCLIS.  The report provided  11 recommendations  to improve controls for CERCLIS data  quality.  EPA  has  implemented these
recommendations and continues to use the monitoring tools for verification.

The IG annually reviews the end-of-year CERCLIS data, in an informal process, to verify the data that support the performance measures.
Typically, there are no published results.

EPA received an unqualified audit opinion from the OIG on its annual financial statements; the OIG recommended several corrective
actions. The Office of the Chief Financial Officer indicates that corrective actions will be taken.
 Record Last Updated: 02/13/2012 01:16:49 PM
                                                           204                                      Back to Table of Contents

-------
  Goal 3                                                 Objective 3                                           Measure CA1
  Measure Code :  CA1 - Cumulative percentage of RCRA facilities with human
  exposures to toxins under control.
  Office of Solid Waste and Emergency Response (OSWER)
   1. Measure and DQR Metadata
   Goal Number and Title
3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title
                                                 3 - Restore Land
   Sub-Objective Number and Title
3 - Cleanup Contaminated Land
   Strategic Target Code and Title
2 - By 2015, increase the number of Superfund final and deleted NPL sites and RCRA facilities
   Managing Office
   Office of Resource Conservation and Recovery
   Performance Measure Term Definitions
The performance measure is used to track the RCRA Corrective Action Program's progress in dealing with immediate threats to human health and
groundwater resources. It is meant to summarize and report on facility-wide environmental conditions at RCRA Corrective Action Program's facilities
nation-wide. For background information, please visit: http://www.epa.gov/osw/hazard/correctiveaction/index.htm.

Cumulative - made up of accumulated parts; increasing by successive additions

RCRA Facilities - facilities subject to restriction or action from the Resource Conservation and Recovery Act;

Human Exposure to Toxins - pathways or means by which toxic substances may come into direct contact with a person

Definition of "under control":  Known and suspected facility-wide conditions are evaluated using a series of simple questions and flow-chart logic to
arrive at a reasonable, defensible determination. These questions were issued as a memorandum titled: Interim Final Guidance for RCRA Corrective
Action Environmental Indicators, Office of Solid Waste, February 5, 1999).  Lead regulators (delegated state or EPA Region) for the facility (authorized
state or EPA) make the environmental indicator determination, but facilities or their consultants may assist EPA in the evaluation by providing
information on the current environmental conditions. The determinations are entered directly into RCRAInfo. EPA collects the determinations as made
by the lead regulator, and this total is used for this measure.
 2. Data Definition and Source  Reporting
 2a. Original Data Source
 States and regions enter all data on determinations made.
 2b. Source Data Collection
 Known and suspected facility-wide conditions are evaluated using a series of simple questions and flow-chart logic to arrive at a

                                                           205                                    Back to Table of Contents

-------
 Goal 3                                                   Objective 3                                             Measure CA1
reasonable, defensible determination. These questions were issued in a memorandum titled Interim Final Guidance for RCRA Corrective
Action Environmental Indicators (Office of Solid Waste, February 5, 1999).  Lead regulators for the facility (the authorized state or the EPA
Region) make the environmental indicator determination (e.g., whether human exposures to toxins are under control), but facilities or their
consultants may assist EPA in the evaluation by providing information on the current environmental conditions.

States and regions generate the data and manage data quality related to timeliness and accuracy (i.e., the environmental conditions and determinations
are correctly reflected by the data). EPA has provided guidance and training to states and regions to help ensure consistency in those determinations.
RCRAInfo documentation, available to all users on-line, provides guidance to facilitate the generation and interpretation of data.	
2c. Source Data Reporting                                                                                                        |

 States: With respect to whether human exposures to toxins are controlled, a "yes," "no," or "insufficient information" entry is made in the database.
(EPA makes the same kind of entry for facilities in non-delegated states.)
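For illustration, such determinations might be tallied as follows (illustrative Python; facility names and values are hypothetical, and RCRAInfo's actual schema is not shown here):

    from collections import Counter

    # Hypothetical environmental indicator entries, one per facility.
    entries = {
        "Facility A": "yes",
        "Facility B": "no",
        "Facility C": "insufficient information",
        "Facility D": "yes",
    }

    tally = Counter(entries.values())
    print(tally["yes"])  # 2 facilities with human exposures under control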
3.  Information Systems and  Data Quality Procedures
3a. Information Systems
RCRAInfo, the national database which supports EPA's RCRA program, contains information on entities (generically referred to as
"handlers") engaged in hazardous waste generation and management activities regulated under the portion of the Resource Conservation
and Recovery Act (RCRA) that provides for regulation of hazardous waste.

RCRAInfo has several different modules, and allows for tracking of information on the regulated universe of RCRA hazardous waste
handlers, such as facility status, regulated activities, and compliance history. The system also captures detailed data on the generation of
hazardous waste by large quantity generators and on waste management practices from treatment, storage, and disposal facilities. Within
RCRAInfo, the Corrective Action Module tracks the status of facilities that require, or may require, corrective actions, including the
information related to the performance measure.

RCRAInfo is web accessible, providing a convenient user interface for Federal, state, and local managers and encouraging development of
in-house expertise at controlled cost; states also have the option to use commercial off-the-shelf software to develop reports from database
tables.

 RCRAInfo is currently at Version 5 (V5), which was released in March 2010. V5 expanded on V4's capabilities and made updates to the
Handler module to support two new rules that went into effect in 2009.

Access to RCRAInfo is open only to EPA Headquarters, Regional, and authorized state personnel. It is not available to the general public
because the  system contains enforcement sensitive data. The general public is  referred to EPA's Envirofacts Data Warehouse to obtain
information  on RCRA-regulated hazardous waste sites.  This non-sensitive information is  supplied from RCRAInfo to Envirofacts. For
more information, see: http://www.epa.gov/enviro/index.html.

                                                             206                                      Back to Table of Contents

-------
 Goal 3                                                    Objective 3                                             Measure CA1
Find more information about RCRAInfo at: http://www.epa.gov/enviro/html/rcris/index.html.
3b. Data Quality Procedures
Manual procedures: EPA Corrective Action sites are monitored on a facility-by-facility basis and QA/QC procedures are in place to
ensure data validity.

Automated procedures: Within RCRAInfo, the application software enforces structural controls that ensure that high-priority national
components of the data are properly entered. Training on use of RCRAInfo is provided on a regular basis, usually annually, depending on
the nature of systems changes and user needs. The latest version of RCRAInfo, Version 5 (V5), was released in March 2010 and has many
added components that will help the user identify errors in the system.	
3c. Data Oversight                                                                                                                |
The Information Collection and Analysis Branch (ICAB) maintains a list of the Headquarters, Regional and delegated state/territory users and controls
access to the system. Branch members ensure data collection is on track, conduct QA reports, and work with Regional and state partners to resolve
issues as they are discovered.

3d. Calculation Methodology                                                                                                       |
Annual progress for each measure is found by subtracting the cumulative progress at the end of the previous fiscal year from the cumulative
progress at the end of the current fiscal year.
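Expressed as arithmetic, the subtraction reads as below (illustrative Python; the figures are hypothetical):

    def annual_progress(cumulative_current_fy, cumulative_prior_fy):
        # Annual progress = cumulative progress at end of the current fiscal
        # year minus cumulative progress at end of the previous fiscal year.
        return cumulative_current_fy - cumulative_prior_fy

    # Hypothetical example: 1,450 facilities under control at the end of the
    # current fiscal year versus 1,380 a year earlier yields 70.
    print(annual_progress(1450, 1380))  # 70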



4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting                                                                                        |

Program Implementation and Information Division (PIID) data analysts are responsible  for the reporting.

4b. Data Limitations/Qualifications                                                                                                  |
With emphasis on data quality, EPA Headquarters works with the EPA Regional offices to ensure the national data pulls are consistent with
the Regional data pulls.

4c. Third-Party Audits                                                                                                             |
U.S. Government Accountability Office (GAO) report: "Hazardous Waste: Early Goals Have Been Met in EPA's Corrective Action
Program, but Resource and Technical Challenges Will Constrain Future Progress" (GAO-11-514, August 25, 2011,
http://www.gao.gov/assets/330/321743.pdf)
 Record Last Updated: 02/13/2012 01:16:48 PM
                                                             207                                       Back to Table of Contents

-------
  Goal 3                                                Objective 3                                           Measure CA2
  Measure Code : CA2 - Cumulative percentage of RCRA facilities with migration of
  contaminated groundwater under control.
  Office of Solid Waste and Emergency Response (OSWER)
   1.  Measure and DQR Metadata
   Goal Number and Title
3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title
                                                3 - Restore Land
   Sub-Objective Number and Title
3 - Cleanup Contaminated Land
   Strategic Target Code and Title
3 - By 2015, increase number of Resource Conservation and Recovery Act (RCRA) facilities
   Managing Office
  Office of Resource Conservation and Recovery
   Performance Measure Term Definitions
 The performance measure is used to track the RCRA Corrective Action Program's progress in dealing with immediate threats to human
health and groundwater resources. It is meant to summarize and report on facility-wide environmental conditions at RCRA Corrective
Action Program's facilities nation-wide. For background information, please visit:
http://www.epa.gov/osw/hazard/correctiveaction/index.htm.

Cumulative - made up of accumulated parts; increasing by successive additions

RCRA Facilities - facilities subject to restriction or action from the Resource Conservation and Recovery Act;

Migration - to change position; movement from one location to another

Contaminated Groundwater - water in the subsurface which has become tainted with any number of dissolved contaminants at levels greater
than the prescribed environmental standard levels

Definition of "under control": Known and suspected facility-wide conditions are evaluated using a series of simple questions and flow-chart logic to
arrive at a reasonable, defensible determination. These questions were issued as a memorandum titled: Interim Final Guidance for RCRA Corrective
Action Environmental Indicators, Office of Solid Waste, February 5, 1999). Lead regulators (delegated state or EPA Region) for the facility (authorized
state or EPA) make the environmental indicator determination, but facilities or their consultants may assist EPA in the evaluation by providing
information on the current environmental conditions. The determinations are entered directly into RCRAInfo. EPA collects the determinations as made
by the  lead regulator, and this total is used for this measure.
2.  Data Definition and Source Reporting
2a. Original Data Source
States and regions enter all data on determinations made.
2b. Source Data Collection
Known and suspected facility-wide conditions are evaluated using a series of simple questions and flow-chart logic to arrive at a reasonable, defensible determination. These questions were issued in a memorandum titled Interim Final Guidance for RCRA Corrective Action Environmental Indicators (Office of Solid Waste, February 5, 1999). Lead regulators for the facility (authorized state or EPA) make the environmental indicator determination (such as whether migration of contaminated groundwater is under control), but facilities or their consultants may assist EPA in the evaluation by providing information on the current environmental conditions.

States and regions generate the data and manage data quality related to timeliness and accuracy (i.e., the environmental conditions and determinations
are correctly reflected by the data). EPA has provided guidance and training to states and regions to help ensure consistency in those determinations.
RCRAInfo documentation, available to all users on-line, provides guidance to facilitate the generation and interpretation of data.
2c. Source Data Reporting
States: With respect to releases to groundwater controlled, a "yes," "no," or "insufficient information" entry is made in the database. (EPA makes the same kind of entry for facilities in non-delegated states.)
3.  Information Systems and Data Quality Procedures
3a. Information Systems
RCRAInfo, the national database which supports EPA's RCRA program, contains information on entities (generically referred to as
"handlers") engaged in hazardous waste generation and management activities regulated under the portion of the Resource Conservation
and Recovery Act (RCRA) that provides for regulation of hazardous waste.

RCRAInfo has several different modules, and allows for tracking of information on the regulated universe of RCRA hazardous waste
handlers, such as facility status, regulated activities, and compliance history. The system also captures detailed data on the generation of
hazardous waste by large quantity generators and on waste management practices from treatment, storage, and disposal facilities. Within
RCRAInfo, the Corrective Action Module tracks the status of facilities that require, or may require, corrective actions, including the
information related to the performance measure.

RCRAInfo is web accessible, providing a convenient user interface for Federal, state, and local managers and encouraging development of in-house expertise at controlled cost; states also have the option to use commercial off-the-shelf software to develop reports from database tables.

 RCRAInfo is currently at Version 5 (V5), which was released in March 2010. V5 expanded on V4's capabilities and made updates to the
Handler module to support two new rules that went into effect in 2009.

Access to RCRAInfo is open only to EPA Headquarters, Regional, and authorized state personnel. It is not available to the general public
because the system contains enforcement sensitive data. The general public is referred to EPA's Envirofacts Data Warehouse to obtain
information on RCRA-regulated hazardous waste sites.  This non-sensitive information is supplied from RCRAInfo to Envirofacts. For
more information, see: http://www.epa.gov/enviro/index.html.

Find more information about RCRAInfo at: http://www.epa.gov/enviro/html/rcris/index.html.
3b. Data Quality Procedures
Manual procedures: EPA Corrective Action sites are monitored on a facility-by-facility basis and QA/QC procedures are in place to
ensure data validity.

Automated procedures: Within RCRAInfo, the application software enforces structural controls that ensure that high-priority national
components of the data are properly  entered. Training on use of RCRAInfo is provided on a regular basis, usually annually, depending on
the nature of systems changes and user needs.  The latest version of RCRAInfo, Version 5 (V5), was released in March 2010 and has many
added components that will help the  user identify errors in the system.	
3c. Data Oversight
The Information Collection and Analysis Branch (ICAB) maintains a list of the Headquarters, Regional and delegated state/territory users and controls
access to the system. Branch members ensure data collection is on track, conduct QA reports, and work with Regional and state partners to resolve
issues as they are discovered.

3d. Calculation Methodology
Annual progress for each measure is found by subtracting the cumulative progress at the end of the previous fiscal year from the cumulative progress at the end of the current fiscal year.



4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Program Implementation and Information Division (PIID) data analysts are responsible for the reporting.

4b. Data Limitations/Qualifications
With emphasis on data quality, EPA Headquarters works with the EPA Regional offices to ensure the national data pulls are consistent with the Regional data pulls.

4c. Third-Party Audits
US Government Accountability Office (GAO) report: "Hazardous Waste: Early Goals Have Been Met in EPA's Corrective Action Program, but Resource and Technical Challenges Will Constrain Future Progress." (GAO-11-514, August 25, 2011, http://www.gao.gov/assets/330/321743.pdf)
 Record Last Updated: 02/13/2012 01:16:48 PM
-------
  Goal 3                                               Objective 3                                         Measure CA5
  Measure Code : CA5 - Cumulative percentage of RCRA facilities with final
  remedies constructed.
  Office of Solid Waste and Emergency Response (OSWER)
   1. Measure and DQR Metadata
   Goal Number and Title
3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title
                                               3 - Restore Land
   Sub-Objective Number and Title
3 - Cleanup Contaminated Land
   Strategic Target Code and Title
4 - By 2015, increase the number of RCRA facilities with final remedies constructed
   Managing Office
  Office of Resource Conservation and Recovery
   Performance Measure Term Definitions
 The remedy construction measure tracks the RCRA Corrective Action Program's progress in moving sites towards final cleanup. For
background information, please visit: http://www.epa.gov/osw/hazard/correctiveaction/index.htm.

Cumulative - made up of accumulated parts; increasing by successive additions

RCRA Facilities - facilities subject to restriction or action from the Resource Conservation and Recovery Act

Definition of "final remedies constructed":
The lead regulators (delegated state or EPA Region) for the facility select the remedy and determine when the facility has completed
construction of that remedy. EPA collects the determinations as made by the lead regulator and this total is used for this measure.
 2. Data Definition and Source Reporting
 2a. Original Data Source
 States and regions enter all data on determinations made.
 2b. Source Data Collection
 Known and suspected facility-wide conditions are evaluated using a series of simple questions and flow-chart logic to arrive at a reasonable, defensible determination. These questions were issued in a memorandum titled Interim Final Guidance for RCRA Corrective Action Environmental Indicators (Office of Solid Waste, February 5, 1999). The lead regulators (delegated state or EPA Region) for the facility select the remedy and determine when the facility has completed construction of that remedy.
 Construction completions are collected on both an area-wide and site-wide basis.

States and regions generate the data and manage data quality related to timeliness and accuracy (i.e., the environmental conditions and determinations
are correctly reflected by the data). EPA has provided guidance and training to states and regions to help ensure consistency in those determinations.
RCRAInfo documentation, available to all users on-line, provides guidance to facilitate the generation and interpretation of data.
2c. Source Data Reporting
The remedy construction measure tracks the RCRA Corrective Action Program's progress in moving sites towards final cleanup. The date
of remedy construction is entered in the database. (EPA makes the same kind of entry related to facilities in non-delegated states.)	


3. Information Systems and Data Quality Procedures
3a. Information Systems

RCRAInfo, the national database which supports EPA's RCRA program,  contains information on entities (generically referred to as
"handlers") engaged in hazardous waste generation and management activities regulated under the portion of the Resource Conservation
and Recovery Act (RCRA) that provides for regulation of hazardous waste.

RCRAInfo has several different modules, and allows for tracking of information on the regulated universe of RCRA hazardous waste
handlers, such as facility status, regulated activities, and compliance history. The system also captures detailed data on the generation of
hazardous waste by large quantity generators and on waste management practices from treatment, storage, and disposal facilities. Within
RCRAInfo, the Corrective Action Module tracks the status of facilities that require, or may require, corrective actions, including the
information related to the performance measure.

RCRAInfo is web accessible, providing a convenient user interface for Federal, state, and local managers and encouraging development of in-house expertise at controlled cost; states also have the option to use commercial off-the-shelf software to develop reports from database tables.

RCRAInfo is currently at Version 5 (V5), which was released in March 2010.  V5 expanded on V4's capabilities and made updates to the
Handler module to support two new rules that went into effect in 2009.

Access to RCRAInfo is open only to EPA Headquarters, Regional, and authorized state personnel. It is not available to the general public
because the system contains  enforcement sensitive data. The general public is referred to EPA's Envirofacts Data Warehouse to obtain
information on RCRA-regulated hazardous waste sites.  This non-sensitive information is supplied from RCRAInfo to Envirofacts. For
more information, see: http://www.epa.gov/enviro/index.html.

Find more information about RCRAInfo at: http://www.epa.gov/enviro/html/rcris/index.html.
3b. Data Quality Procedures
Manual procedures: EPA Corrective Action sites are monitored on a facility-by-facility basis and QA/QC procedures are in place to
ensure data validity.


Automated procedures: Within RCRAInfo, the application software enforces structural controls that ensure that high-priority national
components of the data are properly entered. Training on use of RCRAInfo is provided on a regular basis, usually annually, depending on
the nature of systems changes and user needs.  The latest version of RCRAInfo, Version 5 (V5), was released in March 2010 and has many
added components that will help the user identify errors in the system.	
3c. Data Oversight
The Information Collection and Analysis Branch (ICAB) maintains a list of the Headquarters, Regional and delegated state/territory users and controls
access to the system. Branch members ensure data collection is on track, conduct QA reports, and work with Regional and state partners to resolve
issues as they are discovered.

3d. Calculation Methodology
The remedy construction measure tracks the RCRA Corrective Action Program's progress in moving sites towards final cleanup. As with the environmental indicator determinations, the lead regulators for the facility select the remedy and determine when the facility has completed construction of that remedy. Construction completions are collected on both an area-wide and site-wide basis for the sake of the efficiency measure.


4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Program Implementation and Information Division (PIID) data analysts are responsible for the reporting.

4b. Data Limitations/Qualifications
With emphasis on data quality, EPA Headquarters works with the EPA Regional offices to ensure the national data pulls are consistent with the Regional data pulls.

4c. Third-Party Audits
US Government Accountability Office (GAO) report: "Hazardous Waste: Early Goals Have Been Met in EPA's Corrective Action Program, but Resource and Technical Challenges Will Constrain Future Progress." (GAO-11-514, August 25, 2011, http://www.gao.gov/assets/330/321743.pdf)
 Record Last Updated: 02/13/2012 01:16:48 PM
-------
  Goal 3                                                Objective 3                                           Measure S10
  Measure Code : S10 - Number of Superfund sites ready for anticipated use
  site-wide.
  Office of Solid Waste and Emergency Response (OSWER)
   1.  Measure and DQR Metadata
   Goal Number and Title
3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title
                                                3 - Restore Land
   Sub-Objective Number and Title
3 - Cleanup Contaminated Land
   Strategic Target Code and Title
7 - By 2015, ensure that 799 Superfund NPL sites are "sitewide ready for anticipated use"
   Managing Office
  OSRTI
   Performance Measure Term Definitions
 Definition of Site: "Sites" refers only to National Priorities List (NPL) sites. (See below for definition of NPL.) The term "site" itself is
not explicitly defined under the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) or by the Superfund
program; instead "site" is defined indirectly in CERCLA's definition of "facility," as follows: "The term 'facility' means (A) any building,
structure, installation, equipment, pipe or pipeline (including any pipe into a sewer or publicly owned treatment works), well, pit, pond,
lagoon, impoundment, ditch, landfill, storage container, motor vehicle, rolling stock, or aircraft, or (B) any site or area where a hazardous
substance has been deposited, stored, disposed of, or placed, or otherwise come to be located; but does not include any consumer product in
consumer use or any vessel."  (CERCLA, Title I, Section 101, (9)).

Definition of Sitewide Ready for Anticipated Use (SWRAU):  Where for the entire construction complete NPL site: All cleanup goals in
the Record(s) of Decision or other remedy decision document(s) have been achieved for media that may affect current and reasonably
anticipated future land uses of the site, so that there are no unacceptable risks; and all institutional or other controls required in the
Record(s) of Decision or other remedy decision document(s) have been put in place.

The Human Exposure determination for sites that qualify for the Sitewide Ready for Anticipated Use measure should be either:
• "Current Human Exposure Controlled and Protective Remedy in Place"; or
• "Long-Term Human Health Protection Achieved"

In addition, all acreage at the site must be Ready for Anticipated Use (RAU) (i.e., the Superfund Remedial/Federal Facilities Response Universe Acres minus the total RAU Acres must be zero).
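Taken together, these criteria amount to a conjunction of checks; the Python sketch below renders them schematically. All field names are invented for illustration and do not correspond to actual CERCLIS data elements.

    # Schematic sketch of the SWRAU eligibility criteria described above.
    ALLOWED_HEUC = {
        "Current Human Exposure Controlled and Protective Remedy in Place",
        "Long-Term Human Health Protection Achieved",
    }

    def qualifies_for_swrau(site):
        return (
            site["construction_complete"]            # entire NPL site is construction complete
            and site["cleanup_goals_achieved"]       # all goals in the decision documents met
            and site["controls_in_place"]            # all required institutional/other controls
            and site["heuc_status"] in ALLOWED_HEUC  # qualifying Human Exposure determination
            and site["universe_acres"] - site["rau_acres"] == 0  # all acreage is RAU
        )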

For more information about the SWRAU performance measure, visit: http://www.epa.gov/superfund/programs/recycle/pdf/sitewide_a.pdf

Also, see Appendix B of the most recent Superfund Program Implementation Manual (SPIM), which is updated each fiscal year and contains
definitions and documentation/coding guidance for Superfund measures.  The most current SPIM can be found here:
http://epa.gov/superfund/policy/guidance.htm.

Superfund Alternative Approach (SAA) sites: The program tracks SAA sites that meet Sitewide Ready for Anticipated Use criteria, but does not target or report official results at this time.

Definition of National Priorities List (NPL):  Sites are listed on the National Priorities List (NPL) upon completion of Hazard Ranking
System (HRS) screening, public solicitation of comments about the proposed site, and final placement of the site on the NPL after all
comments have been addressed. The NPL primarily serves as an information and management tool. It is a part of the Superfund cleanup
process and is updated periodically.  Section 105(a)(8)(B) of CERCLA as amended, requires that the statutory criteria provided by the HRS
be used to prepare a list of national priorities among the known releases or threatened releases of hazardous substances, pollutants, or
contaminants throughout the United States. This list, which is Appendix B of the National Contingency Plan, is the NPL. Visit the HRS
Toolbox (http://www.epa.gov/superfund/sites/npl/hrsres/index.htm) page  for guidance documents that are used to determine if a site is a
candidate for inclusion on the NPL.  [Source: Superfund website, http://www.epa.gov/superfund/sites/npl/npl_hrs.htm]

(Also see Appendix B of the most recent SPIM, which is updated each fiscal year and contains definitions and documentation/coding
guidance for Superfund measures. The most current SPIM can be found here: http://epa.gov/superfund/policy/guidance.htm.)

The Superfund Program's performance measures are used to demonstrate the agency's progress in site cleanup and reuse. Each measure marks a significant step in ensuring human health and environmental protection at Superfund sites.

References:

U.S. Environmental Protection Agency, EPA Performance and Accountability Reports, http://www.epa.gov/ocfo/par/index.htm.

U.S. Environmental Protection Agency, Superfund Accomplishment and Performance Measures,
http://www.epa.gov/superfund/accomplishments.htm.

U.S. Environmental Protection Agency, Federal Facilities Restoration and Reuse Office - Performance measures,
http://www.epa.gov/fedfac/documents/measures.htm.

U.S. Environmental Protection Agency, Office of Inspector General, Information Technology - Comprehensive Environmental Response,
Compensation, and Liability Information System (CERCLIS) Data Quality, No. 2002-P-00016, http://www.epa.gov/oigearth/eroom.htm.

U.S. Government Accountability Office, "Superfund: Information on the Status of Sites" (GAO/RCED-98-241),
http://www.gao.gov/archive/1998/rc98241.pdf

U.S. Environmental Protection Agency, Office of Superfund Remediation and Technology Innovation, Superfund Program Implementation
Manuals (SPIM), http://www.epa.gov/superfund/policy/guidance.htm (accessed July 30, 2009).

U.S. Environmental Protection Agency, Office of Solid Waste and Emergency Response, "OSWER Quality Management Plan",
http://www.epa.gov/swerffrr/pdf/oswer_qmp.pdf

U.S. Environmental Protection Agency, Office of Environmental Information, EPA System Life Cycle Management Policy Agency
Directive 2100.5, http://www.epa.gov/oamhpodl/adm_placement/ITS_BISS/slcmmgmt.pdf

U.S. Environmental Protection Agency, Office of Environmental Information, EPA's Information Quality Guidelines,
http://www.epa.gov/quality/informationguidelines.
 2. Data Definition and Source Reporting
 2a. Original Data Source
 Original data sources vary, and multiple data sources can be used for each site.  Typical data sources are EPA personnel, contractors
 (directly to EPA or indirectly, through the interagency agreement recipient or cooperative agreement recipient), U.S. Army Corps of
 Engineers (interagency agreement recipient), and states/tribes/other political subdivisions (cooperative agreement recipients).  EPA also
 collects data via pre-final inspections at sites.

 (See item Performance Measure Term Definitions in Tab 1, for more information. Also, detailed information on requirements for source
 data and completion procedures can be found on the following Superfund website:
 http://www.epa.gov/superfund/programs/npl_hrs/closeout/index.htm)

 2b. Source Data Collection
 Collection mode varies, typically with multiple collection modes at each site. Collection typically involves some combination of
 environmental data collection, estimation and/or tabulation of records/activities. Documents such as risk assessments, Records of Decision (RODs), Action Memoranda, Pollution Reports (POLREPS), Remedial Action (RA) Reports, Close-out Reports, Five-year Reviews, and NPL Deletion/Partial Deletion Notices are known reliable sources of data and often provide the information necessary for making a SWRAU evaluation with reasonable certainty. Regions should also ensure consistency between the SWRAU determination, the All ICs Implemented indicator, and the HEUC environmental indicator.

 The SWRAU baseline was established in 2006; data collection began in FY 2007 and continues through FY 2012.

 The Guidance and Checklist can be found at: http://www.epa.gov/superfund/programs/recycle/pdf/sitewide_a.pdf

 (See item Performance Measure Term Definitions, for more information and references.)

Source data collection frequency: On a regular basis with no set schedule, as the data are entered real-time. Varies by site.

Spatial Extent: National

Spatial detail: Site, defined in database by latitude/longitude pair.
2c. Source Data Reporting
 Collection mode varies, typically with multiple collection modes at each site. Collection typically involves some combination of
environmental data collection, estimation and/or tabulation of records/activities. Documents such as risk assessments, RODs, Action
Memoranda, POLREPS, RA Reports, Close-out Reports, Five-year Reviews, NPL Deletion/Partial Deletion Notices are known reliable
sources of data and often provide the information necessary for making a SWRAU evaluation with reasonable certainty. Regions should
also ensure consistency between the SWRAU determination, the All ICs Implemented indicator, and the HEUC environmental indicator.


SWRAU source data are entered in CERCLIS on a regular basis with no set schedule, as the data are entered real-time. However, status at
a site is reviewed annually by the 10th working day in October, or at any time site conditions change. CERCLIS is to be updated within 10
days of any change in status.

The information is entered in CERCLIS in the Land Reuse Module. This module contains screens for entering and defining acreage data
and the checklist, as well as setting the SWRAU status and applicable dates.

In addition to submitting information through CERCLIS, Regions are required to submit a Checklist to Headquarters documenting that the
site has met the measure.
3. Information Systems and Data Quality Procedures
3a.  Information Systems
The SWRAU determination is made directly in Comprehensive Environmental Response, Compensation, and Liability Information System
(CERCLIS) once it is determined that the site meets all required criteria and has been approved as such by appropriate regional personnel.
The CERCLIS data system meets all relevant EPA QA standards.

CERCLIS database  - CERCLIS is EPA's primary database to store and report data for NPL and non-NPL Superfund sites. The
Superfund Comprehensive Accomplishment Plan (SCAP) reports in CERCLIS are used to report progress on measures, including
SWRAU.

(For more information about CERCLIS, see Appendix E of the most recent SPIM, which is updated each fiscal year and contains
definitions and documentation/coding guidance for Superfund measures. The most current SPIM can be found here:
http://epa.gov/superfund/policy/guidance.htm.)

CERCLIS operation and further development is taking place under the following administrative control quality assurance procedures: 1)
Office of Environmental Information Interim Agency Life Cycle Management Policy Agency Directive; 2) the Office of Solid Waste and
Emergency Response (OSWER) Quality Management Plan (QMP); 3) EPA IT standards; 4) Quality Assurance Requirements in all
contract vehicles under which CERCLIS is being developed and maintained; and 5) EPA IT security policies.  In addition, specific controls
are in place for system design, data conversion and data capture, as well as CERCLIS outputs.
 19-0585 (CERCLIS QAPP) 2009-0410.doc

CERCLIS adherence to the security policy has been audited. Audit findings are attached to this record.
 CERCLIS July 9 2008 scan High Medium Response.xls; OSWER QMP printed 2010-03-23.pdf

OSWER Performance Assessment Tool (PAT). This tool serves as the primary external servicing resource for organizing and reporting OSWER's performance data; it collects information from OSWER program systems and conforms it for uniform reporting and data provisioning. PAT captures data from CERCLIS; replicates business logic used by CERCLIS for calculating measures; delivers that data to EPA staff and managers via a business intelligence dashboard interface for analytic and reporting use; and transmits data to the Budget Automated System (BAS). No system specifications document is currently available for PAT, but one will be provided when available. For this measure, PAT transmits Regional-level data to BAS.

PAT operates under the OSWER QMP. PAT has a security certification confirming that a security policy is not necessary because no sensitive data are handled and PAT is built upon the Oracle-based business intelligence system. PAT's security certification indicates that it follows all security guidelines for EPA's Oracle Portal and that PAT (1) is not defined as a "Major Application" according to NIST Special Publication 800-18, Guide for Developing Security Plans for Information Technology Systems, section 2.3.1; (2) does not store, process, or transmit information whose degree of sensitivity is assessed as high, considering the requirements for availability, integrity, and confidentiality according to NIST Special Publication 800-18, section 3.7.2; and (3) is not covered by EPA Order 2100.2A1, Information Technology Capital Planning and Investment Control (CPIC).

EPA Headquarters is now scoping the requirements for an integrated Superfund Document Management System (SDMS)-CERCLIS system, called the Superfund Enterprise Management System (SEMS). Development work on SEMS began in FY 2007 and will continue through FY 2013.

SEMS represents further re-engineering of the national reporting systems to include additional elements of EPA's Enterprise Architecture.
SEMS will  provide a common platform for major Superfund systems and future IT development. It will be constructed in part using EPA

IT enterprise architecture principles and components. SEMS will provide a Superfund Program user gateway to various IT systems and information collections.
3b. Data Quality Procedures
CERCLIS: To ensure data accuracy and control, the following administrative controls are in place: 1) SPIM, the program management manual that details what data must be reported; 2) Report Specifications, which are published for each report detailing how reported data are calculated; 3) Coding Guide, which contains technical instructions to data users including Regional Information Management Coordinators (IMCs), program personnel, data owners, and data entry personnel; 4) Quick Reference Guides (QRG), which are available in the CERCLIS Documents Database and provide detailed instructions on data entry for nearly every module in CERCLIS; 5) SCAP Reports within CERCLIS, which serve as a means to track, budget, plan, and evaluate progress towards meeting Superfund targets and measures; 6) a historical lockout feature in CERCLIS, so that past fiscal year data can be changed only by approved and designated personnel and all changes are logged to a Change Log report; 7) the OSWER QMP; and 8) Regional Data Entry Control Plans.

EPA Headquarters has developed data quality audit reports and Standard Operating Procedures, which address timeliness, completeness, and accuracy, and has provided these reports to the Regions. In addition, as required by the Office of Management and Budget (OMB), CERCLIS audit logs are reviewed monthly. The system was also re-engineered to bring CERCLIS into alignment with the Agency's mandated Enterprise Architecture. The first steps in this effort involved the migration of all 10 Regional databases and the Headquarters database into one single national database at the National Computing Center in Research Triangle Park (RTP), and the migration of SDMS to RTP to improve efficiency and storage capacity. During this process SDMS was linked to CERCLIS, which enabled users to easily transition between programmatic accomplishments as reported in CERCLIS and the actual documents that define and describe the accomplishments.


Regional Data Entry Control Plans.  Regions have established and published Data Entry Control Plans, which are  a key component of
CERCLIS verification/validation procedures. The control plans include: (1) regional policies and procedures for entering data into
CERCLIS, (2) a review process to ensure that all Superfund accomplishments are supported by source documentation, (3) delegation of
authorities for approval of data input into CERCLIS,  and (4) procedures to ensure that reported accomplishments meet accomplishment
definitions. In addition, regions document in their control plans the roles and responsibilities of key regional employees responsible for
CERCLIS data (e.g., regional project manager, information management coordinator, supervisor, etc.), and the processes to assure that
CERCLIS data are current,  complete, consistent, and accurate. Regions may undertake centralized or decentralized approaches to data
management. These plans are collected annually for review by OSRTI/IMB. [Source: SPIM FY11, III.J and Appendix E.
http://www.epa.gov/superfund/action/process/spim10/pdfs/appe.pdf] Copies of the 2010 Regional Data Entry Control Plans are provided
with this DQR. Current and past year plans are available by contacting the Chief, Information Management Branch, Office of Superfund
Remediation and Technology Innovation.

Regions are expected to prepare Data Entry Control Plans consistent with the SPIM and the Headquarters guidance:  "CERCLIS Data Entry
Control Plan Guidance," June 2009.
 2009_draft_CERCLIS_DECP_guidance_6-5-09_.pdf

In addition to entering information into CERCLIS, Regions must also submit a Checklist for Reporting the SWRAU Government
Performance and Results Act (GPRA) Measure to the headquarters data sponsor. The information provided in the Checklist can be used by
Headquarters to ensure sites nominated for the measure conform to the  same national best practices.

SWRAU is unique in that sites can be retracted from the national total in the event they no longer meet the criteria. Regions that retract a site must submit a retraction form to Headquarters explaining the reason for the retraction. For more information, see Appendix A of the Guidance at: http://www.epa.gov/fedfac/sf_ff_final_cprm_guidance.pdf.

Superfund Program Implementation Manual  (SPIM). The SPIM should be the first source referred to for additional questions related
to program data and reporting. The SPIM is a planning document that defines program management priorities, procedures, and practices
for the Superfund program (including response, enforcement, and Federal facilities). The SPIM provides the link between the GPRA,
EPA's Strategic Plan, and the Superfund program's internal processes for setting priorities, meeting program goals, and tracking
performance. It establishes the process to track overall program progress through program targets and measures.

The SPIM provides standardized and common definitions for the Superfund program, and it is part of EPA's internal control structure. As
required by the Comptroller General of the United States, through generally accepted accounting principles (GAAP) and auditing
standards, this document defines program scope and schedule in relation to budget, and is used for audits and inspections by the
Government Accountability Office (GAO) and the Office of the Inspector General (OIG). The SPIM is developed on an annual basis.
Revisions to the SPIM are issued during the annual cycle as needed.

The SPIM contains three chapters and a number of appendices. Chapter 1 provides a brief summary of the Superfund program and
summarizes key program priorities and initiatives. Chapter 2 describes the budget process and financial management requirements. Chapter
3 describes program planning and reporting requirements and processes. Appendices A through I highlight program priorities and
initiatives and provide detailed programmatic information, including Annual Targets for GPRA performance measures, and targets for
Programmatic Measures. [Source: SPIM 2011, Chapter I]

The most current version of the SPIM can be found at: http://epa.gov/superfund/policy/guidance.htm

Data Flow:

Step 1.  Original data sources provide information.

Step 2.  EPA Region reviews and determines SWRAU status at the site and adjusts CERCLIS records as needed.

Step 3.  Headquarters' OSRTI data sponsor reviews and approves/disapproves Regional determinations of SWRAU using data from
CERCLIS. Data sponsor works with Regional staff to ensure that determinations follow Superfund Program guidance.


Step 4.  The OSWER's PAT pulls data from CERCLIS. Headquarters staff compare PAT results to CERCLIS results. If PAT does not
match CERCLIS then there was an error with the upload and data are reloaded. Headquarters staff enter into PAT the Annual Commitment
System (ACS) status information for each measure and, if necessary, a status explanation.

Step 5.  Headquarters approves PAT results, and PAT pushes results into BAS.

Step 6.  BAS  aggregates Regional data into a national total.  OSRTI reporting lead reviews and certifies results.	
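The consistency check in Step 4 reduces to comparing two counts and reloading on a mismatch. The sketch below is illustrative only; the function names are placeholders, and neither PAT nor CERCLIS exposes a Python interface like this.

    # Illustrative sketch of the Step 4 check: PAT results must match
    # CERCLIS before results are approved and pushed to BAS.
    def verify_pat_upload(pat_count, cerclis_count, reload_from_cerclis):
        while pat_count != cerclis_count:
            # A mismatch indicates an upload error, so the data are reloaded.
            pat_count = reload_from_cerclis()
        return pat_count  # matching results can then be approved and pushed to BAS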
3c. Data Oversight
The Superfund program has a "data sponsorship" approach to database oversight. Headquarters staff and managers take an active role in improving the quality of data stored in CERCLIS by acting as data sponsors. The data sponsor for SWRAU, or "Land Ready for Reuse," is identified in Appendix E of the SPIM: http://epa.gov/superfund/action/process/spim11/pdfs/Appendix_E_SPIM_2011_FINAL.pdf

Specific roles and responsibilities of data sponsors:
•       Identify data needs;
•       Oversee the process of entering data into the system;
•       Determine the adequacy of data for reporting purposes;
•       Conduct focus studies of data entered (in a focus study, a data sponsor raises a potential or existing data issue with a data owner (see below), IMC, or other responsible person to determine whether a data quality problem exists, and to solve the problem, if applicable; focus studies can be conducted informally via electronic messages);
•       Provide definitions for data elements;
•       Promote consistency across the Superfund program;
•       Initiate changes in CERCLIS as the program changes;
•       Provide guidance requiring submittal of these data;
•       Support the development of requirements for electronic data submission; and
•       Ensure there is "objective" evidence to support the accomplishment data entered in CERCLIS by identifying data requirements and checking compliance through periodic reviews of a random CERCLIS data sample. [Source: SPIM 2011, III.E and E.A.5]

Measure-specific data sponsor information:

The Headquarters CERCLIS users responsible for the QA/QC of SWRAU data have primary source knowledge of program and site data
used in the Superfund Program Implementation Manual (SPIM) and in CERCLIS. The data sponsor is responsible for:
•       ensuring that the correct data enters the system on a real-time basis, as the program/site plans and accomplishments change.
•       assuring procedures for determining that a site's SWRAU eligibility has been accomplished.
•       flipping the special initiative  flag in CERCLIS once a site is determined to be SWRAU, and running audit and confirmatory reports
from CERCLIS to ensure the information is accurate and up to date.

The Project Manager for CERCLIS oversees and is the approving authority for quality-related CERCLIS processes, and is closely
supported by a Contract Task Manager.  (See the CERCLIS QAPP, attached, for more information.) The lead point of contact for
information about the data from CERCLIS is the Director, Information Management and Data Quality Staff, Office of Solid Waste and
Emergency Response.

Information Management Coordinators  (IMCs).   In each Region, the IMC is a senior position which serves as regional lead for all
Superfund program and CERCLIS systems management activities. The following lead responsibilities for regional program planning and
management rest with the IMC:
•       Coordinate program planning, budget development, and reporting activities;
•       Ensure regional planning and accomplishments are complete, current, and consistent, and accurately reflected in CERCLIS by
working with data sponsors and data owners;
•       Provide liaison to HQ on SCAP process and program evaluation issues;
•       Coordinate regional evaluations by headquarters;
•       Ensure that the quality of CERCLIS data are such that accomplishments and planning data can be accurately retrieved from the
system;  and
•       Ensure there is "objective" evidence to support accomplishment data entered in CERCLIS. (Objective Evidence Rule: "All
transactions must be supported by objective evidence, that is, documentation that a third party could examine and arrive at the same
conclusion.") [Source: SPIM 2011, III.E]

The primary responsibilities of data owners  are (1) to enter and maintain data in CERCLIS and (2) assume responsibility for complete,
current,  consistent, and accurate data. The data owners for specific data are clearly identified in the system audit tables.  Regions annually
update region-specific Data Entry Control Plans (DECP).  Among other things, Regional data entry control plans identify which Data
Sponsors/Data Owners are responsible for different aspects of data entry. (See item 2e, Regions Have Standard Operating Procedures, for
more information on Data Entry Control Plans.)


PAT Data Entry

The Annual Commitment System (ACS) Coordinator in OSRTI ensures that CERCLIS data for this measure are correctly loaded into PAT.
The ACS Coordinator then works with the data sponsor to review uploaded data, edit records as appropriate, and then push data to
ACS—part of the Office of Chief Financial Officer's (OCFO) BAS. PAT is maintained by OSWER's System Manager who ensures that the
PAT system operates correctly, based on business logic agreed to by OSRTI.
3d. Calculation Methodology
The performance measure is a specific variable entered into CERCLIS following specific coding guidance and corresponding supporting
site-specific documentation.

The unit of measure is number of sites. The calculation only includes NPL sites.

References:

Superfund Data Element Dictionary. The Superfund Data Element Dictionary (DED) is available online at:
http://www.epa.gov/superfund/sites/ded/index.htm.  The DED provides definitions and descriptions of elements, tables and codes from the
Comprehensive Environmental Response, Compensation and Liability Information System (CERCLIS) database used by the Superfund
program. It also provides additional technical information for each entry, such as data type, field length  and primary table. Using the DED,
you can look up terms by table name or element name, or search the entire dictionary by keyword.

Other additional references that may be useful:

Coding Guide. The Superfund Coding Guide contains technical instructions to data users including Regional Information Management Coordinators (IMCs), program personnel, data owners, and data entry personnel. The Remedial component of the Coding Guide is attached to this record.
 Coding Guide - 2009.pdf

Quick Reference Guides (QRG).  Superfund Quick Reference Guides are available in the CERCLIS Documents Database and provide
detailed instructions on data entry for nearly every module in CERCLIS.  Sample QRGs are available for entering data related to Remedial
Action Starts.
 Example QRG RA Start.doc

Site Status and Description document: this is a Quick Reference Guide
for CERCLIS users, for filling in information related to site status and
description.
 Site Status and Description.doc

4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Oversight is provided by the Data Sponsor for SWRAU (Land Ready for Reuse), the Annual Commitment System (ACS) coordinator, and National Program Office (NPO) management.
Progress is checked quarterly, while official numbers are reported annually.
4b. Data Limitations/Qualifications
Sites that meet the SWRAU performance measure must also meet one of two Human Exposure Under Control (HEUC) performance
measures:

• "Current Human Exposure Controlled and Protective Remedy in Place"; or
• "Long-Term Human Health Protection Achieved"

Specific to S10, if the HEUC changes to something other than the two environmental indicators, and will not be restored by the end of the
fiscal year, Regions must retract a site from the SWRAU national total, and that retraction counts against the national target. Retractions are
made when Regions determine that an entire site no longer meets the SWRAU criteria. Because SWRAU is counted as a "net" number on a
yearly basis, a retraction is subtracted from that year's total number of SWRAU sites. If a Region retracts a site, the Region must still
achieve its net SWRAU goal. For example, if a Region has a goal of achieving a SWRAU designation for 10 sites, but retracts two that
year, the Region must identify 12 sites to meet the SWRAU target that year (12 - 2 = 10).
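In code form, the net-counting rule in this example works out as follows (a sketch of the arithmetic only, not a Superfund system calculation):

    # Sketch of the "net" SWRAU counting rule: each retraction subtracts from
    # the year's total, so it raises the number of new designations needed.
    def designations_needed(net_goal, retractions):
        return net_goal + retractions

    print(designations_needed(net_goal=10, retractions=2))  # 12, since 12 - 2 = 10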

Based on experience dealing with sites to date, there are three categories of sites that may not meet the SWRAU measure. Even sites that
on the surface appear to not meet the measure may have circumstances that require additional consideration.

•      Ground water only sites
•      Sites with overly restrictive ICs
•      Sites that cannot support any use.

NPL sites that have been addressed by state programs for cleanup should NOT be counted as SWRAU if the actions taken are not
documented  in an EPA CERCLA decision document.	
4c. Third-Party Audits
For CERCLIS data: The GAO report, Superfund: Information on the Status of Sites (GAO/RCED-98-241), dated August 28, 1998, estimated that the cleanup status of National Priorities List (NPL) sites reported by CERCLIS as of September 30, 1997, was accurate for 95 percent of the sites (www.gao.gov/archive/1998/rc98241.pdf). An OIG audit, Information Technology - Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) Data Quality (Report No. 2002-P-00016), dated September 30, 2002, evaluated the accuracy, completeness, timeliness, and consistency of the data entered into CERCLIS. (See http://www.epa.gov/oig/reports/2002/cerlcis.pdf.) The report provided 11 recommendations to improve controls for CERCLIS data

quality. EPA has either implemented or continues to implement these recommendations.

The IG also annually reviews the end-of-year CERCLIS data, in an informal process, to verify the data that support the performance measures. Typically, there are no published results.

Annual EPA Office of Inspector General Audit/Report. The EPA OIG provides an annual report to Congress of the results of its audits
of the Superfund Program. Those reports are available at: http://www.epa.gov/oig/reports/annual.htm. The most recently available report
is the FY 2009 report. In that report, EPA received an unqualified audit opinion by the OIG for the annual financial statements, although
three material weaknesses and eight significant deficiencies were also identified.  The OIG recommended several corrective actions. The
Office of the Chief Financial Officer indicated in November 2009 that corrective actions will be taken.
 Record Last Updated: 02/13/2012 01:17:01 PM
-------
  Goal 3                                              Objective 4                                         Measure 5PQ
  Measure Code : 5PQ - Percent of Tribes implementing federal regulatory
  environmental programs in Indian country (cumulative).
  Office of Indian and Tribal Affairs (OITA)
   1. Measure and DQR Metadata
   Goal Number and Title
3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title
4 - Strengthen Human Health and Environmental Protection in Indian Country
   Sub-Objective Number and Title
1 - Improve Human Health and the Environment in Indian Country
   Strategic Target Code and Title
1 - By 2015, increase the percent of tribes implementing federal regulatory environmental programs
   Managing Office
  AIEO
   Performance Measure Term Definitions
The performance measure tracks the number of "Treatment in a manner similar to a State" (TAS) program approvals or primacies and
execution of "Direct Implementation Tribal Cooperative Agreements (DITCAs)."


TAS status grants a tribe eligibility to implement and administer the environmental statutes for a program within the tribe's boundaries
comparable to the way States implement and administer the statutes outside of Indian country.

DITCAs are agreements negotiated between EPA and federally-recognized tribes and eligible intertribal consortia that enable the tribes to
conduct agreed-upon activities and to help EPA implement federal environmental programs in Indian country in the absence of an
acceptable tribal program.

The measure is based on a count of tribes, and a given tribe may have more than one TAS program, and may have DITCAs as well.
Because of the tribes with multiple qualifying programs, the total number of TAS designations plus DITCAs in Indian country is higher
than the number of tribes with regulatory environmental programs as reported for this measure.
This measure represents progression toward the goal of improving human health and the environment in Indian country by helping tribes
plan, develop and establish environmental protection programs.
 2. Data Definition and Source Reporting
 2a. Original Data Source
 Regions and Tribes
 2b. Source Data Collection
Data for the TPMS are input on an ongoing basis by Regional tribal programs and EPA headquarters.	
2c. Source Data Reporting
Reports are in the form of tables with measures in the columns and years in the rows, so the years can be compared. Data are input manually by regional Tribal Program Management System (TPMS) team members. The data are reported by the Regions into TPMS at mid-year and at the end of the year.


3.  Information  Systems  and Data Quality Procedures	
3a. Information Systems

The Tribal Program Management System (TPMS) is a secure database that holds the performance information: http://www.epa.gov/tribalportal/

The information is entered into standard query fields in the data system. Thus, there is no allowance for differences in reporting across EPA's Regional offices, and national reports can be assembled in a common framework. The assumption is that the authorized person who enters the data is knowledgeable about the performance status of the tribe.

A Quality Management Plan (QMP) is being drafted by a contractor.
3b. Data Quality Procedures
Standard Operating Procedures are detailed in the Data Definitions document. Each Regional Administrator with tribal activity in his or her region is the EPA official who certifies information in TPMS prior to submission to the EPA Headquarters American Indian Environmental Office (AIEO). However, in some cases the Regional Administrator may delegate the signatory authority to another official, such as the Regional Indian Coordinator. This procedure generally follows guidance provided in EPA Information Quality Guidelines. (See http://www.epa.gov/quality/informationguidelines/ for more information.)


Additionally, the data in TPMS are extracted by the regional TPMS team twice a year,  and delivered by spreadsheet to the Regional TPMS
Project Officers for review and verification.
TPMS Data Definitions.doc
3c. Data Oversight
Regional Indian Coordinators certify data and submit them to AIEO.
3d. Calculation Methodology
The calculation methodology is to count the active tribes with a DITCA or TAS in a fiscal year. TAS approvals do not have expiration dates and are cumulative.
Calculation of Percentages: The denominator used to calculate the percentage is 572, which reflects the tribal bands that are independent
entities and are eligible for EPA funding.
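A minimal sketch of this count and percentage calculation, assuming a hypothetical (tribe_id, program_type, is_active) record layout
rather than the actual TPMS schema:

    # Count tribes with an active TAS or DITCA, de-duplicating tribes that
    # hold multiple qualifying programs, then express the count against the
    # 572 tribal bands eligible for EPA funding.
    def percent_tribes_with_tas_or_ditca(records, eligible_tribes=572):
        tribes = {tribe_id for tribe_id, program_type, is_active in records
                  if is_active and program_type in ("TAS", "DITCA")}
        count = len(tribes)
        return count, 100.0 * count / eligible_tribes

For FY 2011, for example, the 57 tribes with TAS and 13 tribes with DITCA de-duplicate to the 67 tribes reported below.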

Because some tribes have multiple qualifying programs, the total number of TAS designations plus DITCAs in Indian country is higher
than the number of tribes with regulatory environmental programs reported for this measure.

Percent of Tribes implementing federal regulatory environmental programs in Indian country:
[Screenshot: TPMS report - Measure 2, Percent of Tribes Implementing Federal Regulatory Environmental Programs in Indian Country]

Fiscal Year    Tribes with TAS    Tribes with DITCA    Tribes with TAS or DITCA or both
2004           37                 5                    41
2005           41                 10                   49
2006           45                 9                    52
2007           47                 26                   67
2008           54                 29                   76
2009           56                 17                   68
2010           57                 23                   73
2011           57                 13                   67
2012           -                  -                    -
TPMS Data Definitions.doc
4.  Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The procedures for collecting and reporting on the Goal 3, Objective 4 performance measures require that Regional program managers
certify the accuracy of the data submitted by the regions to AIEO. This certification procedure is consistent with EPA Information Quality
Guidelines and is verified by AIEO personnel.
4b. Data Limitations/Qualifications	

Because data are input by EPA's Regional  Project Officers on an ongoing basis, there may be a time lag between when a tribal program
status has been achieved and when the data are entered into the TPMS.

For the TPMS, errors could occur by mis-entering data or neglecting to enter data.  However, the data from each region will be certified as
accurate at the end of each reporting cycle; error is estimated to be low, about 1-2 percent.	
4c. Third-Party Audits

Not Applicable
 Record Last Updated: 02/13/2012 01:16:48 PM
  Goal 3                                              Objective 4                                        Measure 5PR
  Measure Code : 5PR - Percent of Tribes conducting EPA-approved environmental
  monitoring and assessment activities in Indian country (cumulative).
  Office of Indian and Tribal Affairs (OITA)
   1. Measure and DQR Metadata
   Goal Number and Title
3 - Cleaning Up Communities and Advancing Sustainable Development
   Objective Number and Title
4 - Strengthen Human Health and Environmental Protection in Indian Country
   Sub-Objective Number and Title
1 - Improve Human Health and the Environment in Indian Country
   Strategic Target Code and Title
2 - By 2015, increase the percent of tribes conducting EPA-approved environmental monitoring
   Managing Office
  AIEO
   Performance Measure Term Definitions
 A tribe is a governmental entity that is recognized by the federal government and eligible to receive federal funding.

The performance measure reports the number of active Quality Assurance Project Plans (QAPPs) for monitoring activities that have been
approved by Regional Quality Assurance Officers. All ongoing environmental monitoring programs are required to have active QAPPs,
which are used as a surrogate for the monitoring activities that occur in Indian country.

However, tribes often have more than one QAPP, so the count of total QAPPs is always higher than the number of tribes that have QAPPs
as reported for this measure.


Environmental monitoring and assessment activities are those that take biological, chemical, or physical measurements.

EPA-approved indicates a required QAPP for the activity has been approved by Regional Quality Assurance Officers.

Active QAPPs are those that have not expired.

This measure represents progress toward the goal of improving human health and the environment in Indian country by helping tribes
plan, develop, and establish environmental protection programs.
 2. Data  Definition and Source Reporting
 2a. Original Data Source
Regional Quality Assurance Officers	
2b. Source Data Collection
 Regional tribal program liaisons obtain information from Regional Quality Assurance Officers.


Spatial Detail: Base unit is a tribe. Geographic coverage is national.
2c. Source Data Reporting
Reports are in the form of tables with measures in the columns and years in the rows. The years can be compared. Data are input manually
by regional Tribal Program Management System (TPMS) team members. The data are reported by the Regions into TPMS at mid-year and
at the end of the year.	


3.  Information Systems  and Data  Quality Procedures
3a. Information Systems

The Tribal Program Management System (TPMS) is a secure database that holds the performance information.

http://www.epa.gov/tribalportal/

The information is  entered into standard query fields in the data system. Thus, there is no allowance for differences in reporting across
EPA's Regional offices, and national reports can be assembled in a common framework.  The assumption is that the authorized person who
enters the data is knowledgeable about the performance status of the tribe.

A Quality Management Plan (QMP) is being drafted by a contractor.
3b. Data Quality Procedures

Standard Operating Procedures are detailed in the Data Definitions document. Each Regional Administrator whose region has tribal
activity is the EPA official who certifies information in TPMS prior to submission to the EPA Headquarters American Indian
Environmental Office (AIEO). However, in some cases the Regional Administrator may delegate signatory authority to another official,
such as the Regional Indian Coordinator. This procedure generally follows guidance provided in EPA Information Quality Guidelines.
(See http://www.epa.gov/quality/informationguidelines/ for more information.)


Additionally, the data in TPMS are extracted by the regional TPMS team twice a year, and delivered by spreadsheet to the Regional TPMS
Project Officers for review and verification.
TPMS Data Definitions.doc
3c. Data Oversight
Regional Indian Coordinators
3d. Calculation Methodology

Each row in the report is a fiscal year. Calculation methodology: Count the number of tribes with at least one active QAPP in a fiscal year.
A tribe is counted once even if it has more than one QAPP.

Calculation of Percentages: The denominator used to calculate the percentage is 572, which reflects the tribal bands that are independent
entities and are eligible for EPA funding.
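A minimal sketch of this count, assuming hypothetical (tribe_id, expiration_date) pairs; the actual TPMS schema is not documented in
this DQR:

    # A tribe is counted once even if it has several active (unexpired) QAPPs.
    def tribes_with_active_qapps(qapps, as_of):
        return len({tribe_id for tribe_id, expires in qapps if expires >= as_of})

    # Matches the report note below: (Total Tribes / 572) x 100.
    def percent_of_tribes(count, eligible_tribes=572):
        return 100.0 * count / eligible_tribes

For FY 2011, percent_of_tribes(193) returns approximately 33.74, matching the reported value.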
[Screenshot: TPMS report - Performance and Accountability Report (PAR) (as of 4/20/2011) - Measure 3, Percent of Tribes Conducting
EPA-Approved Environmental Monitoring and Assessment Activities in Indian Country]

Fiscal Year    Total Tribes with QAPPs    % of Total Tribes that have Monitoring Activities
2004           294                        51.40%
2005           285                        49.83%
2006           280                        48.95%
2007           274                        47.90%
2008           255                        44.58%
2009           242                        42.31%
2010           227                        39.69%
2011           193                        33.74%
2012           -                          -

* Data for the present fiscal year may not be complete.

Notes:
1. QAPPs are used as indicators that tribes are conducting EPA-approved environmental monitoring and assessment activities.
2. Percent of tribes is calculated as: (Total Tribes / 572) x 100.

Abbreviations Used in Columns: QAPP - Quality Assurance Project Plan
TPMS Data Definitions.doc
4. Reporting and  Oversight
4a. Oversight and Timing of Results Reporting
The procedures for collecting and reporting on the Goal 3, Objective 4 performance measures require that Regional program managers
certify the accuracy of the data submitted by the regions to AIEO. This certification procedure is consistent with EPA Information Quality
Guidelines and is verified by AIEO personnel.
4b. Data Limitations/Qualifications

Because data are input by EPA's Regional Project Officers on an ongoing basis, there may be a time lag between when a tribal program
status has been achieved and when the data are entered into the TPMS.

For the TPMS, errors could occur by mis-entering data or neglecting to enter data.  However, the data from each region will be certified as
accurate at the end of each reporting cycle; error is estimated to be low, about 1-2 percent.	
4c. Third-Party Audits

Not applicable
 Record Last Updated: 02/13/2012 01:16:48 PM
 Goal 4                                         No Associated Objective                                   Measure CS1
 Measure Code : CS1 - Percentage of planned research products completed on time
 by the Chemical Safety for Sustainability research program.
 Office of Research and Development (ORD)
   1. Measure and DQR Metadata
   Goal Number and Title                          4 - Ensuring the Safety of Chemicals and Preventing Pollution
   Objective Number and Title
   Sub-Objective Number and Title	
   Strategic Target Code and Title	
   Managing Office                                Office of Program Accountability and Resource Management- Planning
   Performance Measure Term Definitions
A research product is "a deliverable that results from a specific research project or task. Research products may require translation or
synthesis before integration into an output ready for partner use."

 This secondary performance measure tracks the timely completion of research products.

Sustainability Research Strategy, available from:
http://epa.gov/sciencematters/april2011/truenorth.htm

http://www.epa.gov/risk_assessment/health-risk.htm


 2. Data Definition and Source Reporting	
 2a. Original Data Source
 EPA and its partners confirm the schedule for completing research outputs and products that are transformed or synthesized into outputs.
 ORD tracks progress toward delivering the outputs; clients are notified of progress.  Scheduled milestones are compared to actual progress
 on a quarterly basis. At the end of the  fiscal year, outputs are either classified as "met" or "not met" to determine the overall percentage of
 planned products that have been met by the research program. The actual product completion date is self-reported.	
 2b. Source Data Collection	

 Each output is assigned to a Lab or Center representative before the start of the fiscal year.  This individual provides quarterly status
 updates via ORD's Resource Management System. Status reports are reviewed by senior management, including the Lab or Center
 Director and National Program Director. Overall status data is generated and reviewed by ORD's Office of Program Accountability and
 Resource Management.
2c. Source Data Reporting
Quarterly status updates are provided via ORD's Resource Management System.
3.  Information Systems and Data  Quality Procedures
3a.  Information Systems
Internal database or internal tracking system such as the Resources Management System (RMS).	
3b. Data Quality Procedures	
EPA and its partners confirm the schedule for completing research outputs and products that are transformed or synthesized into outputs.
ORD tracks progress toward delivering the outputs; clients are notified of progress. Scheduled milestones are compared to actual progress
on a quarterly basis. At the end of the fiscal year, outputs are either classified as "met" or "not met" to determine the overall percentage of
planned products that have been met by the program.	
3c. Data Oversight

The National Program Director oversees the source data reporting, specifically, the process of establishing agreement with program
stakeholders and senior ORD managers on the list and content of the planned products, and subsequent progress, completion, and delivery
of these products.	
3d. Calculation Methodology	
At the end of the fiscal year, outputs are classified as either "met" or "not met". An overall percentage of planned products met by the
program is reported.
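A minimal sketch of this roll-up, assuming one "met"/"not met" string per planned output; RMS's internal representation is not
documented here:

    # Percentage of planned outputs classified as "met" at year end.
    def percent_products_met(statuses):
        met = sum(1 for s in statuses if s == "met")
        return 100.0 * met / len(statuses) if statuses else 0.0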
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The Office of Program Accountability and Resource Management is responsible for reporting program progress in meeting its target of
completing 100% of planned products.

4b. Data Limitations/Qualifications	
This measure does not directly capture the quality or impact of the research products.
4c. Third-Party Audits

Not applicable
 Record Last Updated: 02/13/2012 01:16:50 PM
 Goal 4                                         No Associated Objective                                   Measure HS1
 Measure Code : HS1 - Percentage of planned research products completed  on time
 by the Homeland Security research program.
 Office of Research and Development (ORD)
   1. Measure and DQR Metadata
   Goal Number and Title                          4 - Ensuring the Safety of Chemicals and Preventing Pollution
   Objective Number and Title
   Sub-Objective Number and Title	
   Strategic Target Code and Title	
   Managing Office                                Office of Program Accountability and Resource Management- Planning
   Performance Measure Term Definitions
A research product is "a deliverable that results from a specific research project or task.  Research products may require translation or
synthesis before integration into an output ready for partner use."

 This secondary performance measure tracks the timely completion of research products.

Sustainability Research Strategy, available from:
http://epa.gov/sciencematters/april2011/truenorth.htm

http://www.epa.gov/risk_assessment/health-risk.htm


 2. Data Definition and Source Reporting	
 2a. Original Data Source
 EPA and its partners confirm the schedule for completing research outputs and products that are transformed or synthesized into outputs.
 ORD tracks progress toward delivering the outputs; clients are notified of progress.  Scheduled milestones are compared to actual progress
 on a quarterly basis. At the end of the fiscal year, outputs are either classified as "met" or "not met" to determine the overall percentage of
 planned products that have been met by the research program. The actual product completion date is self-reported.	
 2b. Source Data Collection	

 Each output is assigned to a Lab or Center representative before the start of the fiscal year.  This individual provides quarterly status
 updates via ORD's Resource Management System. Status reports are reviewed by senior management, including the Lab or Center
 Director and National Program Director. Overall status data is generated and reviewed by ORD's Office of Program Accountability and
 Resource Management.
2c. Source Data Reporting
Quarterly status updates are provided via ORD's Resource Management System.
3.  Information Systems and Data  Quality Procedures
3a.  Information Systems
Internal database or internal tracking system such as the Resources Management System (RMS).	
3b. Data Quality Procedures	
EPA and its partners confirm the schedule for completing research outputs and products that are transformed or synthesized into outputs.
ORD tracks progress toward delivering the outputs; clients are notified of progress. Scheduled milestones are compared to actual progress
on a quarterly basis. At the end of the fiscal year, outputs are either classified as "met" or "not met" to determine the overall percentage of
planned products that have been met by the program.	
3c. Data Oversight

The National Program Director oversees the source data reporting, specifically, the process of establishing agreement with program
stakeholders and senior ORD managers on the list and content of the planned products, and subsequent progress, completion, and delivery
of these products.	
3d. Calculation Methodology	
At the end of the fiscal year, outputs are classified as either "met" or "not met". An overall percentage of planned products met by the
program is reported.
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The Office of Program Accountability and Resource Management is responsible for reporting program progress in meeting its target of
completing 100% of planned products.

4b. Data Limitations/Qualifications	
This measure does not directly capture the quality or impact of the research products.
4c. Third-Party Audits

Not applicable
 Record Last Updated: 02/13/2012 01:16:50 PM
  Goal 4                                               Objective 1                                          Measure 009
  Measure Code : 009 - Cumulative number of certified Renovation Repair and
  Painting firms
  Office of Chemical Strategies and Pollution Prevention (OCSPP)
1. Measure and DQR Metadata
Goal Number and Title                          4 - Ensuring the Safety of Chemicals and Preventing Pollution
Objective Number and Title                     1 - Ensure Chemical Safety
Sub-Objective Number and Title                 1 - Protect Human Health from Chemical Risks
Strategic Target Code and Title                2 - By 2014, reduce the percentage of children with blood lead levels above 5ug/dl to 1.0 percent or less
Managing Office                                Office of Pollution Prevention and Toxics
Performance Measure Term Definitions
Cumulative number: Number since October 1, 2009.
Certified Renovation Repair and Painting firms: "Renovation, Repair, and Painting" is generally defined as any activity that disturbs
paint in housing and child-occupied facilities built before 1978, including remodeling, repair, maintenance, electrical work, plumbing,
painting, carpentry and window replacement. Most minor repair and maintenance activities of less than six square feet per interior room or
20 square feet on the exterior of a home or building are exempt from the work practice requirements. However, this exemption does not
apply to window replacements, demolitions, or the use of prohibited practices.

Background:
On March 31, 2008, EPA issued a new rule (Renovation, Repair, and Painting Program Rule or RRP rule) aimed at protecting children from
lead-based paint hazards. The rule requires contractors and construction professionals that work in pre-1978 housing or child-occupied
facilities to follow lead-safe work practice standards to reduce potential exposure to dangerous levels of lead for children in places they
frequent. In October 2009, firms began to apply to EPA for certification to conduct renovations.  As of April 2010, renovations in target
(pre-1978) housing  and child-occupied facilities must be conducted by certified renovation firms, using renovators with accredited training,
and following the work practice requirements of the rule.
 2. Data Definition and Source Reporting
 2a. Original Data Source
 Firms seeking certification submit applications directly to EPA, enabling the Agency to track the number of certified firms through its
 Federal Lead-Based Paint Program (FLPP) database. In states that have received authorization from EPA to administer a Renovation
 Repair and Painting program in lieu of the Federal program, state grantees collect data on the number of state certified Renovation Repair
and Painting firms.	
2b. Source Data Collection
The original data source generally would be EPA itself since applicants do not collect data but merely submit certification applications.
Authorized states would be the original data sources in some cases.  Entry of application data into the FLPP database is handled by an EPA
contractor.

EPA collects data on the numbers of firms certified in each authorized state through quarterly reports from grantees as part of the Agency's
oversight of authorized programs.
2c. Source Data Reporting
Form/mechanism for receiving data and entering into EPA system: Firms seeking RRP certification submit applications in hard copy
directly to EPA. The data are entered into the FLPP database by an EPA contractor.  The original hard copies are retained to augment the
electronic records. Authorized states report data to EPA Regional Offices on the number of certified firms in the state.
Timing and frequency of reporting:  Application data are entered into the FLPP database continuously as applications to the Federal
Program are received.	


3.  Information Systems and Data Quality Procedures
3a. Information Systems
The Federal Lead-Based Paint Program (FLPP) database provides a record of all applications for the certification of Renovation Repair and
Painting firms where EPA directly implements the program, the actions taken on those applications including final decisions, and the
multiple steps in the process used for measurement.  Thus, the number of firms actually being certified can be obtained directly from the
database. EPA uses an Oracle Discoverer application to query the database to collect measurable performance data. Documentation for the
FLPP database is maintained internally at EPA and is available upon request.

The FLPP database is currently undergoing improvements to increase the processing efficiency and tracking for firm certifications.

3b. Data Quality Procedures
The database is interactive, and operational usage in processing applications by Headquarters and the Regional offices provides ongoing
internal quality reviews. Further, EPA periodically checks contractors' data entry quality.

OPPT has in place a signed Quality Management Plan ("Quality Management Plan for the Office of Pollution Prevention and Toxics;
Office of Prevention, Pesticides and Toxic  Substances", November 2008).  Like the 2003 QMP, it will ensure the standards and procedures
are  applied to this effort. In addition, NPCD has an approved Quality Management Plan in place, dated July 2008.  Applications and
instructions for applying for certification and accreditation are documented and available at the Web  site
http://www.epa.gov/lead/pubs/traincert.htm. Documentation for the FLPP database is maintained internally at EPA and is available upon
request.	
3c. Data Oversight	
Branch Chief, Planning and Assessment Branch
3d. Calculation Methodology
Since the measure simply tracks the number of firms certified to perform Lead RRP work, there is no need to transform the original data by
any mathematical methods.
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
 Planning and Accountability Lead in the Resource Management Staff in the Office of Program Management Operations. Reporting
semiannually: mid-year and end-of-year.	
4b. Data Limitations/Qualifications
Data are estimates from firm certification applications received either directly by EPA or through EPA authorized State programs and
reported to EPA Regional offices.

There is little or no sampling error in this performance measure because it is based on an evaluation of all applicable records for the Federal
program. Data on firms certified in each authorized state are collected as part of the Agency's oversight of authorized programs through
semi-annual reports from grantees.	
4c. Third-Party Audits
Not applicable.
 Record Last Updated: 02/13/2012 01:16:48 PM
  Goal 4                                                Objective 1                                            Measure 091
  Measure Code : 091 - Percent of decisions completed on time (on or before PRIA  or
  negotiated due date).
  Office of Chemical Strategies and Pollution Prevention (OCSPP)
   1.  Measure and DQR Metadata
   Goal Number and Title                           4 - Ensuring the Safety of Chemicals and Preventing Pollution
   Objective Number and Title	    1 - Ensure Chemical Safety
   Sub-Objective Number and Title
1 - Protect Human Health from Chemical Risks
Strategic Target Code and Title                    5 - By 2014, reduce concentration of targeted chemicals in children
   Managing Office                                   Office of Pesticide Programs
   Performance Measure Term Definitions
Decisions: Each action is assigned a decision number when it is received, and over time "actions" and "decisions" have come to mean
roughly the same thing. A decision may be an application to register a new pesticide product, to amend a registered product's label, to
review a protocol, to establish a tolerance, or to make a decision on a request to waive a study requirement.

Completed: An action or decision is completed when OPP makes a decision on the application, i.e., the product is registered, a label is
stamped, a protocol is reviewed, or the action is denied, the label is not approved, etc. A decision memorandum is issued describing the
decision made, and the date that the delegated official signs the memo is the date that the decision is completed. In the case of a label, the
date that the label is stamped as approved is the date that the application to register or amend a label is completed.

PRIA: The Pesticide Registration Improvement Act (PRIA) of 2003 established pesticide registration service fees  for registration actions.
The Pesticide Registration Improvement Renewal Act (PRIA 2), effective October 1, 2007, reauthorized the PRIA for five more years until
2012. The PRIA 2 legislation increased the number of actions covered by fees, modified the payment process and application in-processing.
The category of action, the amount of pesticide registration  service fee, and the corresponding decision review periods by year are
prescribed in these statutes. Their goal is to create a more predictable evaluation process for affected  pesticide decisions, and  couple the
collection of individual fees with specific decision review periods. They also promote shorter decision review periods for reduced-risk
applications.

On time (on or before PRIA or negotiated due date): Each PRIA 2 fee category has an associated  period of time in which the Agency
must make a determination, which has been called a decision review period or PRIA 2 timeframe, or "PRIA due date." The PRIA 2 due date
may be extended by a mutual agreement between the applicant and the Agency. The new due date is  called a negotiated due date.
Negotiated due dates occur predominantly as a result of missing information or data, or data deficiencies identified during an in-depth
review of the application. The due date is then extended to allow the applicant time to submit the data or information and the Agency time
to review the data and make a determination.

Background:
This measure is a program output which represents the program's statutory requirements to ensure that pesticides entering the marketplace
are safe for human health and the environment, and when used in accordance with the packaging label present a reasonable certainty of no
harm. In addition, under PRIA and PRIA 2, there are specific timelines, based on the type of registration action, by which the Agency must
make a decision. These laws do allow the decision due date under PRIA to be negotiated to a later date, after consultation with and
agreement by the submitter of the application. The timeliness measure represents the Agency's effectiveness in meeting these PRIA
timelines.

For more information, see
• http://www.epa.gov/pesticides/fees/
• FIFRA Sec 3(c)(5)
• FFDCA Sec 408(a)(2).
 2. Data  Definition and Source Reporting	
2a. Original Data Source
 EPA senior managers.
2b. Source Data Collection
Source Data Collection Methods: EPA senior managers review justifications and make final decisions on whether to extend or negotiate a
PRIA due date and whether or not to issue a "PRIA Determination to Not Grant" a registration. The Agency employs continuous
monitoring of the status of PRIA decisions. Numerous internal Agency meetings monitor workload and compliance with PRIA due dates.
 Throughout the pesticide registration program, weekly meetings are held to review the status of pending decisions, due date extensions, and
 refunds; to identify potential issues and target their resolution; to resolve fee category questions; and to coordinate schedules with science
 support organizations.

 EPA QA requirements/guidance governing collection: All risk assessments are subject to public and scientific peer review. All registration
 actions must employ sound science and meet the Food Quality Protection Act (FQPA) safety standards. The office adheres to its Quality
 Management Plan  (Nov. 2006) in ensuring data quality and that procedures are properly applied.	
 2c. Source Data Reporting
 All registration actions received under the PRIA and PRIA 2 are entered and tracked in the Pesticide Registration Information System
 (PRISM). Reports developed in Business Objects (using PRISM as the data source) allow senior management to more effectively track the
 workload (e.g., pending actions with upcoming PRIA due dates, actions for which the PRIA date appears to have passed etc.) and ensure
 that PRIA or negotiated due dates are met.

OPP uses several internal controls within the OPPIN/PRISM system. First of all, users must be FIFRA CBI cleared in order to access the
system. Within the system, security measures are taken to allow only authorized users to perform certain operations, which are managed
by our Database Administrator (DBA). For example, only Branch Chiefs can enter a negotiated due date in the Registration Division.
The DBA must receive an Access Form from users wanting to use the system and their supervisor must sign the Access Form.

Applications are pin punched upon receipt by a NOWCC in ISB/ITRMD/OPP and the pin punch date is entered into OPPIN by another
NOWCC in ISB. The pin punch date is the receipt date in OPPIN.   The EPA team leader performs periodic/random checks of their work.
Experts  from the three registering divisions review each application and place it in a PRIA fee category generally on the date of receipt.

PRIA 2 requires that certification of payment be submitted together with the application. Beginning January 2, 2008, ISB started to hold
any application that did not contain certification of payment.  ISB contacts the submitter to request certification of payment.  When the
certification is received, ISB generates an acknowledgement and sends it to the submitter. If no certification of payment is received within
14 days, ISB prepares a rejection letter for the Deputy Office Director's signature.  After the rejection letter is signed, ISB posts the
rejection to OPPIN, and invoices the submitter for 25% of the appropriate PRIA fee.

Any issues related to assigning a fee category are discussed with divisional management and may be elevated. If a full fee is being paid,
the date that begins the PRIA timeframe (the start date) is the later of 21 days after receipt of the application or the day payment is received
by the Washington Finance Center/OCFO. Staff in OCFO enter the amount and date of receipt of the payment into IFMS. OPP
downloads IFMS and electronically transfers the data into OPPIN.

Once the IFMS data is transferred to OPPIN, OPPIN automatically calculates due dates from the start date using the time frames in the FR
Notice on the fee schedule. Due dates can be extended through negotiations with the registrant or applicant.  Negotiated due dates are
manually entered and the rights to enter a negotiated due date belong to only branch chiefs, the Division Directors and other individuals
designated such rights by a Division Director. In BPPD, negotiated PRIA due dates are entered in OPPIN by the branch chiefs, branch
team leaders, or its Administrative Specialist while in RD, only a branch chief enters the date. According to OPP's procedures, a
negotiated due date cannot be entered into the system until the Deputy Office Director or Office Director approves the negotiated date by
signing the negotiated due date form. A copy of the negotiated due date form and documentation of the applicant's agreement with the due
date are filed.
Beginning July 2011, OPP transitioned to using Webforms for processing negotiated due date forms. Forms are routed, approved, and
retained electronically.
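The start-date and due-date arithmetic described above can be summarized in a short sketch; the review period for each PRIA fee category
comes from the Federal Register fee schedule and is passed in as a parameter:

    from datetime import timedelta

    # Start date is the later of 21 days after receipt or the day payment is
    # received by the Washington Finance Center/OCFO.
    def pria_start_date(receipt, payment_received):
        return max(receipt + timedelta(days=21), payment_received)

    # An approved negotiated due date supersedes the calculated due date.
    def pria_due_date(start, review_period_days, negotiated_due=None):
        return negotiated_due or start + timedelta(days=review_period_days)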

The date that an action is completed is entered by staff in RD, AD,  and BPPD according to their internal procedures.  Documentation  of the
date of completion is filed in the product's file. Once data is entered into OPPIN, start dates and due dates cannot be changed by staff in
the regulatory divisions. Changes are made by staff programming OPPIN in ITRMD. "Data fixes" must be requested by generating an SCR
(Systems Change Request). These requests are reviewed by ITRMD staff and management and representatives of the regulatory divisions.
Questions and issues are elevated to the PRIA Senior Advisor and if needed to OPP management. OPP management holds a Bi-weekly
PRIA meeting in which these issues are discussed and resolved. The OPP Immediate Office uses a number of monitoring reports to identify
actions that are past their due date or appear to have been logged out past their due date.  An issue is then resolved with the appropriate
division and generally involves an action that needs to be logged out as completed or a negotiated due date that needs to be entered.
OPPIN software issues have also been identified through this oversight effort and an SCR is developed to make the necessary programming
corrections.

Annually, the Office of the Inspector General conducts an audit that includes verifying the accurate entry of the date an action is received,
extended and completed.
3. Information Systems and Data Quality Procedures
3a.  Information Systems
All registration actions received under the PRIA and PRIA 2 are entered and tracked in the Pesticide Registration Information System
(PRISM).

The Office of Pesticide Programs (OPP) has migrated all of its major data systems including regulatory and scientific data, workflow
tracking and electronic document management into one integrated system, the Pesticide Registration Information System (PRISM). PRISM
provides a centralized source of information on all registered pesticide products, including chemical composition, toxicity, name and
address of registrant, brand names, registration actions, and related data. It is maintained by the EPA and tracks regulatory data submissions
and studies, organized by scientific discipline, which are submitted by the registrant in support of a pesticide's registration. All registration
actions received under the PRIA and PRIA 2 are entered and tracked in PRISM.

PRISM is the successor to the Office of Pesticide Programs Information System Network (OPPIN). Data has been migrated from the
following databases: Chemical Vocabulary (CV), Company Name and Address (CNAD), Pesticide Document Management System
(PDMS), Pesticide Product Information System (PPIS), Chemical Review Management System (CRMS), FIFRA CBI Access (FAS),
Jackets, Product Data Call-In (PDCI), Phones, Pesticide Regulatory Action Tracking (PRAT), Reference System (REFS), Tolerance
Indexes (TIS and TOTS). Sources of the input are paper copy and electronic data. EPA's Central Data Exchange (CDX), scheduled as EPA
097, is the gateway for electronic submissions. It consolidates information stored on the mainframe, the OPP LAN, on stand-alone
computers and in paper copy. PRISM (Pesticide Registration Information System) consolidates various pesticides program databases.

EPA recently constructed a module in PRISM tracking major Registration Review milestones. This module enhances tracking capabilities
and is an important management tool.

For information on disposition of records in this database, please see EPA Records Schedule 329,
http://www.epa.gov/records/policy/schedule/sched/329.htm

OPP adheres to its Quality Management Plan (Nov. 2006) in ensuring data quality and that procedures are properly applied.


PRISM was developed between 1997 and 2003 and has been operational since June 2, 2003. PRISM provides e-government capabilities to
share pesticide information with OPP stakeholders. PRISM supports OPP's responsibilities under a variety of regulatory requirements
including FIFRA, FQPA, PRIA, PRIA II (the Pesticide Registration Improvement Renewal Act), Pesticide Registration Review, and the
Endocrine Disruptor Screening Program (EDSP), and will standardize the structure of a chemical case where appropriate to define the key
tasks and documents used in a number of pesticide review processes. EDSP components are used to order, monitor, track, and manage
scientific tests associated with pesticide chemicals.

PRISM was developed in response to the requirements of the following laws and regulations:

•       The Title III of the E-Government Act of 2002 - Federal Information Security Management Act (FISMA) - Public Law 107-347:
A security plan must be developed and practiced throughout all life cycles of the agency's information systems.


•       Office of Management and Budget (OMB) Circular A-130, Management of Federal Information Resources: A System Security
Plan (SSP) is to be developed and documented for each GSS and Major Application (MA) consistent with guidance issued by the National
Institute of Standards and Technology (NIST).


•       Federal Information Processing Standards (FIPS) Publication 199, Standards for Security Categorization of Federal Information
and Information Systems: This document defines standards for the security categorization of information and information systems. System
security categorization must be included in  SSPs.


•       FIPS Publication 200, Minimum Security Requirements for Federal Information and Information Systems: This document contains
information regarding specifications for minimum security control requirements for federal information and information systems. Minimum
security controls must be documented in SSPs.


•       NIST Special Publication (SP) 800-18 Revision  1, Guide for Developing Security Plans for Federal Information Systems: The
minimum standards for an SSP are provided in this NIST document.


•       NIST SP 800-53, Revision 3, Recommended Security Controls for Federal Information Systems and Organizations: This document
contains a list of security controls that are to be implemented  into federal information systems based on their FIPS  199 categorization. This
document is used in conjunction with FIPS  200 to define minimum security controls,  which must be documented in SSPs.


•       EPA Information Security Planning Policy: A system security plan shall be developed for each system cited on the EPA Inventory
of Major Information Systems, including major applications and general support systems.

Most, if not all, of PRISM data should be considered "source" data. This means that these data originate from primary data providers,
particularly pesticide product registrants, submitting information sent to EPA directly in response to FIFRA regulatory requirements.

PRISM contains source data, and from this source data certain dates, such as the due date, are calculated automatically.

3b. Data Quality Procedures	
OPP adheres to its Quality Management Plan (Nov. 2006) in ensuring data quality and that procedures are properly applied.
3c. Data Oversight
Branch Chief, Financial Management and Planning Branch.
3d. Calculation Methodology
Unit of analysis: Percent

The percent completed on time is calculated by taking the total number of decisions or actions completed or withdrawn on or before their
due date and dividing by the total number of decisions or actions completed or withdrawn within the date range specified.
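A minimal sketch of this calculation, assuming hypothetical (completed_date, due_date) pairs for all actions completed or withdrawn
within the specified date range:

    # Percent of decisions completed or withdrawn on or before the PRIA or
    # negotiated due date.
    def percent_on_time(decisions):
        decisions = list(decisions)
        on_time = sum(1 for done, due in decisions if done <= due)
        return 100.0 * on_time / len(decisions) if decisions else 0.0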
4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting	
Planning and Accountability Lead in the Resource Management Staff in the Office of Program Management Operations.  Reporting
semiannually: mid-year and end-of-year.	
4b. Data Limitations/Qualifications	
No Data Limitations.

4c. Third-Party Audits
Not applicable.
 Record Last Updated: 02/13/2012 01:16:48 PM
  Goal 4                                                Objective 1                                           Measure 164
  Measure Code  : 164 - Number of pesticide registration review dockets opened.
  Office of Chemical Strategies and Pollution Prevention (OCSPP)
   1.  Measure and DQR Metadata
   Goal Number and Title
4 - Ensuring the Safety of Chemicals and Preventing Pollution
   Objective Number and Title
1 - Ensure Chemical Safety
   Sub-Objective Number and Title
                                                1 - Protect Human Health from Chemical Risks
   Strategic Target Code and Title
5 - By 2014, reduce concentration of targeted chemicals in children
   Managing Office
   Office of Pesticide Programs
   Performance Measure Term Definitions
Registration Review dockets: EPA initiates a registration review by establishing a docket for a pesticide registration review case and
opening the docket for public review and comment. Each docket contains a Summary Document that explains what information EPA has on
the pesticide and the anticipated path forward. The Summary Document includes:

•       A Preliminary Work Plan highlighting anticipated risk assessment and data needs, providing an anticipated timeline for completing
the pesticide's review, and identifying the types of information that would be especially useful to the Agency in conducting the review;

•       A fact sheet providing general background information and summarizing the current status of the pesticide;

•       Ecological risk assessment problem formulation and human health scoping sections describing the data and scientific analyses
expected to be necessary to complete the pesticide's registration review.

Opened: EPA initiates a registration review by establishing a docket for a pesticide registration review case and opening the docket for
public review and comment. The Agency publishes a Federal Register notice that announces the availability of the docket and provides a
comment period of at least 60 days. See http://www.epa.gov/oppsrrd1/registration_review/reg_review_process.htm for more information.

Background:
The Food Quality Protection Act of 1996 directed EPA to establish a Registration Review program with the goal of reviewing all registered
pesticides, AIs and products,  on a 15-year cycle to ensure that they continue to meet the standards of registration. EPA issued the final rule
in 2006 and began implementing the program in 2007. Under the rule, EPA posts registration review schedules, and these provide a
baseline for the expected AI case dockets that will be opened over the next three-year cycle and for decisions expected over the next
several years. The first step of Registration Review is to open a public docket for each pesticide case entering the process to show the
public what the Agency knows about the AI and to seek comment. When comments are evaluated and data needs are finalized, OPP posts
a Final Work Plan (FWP) for each AI case. Both docket openings and FWPs are tracked, and both steps require notable resources to
complete.

All registrations must be based on sound science and meet the Food Quality Protection Act (FQPA) safety standard. All risk assessments are
subject to public and scientific peer review. In addition, OPP management reviews and signs new documents before being placed in the
docket or posted on EPA's website.

For more information, see:
http://www.epa.gov/oppsrrd1/registration_review/


 2. Data Definition and  Source  Reporting	
2a. Original Data Source
 OPP staff, working collaboratively across the program, develop the draft preliminary work plan taking into account existing policies,  data
 requirements, and standard operating procedures.
2b. Source Data Collection
Each preliminary work plan is approved by the Director of the appropriate OPP division (Antimicrobial Division, Biopesticides and Pollution
 Prevention Division, and Pesticide Re-evaluation Division). All preliminary work plans are included in the docket for that registration
 review case and are available via the pesticide program website at http://www.epa.gov/pesticides.	
 2c. Source Data Reporting
 Form/mechanism for receiving data and entering into EPA system: As described in 2b, all preliminary work plans are posted to the
 docket for that registration review case and are available via the pesticide program website.  Counts for preliminary work plans completed
 are tracked and tabulated in a master spreadsheet maintained by the Pesticide Re-evaluation Division.

 Timing and frequency of reporting: Preliminary work plans are developed on a quarterly basis. Counts of actions completed are
 available at the end of each quarter.	


 3. Information Systems and Data Quality Procedures	
3a. Information Systems
 The Office of Pesticide Programs (OPP) has migrated all of its major data systems including regulatory and scientific data, workflow
 tracking and electronic document management into one integrated system, the Pesticide Registration Information System (PRISM). PRISM
 provides a centralized source of information on all registered pesticide products, including chemical composition, toxicity, name and
 address of registrant, brand names, registration actions, and related data. It is maintained by the EPA and tracks regulatory  data submissions
 and studies, organized by scientific discipline, which are submitted by the registrant in support of a pesticide's registration. All registration
 actions received under the PRIA and PRIA 2 are entered and tracked in PRISM.

 PRISM is the successor to the Office of Pesticide Programs Information System Network (OPPIN). Data has been migrated from the
 following databases:  Chemical Vocabulary (CV), Company Name and Address (CNAD), Pesticide Document Management System
(PDMS), Pesticide Product Information System (PPIS), Chemical Review Management System (CRMS), FIFRA CBI Access (FAS),
Jackets, Product Data Call-In (PDCI), Phones, Pesticide Regulatory Action Tracking (PRAT), Reference System (REFS), Tolerance
Indexes (TIS and TOTS). Sources of the input are paper copy and electronic data. EPA's Central Data Exchange (CDX), scheduled as EPA
097, is the gateway for electronic submissions. It consolidates information stored on the mainframe, the OPP LAN, on stand-alone
computers and in paper copy. PRISM (Pesticide Registration Information System) consolidates various pesticides program databases.

EPA recently constructed a module in PRISM tracking major Registration Review milestones. This module enhances tracking capabilities
and is an important management tool.

For information on  disposition of records in this database, please see EPA Records  Schedule 329,
http://www.epa.gov/records/policy/schedule/sched/329.htm


PRISM was developed between 1997 and 2003 and has been operational since June 2, 2003. PRISM provides e-government capabilities to
share pesticide information with OPP stakeholders. PRISM supports OPP's responsibilities under a variety of regulatory requirements
including FIFRA, FQPA, PRIA, PRIA II (the Pesticide Registration Improvement Renewal Act), Pesticide Registration Review, and the
Endocrine Disruptor Screening Program (EDSP), and will standardize the structure of a chemical case where appropriate to define the key
tasks and documents used in a number of pesticide review processes. EDSP components are used to order, monitor, track, and manage
scientific tests associated with pesticide chemicals.

PRISM was developed in response to the requirements of the following laws and regulations:

•       The Title III of the E-Government Act of 2002 - Federal Information Security Management Act (FISMA) - Public Law 107-347:
A security plan must be developed and practiced throughout all life cycles of the agency's information systems.


•       Office of Management and Budget (OMB) Circular  A-130, Management of Federal Information Resources: A System Security
Plan (SSP) is to be developed and documented for each GSS and Major Application (MA) consistent with guidance issued by the National
Institute of Standards and Technology (NIST).


•       Federal Information Processing Standards (FIPS) Publication 199, Standards for Security Categorization of Federal Information
and Information Systems: This document defines standards for the security categorization of information and information systems. System
security categorization must be included in  SSPs.


•       FIPS Publication 200, Minimum Security Requirements for Federal Information and Information Systems: This document contains
information regarding specifications for minimum security control requirements for federal information and information systems. Minimum
security controls must be documented in SSPs.


•       NIST Special Publication (SP) 800-18 Revision 1, Guide for Developing Security Plans for Federal Information Systems: The
minimum standards for an SSP are provided in this NIST document.


•       NIST SP 800-53, Revision 3, Recommended Security Controls for Federal Information Systems and Organizations: This document
contains a list of security controls that are to be implemented into federal information systems based on their FIPS 199 categorization. This
document is used in conjunction with FIPS 200 to define minimum security controls, which must be documented in SSPs.


•       EPA Information Security Planning Policy: A system security plan shall be developed for each system cited on the EPA Inventory
of Major Information Systems, including major applications and general support systems.

Most, if not all, PRISM data should be considered "source" data: these data originate from primary data providers, primarily pesticide
product registrants, who submit information directly to EPA in response to FIFRA regulatory requirements.

3b. Data Quality Procedures
OPP adheres to its Quality Management Plan (Nov. 2006) to ensure data quality and the proper application of procedures.

3c. Data Oversight
 Branch Chief, Financial Management and Planning Branch	
3d. Calculation Methodology
Identification of Unit of Measure and Timeframe:  Timeframe is the fiscal year. Unit of measure is the number of preliminary work plans
completed each year.	
4. Reporting and  Oversight
4a. Oversight and Timing of Results Reporting
Planning and Accountability Lead in the Resource Management Staff in the Office of Program Management Operations. Reporting
semiannually: mid-year and end-of-year.	
4b. Data Limitations/Qualifications	
No data limitations.	
4c. Third-Party Audits
Not applicable.	


Record Last Updated: 02/13/2012 01:16:50 PM
  Goal 4                                                Objective 1                                           Measure 230
  Measure Code : 230 - Number of pesticide registration review final work plans
  completed.
  Office of Chemical Strategies and Pollution Prevention (OCSPP)
   1.  Measure and DQR Metadata
   Goal Number and Title
4 - Ensuring the Safety of Chemicals and Preventing Pollution
   Objective Number and Title
1 - Ensure Chemical Safety
   Sub-Objective Number and Title
2 - Protect Ecosystems from Chemical Risks
   Strategic Target Code and Title
1 - By 2015, no watersheds will exceed aquatic life benchmarks for targeted pesticides
   Managing Office
   Office of Pesticide Programs
   Performance Measure Term Definitions
 Registration Review dockets: EPA initiates a registration review by establishing a docket for a pesticide registration review case and
opening the docket for public review and comment. Each docket contains a Summary Document that explains what information EPA has on
the pesticide and the anticipated path forward. The Summary Document includes:

•       A Preliminary Work Plan highlighting anticipated risk assessment and data needs, providing an anticipated timeline for completing
the pesticide's review, and identifying the types of information that would be especially useful to the Agency in conducting the review;

•       A fact sheet providing general background information and summarizing the current status of the pesticide;

•       Ecological risk assessment problem formulation and human health scoping sections describing the data and scientific analyses
expected to be necessary to complete the pesticide's registration review.

Completed: After the closure of the public comment period for the preliminary work plan, EPA reviews those comments and revises (as
necessary) the work plan, resulting in the issuance of a final work plan.  See
http://www.epa.gov/oppsrrd1/registration_review/reg_review_process.htm for more information.

Background:
The Food Quality Protection Act of 1996 directed EPA to establish a Registration Review program with the goal of reviewing all registered
pesticides, both active ingredients (AIs) and products, on a 15-year cycle to ensure that they continue to meet the standards of
registration. EPA issued the final rule in 2006 and began implementing the program in 2007. Under the rule, EPA posts registration review
schedules; these provide a baseline for the AI case dockets expected to open over the next three-year cycle and for the decisions expected
over the next several years. The first step of Registration Review is to open a public docket for each pesticide case entering the process,
to show the public what the Agency knows about the AI and to seek comment. When comments are evaluated and data needs are finalized, OPP posts a Final Work
Plan (FWP) for each AI case. Both the docket openings and the FWPs are tracked; each step requires notable resources to complete.

All registrations must be based on sound science and meet the Food Quality Protection Act (FQPA) safety standard. All risk assessments are
subject to public and scientific peer review. In addition, OPP management reviews and signs new documents before they are placed in the
docket or posted on EPA's website.

For more information, see:
http://www.epa.gov/oppsrrd1/registration_review/
 2. Data Definition and Source Reporting
 2a. Original Data Source
 OPP staff, working collaboratively across the program, review the public comments and develop the draft final work plan taking into
 account existing policies, data requirements, and standard operating procedures.	
 2b. Source Data Collection
 Each final work plan is approved by the Director of the appropriate OPP division (Antimicrobials Division, Biopesticides and Pollution
 Prevention Division, or Pesticide Re-evaluation Division). All final work plans are included in the docket for that registration review
 case and are available via the pesticide program website at http://www.epa.gov/pesticides.	
 2c. Source Data Reporting
 Form/mechanism for receiving data and entering into EPA system:  As described in 2b, all final work plans are posted to the docket
 for that registration review case and are available via the pesticide program website.  Counts for final work plans completed are tracked and
 tabulated in a master spreadsheet maintained by the Pesticide Re-evaluation Division.

 Timing and frequency of reporting: Final work plans are developed on a quarterly basis. Counts of actions completed are available at
 the end of each quarter.	


 3.  Information Systems and Data  Quality Procedures	
 3a.  Information Systems
 The Office of Pesticide Programs (OPP) has migrated all of its major data systems including regulatory and scientific data, workflow
 tracking and electronic document management into one integrated system, the Pesticide Registration Information System (PRISM). PRISM
 provides a centralized source of information on all registered pesticide products, including chemical composition, toxicity, name and
 address of registrant, brand names, registration actions, and related data. It is maintained by the EPA and tracks regulatory data submissions
 and studies, organized by scientific discipline, which are submitted by the registrant in support of a pesticide's registration. All registration
 actions received under the PRIA and PRIA 2 are entered and tracked in PRISM.

 PRISM is the successor to the Office of Pesticide Programs Information System Network (OPPIN). Data has been migrated from the
following databases: Chemical Vocabulary (CV), Company Name and Address (CNAD), Pesticide Document Management System
(PDMS), Pesticide Product Information System (PPIS), Chemical Review Management System (CRMS), FIFRA CBI Access (FAS),
Jackets, Product Data Call-In (PDCI), Phones, Pesticide Regulatory Action Tracking (PRAT), Reference System (REFS), Tolerance
Indexes (TIS and TOTS). Sources of the input are paper copy and electronic data. EPA's Central Data Exchange (CDX), scheduled as EPA
097, is the gateway for electronic submissions. It consolidates information stored on the mainframe, the OPP LAN, on stand-alone
computers and in paper copy. PRISM (Pesticide Registration Information System) consolidates various pesticides program databases.

EPA recently constructed a module in PRISM that tracks major Registration Review milestones. This module enhances tracking capabilities
and is an important management tool.

For information on  disposition of records in this database, please see EPA Records Schedule 329,
http://www.epa.gov/records/policy/schedule/sched/329.htm


PRISM was developed between 1997 and 2003 and has been operational since June 2, 2003. PRISM provides e-government capabilities to
share pesticide information with OPP stakeholders. PRISM supports OPP's responsibilities under a variety of regulatory requirements,
including FIFRA, FQPA, PRIA, PRIA II (the Pesticide Registration Improvement Renewal Act), pesticide Registration Review, and the
Endocrine Disrupter Screening Program (EDSP), and standardizes the structure of a chemical case, where appropriate, to define the key
tasks and documents used in a number of pesticide review processes. EDSP components are used to order, monitor, track, and manage
scientific tests associated with pesticide chemicals.

PRISM was developed in response to the requirements of the following laws and regulations:

•       Title III of the E-Government Act of 2002 - Federal Information Security Management Act (FISMA) - Public Law 107-347:
A security plan must be developed and practiced throughout all life cycles of the agency's information systems.


•       Office of Management and Budget (OMB) Circular A-130, Management of Federal Information Resources:  A System Security
Plan (SSP) is to be developed and documented for each GSS and Major Application (MA) consistent with  guidance issued by the National
Institute of Standards and Technology (NIST).


•       Federal Information Processing Standards (FIPS) Publication 199, Standards for Security Categorization of Federal Information
and Information Systems: This document defines standards for the security categorization of information and information systems. System
security categorization must be included in SSPs.


•       FIPS Publication 200,  Minimum Security Requirements for Federal Information and Information Systems: This document contains
information regarding specifications for minimum security control requirements for federal information and information systems. Minimum
security controls must be documented in SSPs.


•       NIST Special Publication (SP) 800-18 Revision 1, Guide for Developing Security Plans for Federal Information Systems: The
minimum standards for an SSP are provided in this NIST document.


•       NIST SP 800-53, Revision 3, Recommended Security Controls for Federal Information Systems and Organizations: This document
contains a list of security controls that are to be implemented into federal information systems based on their FIPS  199 categorization. This
document is used in conjunction with FIPS 200 to define minimum security controls, which must be documented in SSPs.

•       EPA Information Security Planning Policy: A system security plan shall be developed for each system cited on the EPA Inventory
of Major Information Systems, including major applications and general support systems.

Most, if not all, PRISM data should be considered "source" data: these data originate from primary data providers, primarily pesticide
product registrants, who submit information directly to EPA in response to FIFRA regulatory requirements.
3b. Data Quality  Procedures
OPP adheres to its Quality Management Plan (Nov. 2006) to ensure data quality and the proper application of procedures.

3c. Data Oversight	
Branch Chief, Financial Management and Planning Branch

3d. Calculation Methodology
Timeframe is the fiscal year. Unit of measure is the number of final work plans completed each year.	
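As an illustration only, the tally behind this measure could be reproduced from a dated export of the master tracking spreadsheet described in 2c. The sketch below is a hypothetical reconstruction, not an EPA script; the CSV layout and the "fwp_completed_date" column name are assumptions.

    # Minimal sketch (Python): count final work plans (FWPs) completed in a
    # given U.S. federal fiscal year. The CSV export and column name are
    # assumptions for illustration; they are not documented in this DQR.
    import csv
    from datetime import datetime

    def fiscal_year(d):
        # FY N runs October 1 of year N-1 through September 30 of year N.
        return d.year + 1 if d.month >= 10 else d.year

    def count_fwps(path, fy):
        count = 0
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                completed = row.get("fwp_completed_date", "").strip()
                if not completed:
                    continue  # work plan not yet final
                if fiscal_year(datetime.strptime(completed, "%m/%d/%Y")) == fy:
                    count += 1
        return count

    # Example: count_fwps("registration_review_tracking.csv", 2011)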
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Planning and Accountability Lead in the Resource Management Staff in the Office of Program Management Operations. Reporting
semiannually: mid-year and end-of-year.	
4b. Data Limitations/Qualifications	
No data limitations.	
4c. Third-Party Audits
Not applicable.	


Record Last Updated: 02/13/2012 01:16:50 PM
  Goal 4                                               Objective 1                                          Measure C18
  Measure Code : C18 - Percentage of existing CBI claims for chemical identity in
  health and safety studies reviewed and challenged, as appropriate.
  Office of Chemical Strategies and Pollution Prevention (OCSPP)
   1. Measure and  DQR Metadata
   Goal Number and Title                           4 - Ensuring the Safety of Chemicals and Preventing Pollution
   Objective Number and Title	1 - Ensure Chemical Safety	
   Sub-Objective Number and Title                   3 - Ensure Transparency of Chemical Health and Safety Information
   Strategic Target Code and Title                    1 - By 2015, make all health and safety information available to the public for chemicals in commerce
   Managing Office                                  Office of Pollution Prevention and Toxics
   Performance Measure Term Definitions
Existing CBI claims: Under TSCA, companies may claim that information they submit to EPA should be treated as "confidential business
information" (CBI) and not be disclosed to the public. "Existing" CBI claims are CBI claims in TSCA filings presently in the possession of
the Agency.

Health and safety studies: EPA will begin a general practice of reviewing confidentiality claims for chemical identities in health and safety
studies, and in data from health and safety studies, submitted under the Toxic Substances Control Act (TSCA) in accordance with Agency
regulations at 40 CFR part 2, subpart B.

Reviewed and, as appropriate, challenged: To achieve this measure, EPA must complete the following actions for new and historical
submissions by the end of 2015: 1) determine if a challenge to the CBI claim is warranted; 2) execute the challenge; and 3) where legally
defensible, declassify the information claimed as CBI. Section 14(b) of TSCA does not extend confidential treatment to health and safety
studies, or data from health and safety studies, which, if made public, would not disclose processes used in the manufacturing or processing
of a chemical substance or mixture or, in the  case of a mixture, the release of data disclosing the portion of the mixture comprised by any of
the chemical substances in the mixture. Where a chemical identity does not explicitly contain process information or reveal portions of a
mixture, EPA expects to find that the information would clearly not be entitled to confidential treatment. Where EPA determines that the
information is not eligible for confidential treatment, the Agency will notify companies, and in those instances where the company will not
voluntarily relinquish the claims, EPA may initiate administrative action in  accordance with Section 14 of TSCA.

Background:
   This performance measure supports EPA's strategic measure through 2015 to make all health and safety studies available to the public
for chemicals in commerce, to the extent allowed by law. For pesticides, EPA will continue to make risk assessments and supporting
information available through its long standing Public Participation Process.
   The effort has involved the tightening of EPA's CBI policies, first on January 21, 2010, when EPA said it planned to reject CBI claims
for chemicals submitted to EPA with studies that show a substantial risk to people's health and the environment and that have been
previously disclosed on the TSCA Chemical Inventory. In a follow-up policy change issued May 27, 2010, EPA said it planned to generally
deny confidentiality claims for the identity of chemicals in health and safety studies filed under TSCA, except in specified
circumstances.
   Health and safety information differs greatly in complexity; consequently, declassification may occur rapidly in some areas but
take longer in others to reach completion.

For more information, please see:
(1)     http://www.epa.gov/oppt/tsca8e/pubs/confidentialbusinessinformation.html
(2)     http://www.epa.gov/oppt/existingchemicals/pubs/transparency.html
(3)     http://www.regulations.gov/#!documentDetail;D=EPA-HQ-OPPT-2010-0446-0001
 2. Data Definition and Source Reporting
 2a. Original Data Source
 Data are provided by EPA Headquarters Staff.
 2b. Source Data Collection
 Historical data used to identify existing CBI health and safety data will come from staff- and contractor-maintained internal databases. The
 Agency relies on existing databases that track TSCA filings and identify data elements for each document, including CBI claims. These
 databases are used for a wide variety of purposes relating to TSCA implementation. Quality controls include standard operating
 procedures related to data processing. This process is enhanced, in the CBI review context, by reviews of the actual data (hard copy,
 microfiche and pdf) to ensure that what is tracked is consistent with the submitted filings.	
 2c. Source Data Reporting
 EPA receives information by paper or electronic submission under the authority of TSCA. CBI reviews are initiated under specific
 regulatory authority (e.g., 40 CFR part 720, 40 CFR part 2). EPA receives materials pursuant to TSCA on a daily basis.
 3. Information Systems and Data Quality Procedures	
 3a. Information Systems
 OPPT has developed a CBI declassification tracking system, which includes the records identified for review, date of receipt, review status,
 claim validation, letter or call sent, 2.204(d)(2) action, and declassification status. For chemicals in 8(e) filings the system will also track
 whether the chemical name has process or portion-of-mixture information and whether it is claimed as research and development (R&D) or as a
 pesticide. Data elements used to track the declassification studies will consist of new process-specific elements input by reviewers and
 elements traditionally associated with studies that were input to OPPT databases. The tracking system is limited to tracking filings, CBI
 review status, and other data elements consistent with internal management of the program. The system does not contain any transformed data.

3b. Data Quality Procedures
EPA staff will ensure the number of health and safety studies reviewed is equal to or less than the total number of health and safety studies
received in the fiscal year for both measures, combined. EPA reviews all subject filings with CBI claims to ensure that such claims are
consistent with provisions of statute and Agency policy. Participants in the review include legal and technical staff who rely on advice
from EPA's Office of General Counsel.
3c. Data Oversight
Branch Chief, Planning and Assessment Branch

3d. Calculation Methodology
The baseline assumes that between the enactment of TSCA and August 21, 2010, 22,483 CBI cases potentially containing TSCA health and
safety information were submitted for chemicals potentially in commerce.  In recent years, hundreds of such cases have been submitted
annually.

EPA has identified all filings and cases subject to review and placed these in a tracking database. As filings and cases are reviewed and
appropriate actions undertaken, the database is updated to capture the new status. Thus, the number of CBI claims that are reviewed and
challenged can be readily obtained from the database and expressed as a percentage of all existing CBI claims in health and safety studies.
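Expressed as a computation, the reported result is simply the reviewed-and-challenged count divided by the baseline universe of claims. The sketch below assumes the 22,483-case baseline stated above; the status string is a hypothetical tracking-database value, not a documented field.

    # Minimal sketch (Python): percentage of existing CBI claims in health and
    # safety studies reviewed and challenged. The 22,483 baseline comes from
    # the text above; the status label is an assumption for illustration.
    BASELINE_CASES = 22483

    def percent_reviewed_and_challenged(statuses):
        """statuses: one status string per case from the tracking database."""
        done = sum(1 for s in statuses if s == "reviewed_and_challenged")
        return 100.0 * done / BASELINE_CASES

    # Example: 4,500 completed cases -> about 20.0 percent
    # percent_reviewed_and_challenged(["reviewed_and_challenged"] * 4500)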
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
 Planning and Accountability Lead in the Resource Management Staff in the Office of Program Management Operations. Reporting
semiannually: mid-year and end-of-year.

4b. Data Limitations/Qualifications
Data limitations include:
•          Some archived data may have been lost or damaged.
•          The DTS database does not differentiate between types of CBI claims, so some studies tracked in the DTS system may, in
theory, already be public.
•          Some submissions may be redundant due to overlap in processing.
•          Other limitations are expected.
There is no estimate of the number of errors that could have been made during data entry.
4c. Third-Party Audits
OIG published a report in 2010 that found CBI claims were excessive and encouraged EPA to increase public access to information filed
under TSCA. For more information, see http://www.epa.gov/oig/reports/2010/20100217-10-P-0066.pdf.


Record Last Updated: 02/13/2012 01:16:59 PM
  Goal 4                                               Objective 1                                          Measure E01
  Measure Code : E01 - Number of chemicals for which Endocrine Disrupter
  Screening Program (EDSP) decisions have been completed
  Office of Chemical Strategies and Pollution Prevention  (OCSPP)
   1. Measure and  DQR Metadata
   Goal Number and Title                           4 - Ensuring the Safety of Chemicals and Preventing Pollution
   Objective Number and Title	    1 - Ensure Chemical Safety
   Sub-Objective Number and Title
1 - Protect Human Health from Chemical Risks
   Strategic Target Code and Title                    6 - By 2015, complete Endocrine Disrupter Screening Program (EDSP)
   Managing Office                                  Office of Science Coordination and Policy
   Performance Measure Term Definitions
Chemicals: The initial pesticide chemicals to be screened in the EDSP.

Endocrine Disrupter Screening Program Decisions:  This measure deals with data generated by evaluating chemicals via the Endocrine
Disrupter Screening Program's (EDSP) Tier 1 screening assays, which are designed to identify compounds that have the potential to interact
with the endocrine system.  These decisions take into consideration Tier 1 screening battery data, other scientifically relevant information
(OSRI), and/or the regulatory status of a chemical, as applicable. The decisions counted via this measure range from determining whether a
chemical has the potential to interact with the Estrogen (E), Androgen (A), or Thyroid (T) hormone systems to otherwise determining
whether further endocrine-related testing is necessary because EPA has made a regulatory determination to remove a chemical from further
consideration or a test order recipient has announced that it is discontinuing the manufacture and import of the chemical. This measure is
a count of these decisions.
• EDSP decisions for a particular chemical (in Tier 1) can be organized into two broad categories: (1) regulatory actions and (2)
determinations regarding potential to interact with E, A, or T. In both cases, the decisions will determine whether further endocrine related
testing is necessary for that chemical.
• There are several regulatory actions that will remove a chemical from further consideration for endocrine related testing in the EDSP.
These include, but would not be limited to, cancellation of pesticide registrations, ceasing sales of the chemical for use in pesticide products,
and discontinuing the manufacture and import of the chemical. These actions may be voluntary on the part of a Tier 1 test order recipient or
the result of an EPA regulatory determination. In either case, when such regulatory  decisions have been completed for a chemical in Tier 1
of the EDSP, that  chemical will be counted for this measure.

Completed:  Chemicals which EPA judges to have been fully assessed using the Endocrine Disrupter Screening Program's  (EDSP) Tier 1
screening assays and other scientifically relevant information (as applicable) for their potential to interact with E, A, or T, will be counted
for this measure.

Chemicals are also counted for this measure when EPA makes a regulatory determination to remove a chemical from further consideration
for endocrine-related testing in the EDSP. This can be due to cancellation of pesticide registrations or because a Tier 1 test order recipient
makes a voluntary decision to discontinue the manufacture and import of the chemical.

Decisions will be counted once EPA announces them via updates to the EDSP website
(http://www.epa.gov/endo/pubs/EDSP_OSRI_Response_Table.pdf).

Background:
• EPA anticipates that an increasing proportion of the resources allocated to the EDSP will be used for review of EDSP submissions of Tier
1 screening battery data in FY 2012. As a result, a measure based on the number of chemicals for which EDSP decisions have been
completed captures an important shift in resource utilization for the program.
• In general, it is anticipated that the EDSP decisions will vary from chemical to chemical with respect to complexity and timing.
Therefore, careful analysis will be needed in setting performance targets each year. It is anticipated that annual performance targets will be
established by considering (to the extent practicable) the number of chemicals for which EDSP Tier 1 test orders have been issued, the
identity of the chemicals, the number of Tier 1 test order recipients, any other available chemical specific information and EPA resources
available to complete data evaluations. However, several factors remain unpredictable and will impact the schedule for completing EDSP
decisions. These include, for example, the number of pesticide cancellations and other regulatory actions that may remove a chemical from
commerce and/or discontinue manufacture and import (voluntary and enforced), unforeseen  laboratory capacity limits, and unforeseen
technical problems with completing the Tier 1 assays for a particular chemical.  Each of these factors can move the timeline for completing
an EDSP decision for a particular chemical beyond the fiscal year in which the decision was originally anticipated.
• This performance measure is best used in conjunction with another EDSP annual performance measure (Number of chemicals for which
EDSP Tier 1 test orders have been issued).  Measuring the number of chemicals for which EDSP Tier 1 test orders have been issued will,
together with additional chemical-specific information, help set performance targets for the number of chemicals for which EDSP decisions
have been completed.
• Endocrine Disrupter Screening Program; Second List of Chemicals for Tier 1 Screening [Federal Register Notice: November 17, 2010
(Volume 75, Number 221, pages 70248-70254)]
http://www.regulations.gov/contentStreamer?disposition=attachment&objectId=0900006480b954bf&contentType=html
• http://www.epa.gov/endo/ (including Highlights box on right side of page)
• http://www.epa.gov/endo/pubs/edspoverview/background.htm
• http://www.epa.gov/endo/pubs/EDSP_OSRI_Response_Table.pdf
 2. Data  Definition and Source Reporting
 2a. Original Data Source
 EPA staff, including scientists and regulatory managers from relevant program offices, are responsible for making and documenting the
 decisions.	

2b. Source Data Collection
Source Data Collection Methods: The decisions will take into consideration Tier 1 screening battery data, other scientifically relevant
information (OSRI), and/or the regulatory status of a chemical, as applicable.

EPA has developed guidance on how to conduct the Weight of Evidence (WoE) analysis that will lead to decisions about whether
chemicals have the potential to interact with E, A, or T and whether further testing will be required (see Highlights box at
http://www.epa.gov/endo).

Date/time intervals covered by source data: FY 2012-present	
2c. Source Data  Reporting
Form/mechanism for receiving data and entering into EPA system: EPA has created and is maintaining an on-line report for tracking
the status of chemicals screened in the EDSP  (see Highlights box at http://www.epa.gov/endo). The report includes for each chemical: the
date a test order was issued, to whom the test order was issued, the due date for completing and submitting the data, the recipient's
response to the order, and regulatory status (e.g., pesticide registration cancelled), as appropriate.  In addition, the report will include
information on EDSP decisions.
Decisions will be counted once EPA announces them via updates to the EDSP website
(http://www.epa.gov/endo/pubs/EDSP_OSRI_Response_Table.pdf).

Timing and frequency of reporting: Annual
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
EPA has created and is maintaining an on-line report for tracking the status of chemicals screened in the EDSP (see Highlights box at
http://www.epa.gov/endo). The report includes for each chemical: the date a test order was issued, to whom the test order was issued, the
due date for completing and submitting the data, the recipient's response to the order, and regulatory status (e.g., pesticide registration
cancelled), as appropriate.  In addition, the report will include information on EDSP decisions.  EPA anticipates expanding this report to
include chemicals other than pesticides.

Additional information:
Since the data will correspond to the on-line reporting on the status of chemicals in the EDSP, the public and other interested parties will  be
able to easily  determine the accuracy of the reported results.
3b. Data Quality Procedures
Data on the number of decisions generated for this measure will be reviewed for accuracy before being submitted.

The number of chemicals for which EDSP Tier 1 decisions have been completed can be checked against supporting records documenting
the decisions.	
3c. Data Oversight
Deputy Director, Office of Science Coordination and Policy
3d. Calculation Methodology
Unit of analysis: Number of chemicals for which Endocrine Disrupter Screening Program (EDSP) Tier 1 decisions have been completed.


4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting	
Planning and Accountability Lead in the Resource Management Staff in the Office of Program Management Operations.  Reporting
semiannually: mid-year and end-of-year.	
4b. Data Limitations/Qualifications	
Decisions are based on EPA regulatory actions and data review once data are received; thus, minimal error is anticipated with this estimate.
4c. Third-Party Audits

The American Society of Human Genetics, the American Society for Reproductive Medicine, the Endocrine Society, the Genetics Society
of America, the Society for Developmental Biology, the Society for Pediatric Urology, the Society for the Study of Reproduction, and the
Society for Gynecologic Investigation. Assessing Chemical Risk: Societies Offer Expertise. Science, March 3, 2011. DOI:
10.1126/science.331.6021.1136-a or http://www.sciencemag.org/content/331/6021/1136.1.
 Record Last Updated: 02/13/2012 01:16:59 PM
  Goal 4                                               Objective 1                                           Measure E02
  Measure Code : E02 - Number of chemicals  for which EDSP Tier 1 test orders have
  been issued
  Office of Chemical Strategies and Pollution Prevention (OCSPP)
   1. Measure and  DQR Metadata
   Goal Number and Title                           4 - Ensuring the Safety of Chemicals and Preventing Pollution
   Objective Number and Title	1 - Ensure Chemical Safety	
   Sub-Objective Number and Title                   1 - Protect Human Health from Chemical Risks
   Strategic Target Code and Title	
   Managing Office                                  Office of Science Coordination and Policy
   Performance Measure Term Definitions
Chemicals: The initial pesticide chemicals to be screened in the EDSP.

EDSP Tier 1:  The Endocrine Disrupter Screening Program's (EDSP) Tier 1 screening assays, which are designed to identify compounds
that have the potential to interact with the body's endocrine system.

Test orders: The initial issuance of orders to conduct EDSP Tier 1 screening tests to entities initially identified by EPA as being the
producers of specific chemicals that may have the potential to interact with the estrogen, androgen, or thyroid hormone systems, all of
which are part of the endocrine system.

Issued: Issuance of EDSP Tier 1 test orders follows the policies and procedures that are described in detail in the Federal Register  at
74FR17560.  For the purpose of this measure, completing the issuance of Tier 1 test orders for a particular chemical will be defined as
completing the initial issuance of orders to the order recipients initially identified by EPA.  Subsequent issuance of orders to recipients who
were not initially identified by EPA or to recipients who become subject to EDSP requirements after the initial issuance of test orders
(referred to as "catch up" orders) will not be considered in this measure.

Background:
• Tier 1 screening will include a battery of screening assays that would identify substances with the potential to interact with the estrogen,
androgen, or thyroid hormone systems, according to text at http://www.epa.gov/endo/pubs/edspoverview/components.htm.
• Consistent with EPA plans to integrate the EDSP Tier 1  test orders into the pesticide registration review process, issuance of test orders
for additional chemicals (including industrial chemicals that are water contaminants) is expected to continue in FY 2012 and beyond.
• EPA anticipates that an increasing proportion of the resources allocated to the EDSP will be used for review of EDSP submissions of Tier
1 screening battery data in FY 2012. As a result, a measure based on the number of chemicals for which EDSP decisions have been
completed captures an important shift in resource utilization for the program.

• Given the dynamic nature of chemical markets, some companies may be missed in EPA's analysis or companies may enter new markets
subjecting them to the EDSP requirements for a chemical after the initial test orders for that chemical have been issued. EPA's policies and
procedures allow for "catch up" orders to address these situations. Given that the time horizon for "catch up" orders is 15 years after the
initial test orders are issued for a chemical, for purposes of this measure, a chemical will be counted as completed after initial test orders are
issued.

• With EPA plans to integrate EDSP Tier 1 test orders into the pesticide registration review process and as EPA develops subsequent lists of
chemicals, EPA anticipates that an increasing proportion of the EDSP resources will be used for the issuance of Tier 1 test orders and data
review. Therefore, a measure based on the number of Tier 1 test orders issued captures performance of activities on which the program will
be spending a larger proportion of its future resources.

• In general, it is anticipated that the EDSP decisions will vary from chemical to chemical with respect to complexity and timing.
Therefore, careful analysis will be needed in setting performance targets each year.  It is anticipated that annual performance targets will be
established by considering (to the extent practicable) the number of chemicals for which EDSP Tier 1 test orders have been issued, the
identity of the chemicals, the number of Tier 1 test order recipients, any other available chemical specific information and EPA resources
available to complete data evaluations.  However, several factors remain unpredictable and will impact the schedule for completing EDSP
decisions. These include, for example,  the number of pesticide cancellations and other regulatory actions that may remove a chemical from
commerce and/or discontinue manufacture and import (voluntary and enforced), unforeseen laboratory capacity limits, and unforeseen
technical problems with completing the Tier 1 assays for a particular chemical.  Each of these factors can move the timeline for completing
an EDSP decision for a particular chemical beyond the fiscal year in which the  decision was originally anticipated.

• Annual performance targets for this measure will be subject to obtaining an approved Information Collection Request and the EPA
resources available for issuing EDSP Tier 1 test orders.

• Annual performance targets may be influenced by a number of factors including OCSPP's identification of manufacturers of chemicals
and the corresponding issuance of Information Collection Requests.  Therefore, careful analysis will be needed in setting performance
targets each year.

• The results from this performance measure, together with additional chemical specific information, will help set performance targets for
another EDSP measure:  the number of chemicals for which Endocrine Disrupter Screening Program (EDSP) decisions have been
completed.

• Endocrine Disruptor Screening Program: Second List of Chemicals for Tier 1 Screening [Federal Register Notice: November 17, 2010

(Volume 75, Number 221,  pages 70248-70254)]

• http://www.epa.gov/endo/ (including Highlights box on right side of page)
• http://www.epa.gov/endo/pubs/edspoverview/background.htm
• http://www.epa.gov/endo/pubs/edsp_orders_status.pdf


2.  Data Definition and Source Reporting	
2a. Original Data Source
EPA staff, including scientists and regulatory managers from relevant program offices, are responsible for issuing and documenting the test
orders.
2b. Source Data Collection
Source Data Collection Methods:
Using several databases, EPA initially completed a comprehensive analysis to identify companies that are potential test order recipients
because of their association with specific chemicals. These chemicals may have the potential to interact with the estrogen, androgen, or
thyroid hormone systems.

The policies and procedures regarding issuance of EDSP Tier 1 test orders are described in detail in the Federal Register (74 FR 17560)
and are being adapted to address additional chemicals (including water contaminants). EPA completes a comprehensive analysis using several
databases to identify companies that are potential order recipients for each chemical.
2c. Source Data Reporting
EPA has created and is maintaining an on-line report for tracking the status of chemicals screened in the EDSP (see Highlights box at
http://www.epa.gov/endo).  The report includes for each chemical: the date a test order was issued, to whom the test order was issued, the
due date for completing and submitting the data, the recipient's response to the order, and regulatory status (e.g., pesticide registration
cancelled), as appropriate. In addition, the report will include information on EDSP Tier 1 decisions.	


3.  Information Systems and Data Quality Procedures
3a.  Information Systems	
EPA has created and is maintaining an on-line report for tracking the number of chemicals for which EDSP Tier 1 test orders have been
issued.

EPA's on-line report for tracking the status of chemicals screened in the EDSP includes for each chemical: the date a test order was issued,
to whom the test order was issued, the due date for completing and submitting the data, the recipient's response to the order, regulatory
status (e.g., pesticide registration cancelled), as appropriate, and other information.

Additional information:
Since the data generated for this measure will correspond to the on-line reporting on the status of chemicals in the EDSP, the public and
other interested parties will be able to easily determine the accuracy of the reported results.	
3b. Data Quality Procedures	
The number of chemicals for which Tier 1 test orders have been issued can be checked against order related documentation.

Data on the number of orders issued that are related to this measure will be reviewed for accuracy before being submitted.
3c. Data Oversight
Deputy Director, Office of Science Coordination and Policy	
3d. Calculation Methodology
Unit of analysis: Number of chemicals for which Endocrine Disrupter Screening Program (EDSP) Tier 1 test orders have been issued.


4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting
Planning and Accountability Lead in the Resource Management Staff in the Office of Program Management Operations. Reporting
semiannually: mid-year and end-of-year.
4b. Data Limitations/Qualifications
Issuance of test orders is based largely on EPA actions; thus, minimal error is anticipated with this estimate.
4c. Third-Party Audits

The American Society of Human Genetics, the American Society for Reproductive Medicine,  the Endocrine Society, the Genetics Society
of America, the Society for Developmental Biology, the Society for Pediatric Urology, the Society for the Study of Reproduction, and the
Society for Gynecologic Investigation. Assessing Chemical Risk: Societies Offer Expertise. Science, March 3, 2011. DOI:
10.1126/science.331.6021.1136-a or http://www.sciencemag.org/content/331/6021/1136.1.
 Record Last Updated: 02/13/2012 01:16:59 PM
  Goal 4                                               Objective 1                                          Measure HC1
  Measure Code :  HC1 - Annual number of hazard characterizations completed for
  HPV chemicals
  Office of Chemical  Strategies and Pollution Prevention (OCSPP)
   1. Measure and DQR Metadata
   Goal Number and Title                          4 - Ensuring the Safety of Chemicals and Preventing Pollution
   Objective Number and Title	    1 - Ensure Chemical Safety
   Sub-Objective Number and Title
1 - Protect Human Health from Chemical Risks
   Strategic Target Code and Title
   Managing Office                                 Office of Pollution Prevention and Toxics
   Performance Measure Term Definitions
Hazard characterizations:  "Hazard characterizations" refers to Screening Level Hazard Characterization Reports prepared by EPA staff
based on information submitted by the companies that make the chemicals, as well as on data identified from a targeted search of publicly
available sources of information specifically relevant to characterizing hazards.

Completed:  Screening Level Hazard Characterization Reports are "completed" once senior Agency scientists and OPPT management deem
them suitable for posting on the program's website. In order for reports to be completed, the source Screening Information Data Set data
submissions must be judged by the Agency to be adequate.

HPV chemicals: High Production Volume chemicals produced or imported in the United States in quantities of 1 million pounds or more
per year.

Background:
• EPA's High Production Volume Challenge (HPV Challenge) program has inspired chemical manufacturers and users to deliver health and
environmental effects data on many of the most heavily used chemicals in U.S. commerce to the agency. More information is available at:
http://www.epa.gov/hpv/.
• EPA is investigating the hazard characteristics of heavily used chemicals in conjunction with the Organization for Economic Cooperation
and Development (OECD). The OECD's criteria for including chemicals in its Screening Information Data Sets (SIDS) program are
production in one OECD Member country in quantities above 10,000 metric tons (22 million lbs) per annum or above 1,000 metric tons (2.2
million lbs) in two or more OECD countries; this volume test is sketched after this list. More information is available at
http://www.epa.gov/opptintr/sids/pubs/overview.htm. Screening Level Hazard Characterization Reports are supplemented and aligned twice a
year with the international database of chemicals sponsored through SIDS Initial Assessment Meetings. Hazard characterizations are made
publicly available through OPPT's High Production Volume Information System (HPVIS): http://www.epa.gov/hpvis/
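The OECD volume criteria described above reduce to a simple threshold test. The sketch below is illustrative only; the function name and input structure are assumptions.

    # Illustrative check (Python) of the OECD SIDS inclusion criteria above:
    # production above 10,000 metric tons/year in one OECD member country, or
    # above 1,000 metric tons/year in two or more OECD member countries.
    def sids_eligible(tonnes_by_country):
        """tonnes_by_country: dict of OECD country -> metric tons per annum."""
        if any(t > 10_000 for t in tonnes_by_country.values()):
            return True
        return sum(1 for t in tonnes_by_country.values() if t > 1_000) >= 2

    # Examples: sids_eligible({"US": 12_000}) -> True
    #           sids_eligible({"US": 1_500, "DE": 2_000}) -> True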


2. Data Definition and Source Reporting	
 2a. Original Data Source
Submissions from chemical sponsors, for both U.S. HPVs and international Screening Information Data Sets (SIDs) chemicals.
 2b. Source Data Collection
Tabulation of records or activities:  Screening Level Hazard Characterization Reports are prepared by EPA staff based on submissions
from chemical sponsors and are reviewed by senior scientists and management to determine whether they are complete. Each screening
level hazard characterization document represents a thorough review by qualified EPA personnel of the information provided by the
submitter, as well as other targeted sources of information. For more information about sources utilized, please visit:
http://www.epa.gov/hpvis/hazardinfo.htm.

 This measure analyzes and supplements data received through EPA's High Production Volume (HPV) Challenge, the EPA program that
 has inspired companies to deliver health and environmental effects data on many of the most heavily used chemicals in U.S. commerce to
 the agency. An assessment of adequacy is made for HPV chemicals, defined as approximately 2,450 chemicals (1,400 U.S.-sponsored
 chemicals, 850 internationally sponsored chemicals, and 200 original OECD SIDS Initial Assessment Reports (SIARs)). The measure is a
 count of completed reports from all of these sources, which are then posted on EPA's website: http://www.epa.gov/hpvis/abouthc.htm

 EPA QA requirements/guidance governing collection: OPPT has in place a signed Quality Management Plan (Quality Management
 Plan for the Office of Pollution Prevention and Toxics; Office of Prevention, Pesticides and Toxic Substances, November 2008).
2c. Source Data Reporting
Form/mechanism for receiving data and entering into EPA system: EPA staff complete Screening Level Hazard Characterization
Reports based on submissions from chemical sponsors.

Once a report is completed, as determined by senior scientist and management review, an internal reporting spreadsheet called HPV HC
Tracking Data is updated with the chemical name and date of completion. The HPV tracking system is updated by EPA staff upon posting
of final documents to the EPA web site at the end of each quarter. The number of chemicals reviewed and posted is then recorded in the
internal reporting spreadsheet.

Timing and frequency of reporting: As new HCs are posted at the end of each quarter, the number of chemicals posted is recorded in the
internal tracking spreadsheet.


3. Information Systems and Data  Quality Procedures	
3a. Information Systems
EPA uses a reporting spreadsheet called HPV HC Tracking Data to track the number of completed Screening Level Hazard
Characterization Reports.  There are no transformed data in this spreadsheet as this is a simple tracking measure.
3b. Data Quality Procedures
Not Available
3c. Data Oversight
 Branch Chief, Planning and Assessment Branch
3d. Calculation Methodology
The performance result is simply a count of Screening Level Hazard Characterization Reports completed by EPA either quarterly or over
the fiscal year.
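Because this is a simple tracking measure, the quarterly and fiscal-year counts can be derived directly from report completion dates. The sketch below is a hypothetical reconstruction; the input format is an assumption, not the documented layout of the HPV HC Tracking Data spreadsheet.

    # Minimal sketch (Python): tally completed Screening Level Hazard
    # Characterization Reports by federal fiscal quarter.
    from collections import Counter
    from datetime import date

    def fiscal_quarter(d):
        # The federal fiscal year starts October 1, so Oct-Dec is Q1.
        fy = d.year + 1 if d.month >= 10 else d.year
        q = ((d.month - 10) % 12) // 3 + 1
        return (fy, q)

    def counts_by_quarter(completion_dates):
        return Counter(fiscal_quarter(d) for d in completion_dates)

    # Example: counts_by_quarter([date(2011, 11, 4), date(2012, 2, 17)])
    # -> Counter({(2012, 1): 1, (2012, 2): 1})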
4. Reporting and Oversight	
4a. Oversight and Timing of Results Reporting	
 Planning and Accountability Lead in the Resource Management Staff in the Office of Program Management Operations. Reporting
semiannually: mid-year and end-of-year.

4b. Data Limitations/Qualifications	
Not Available

4c. Third-Party Audits
Recent GAO reviews found that EPA does not routinely assess the risks of all existing chemicals and faces challenges in obtaining the
information necessary to  do so. EPA has taken several steps to respond to these reviews including more aggressive efforts to collect data,
continued efforts to assess data through hazard characterizations, and increased emphasis on risk management activities for chemicals of
concern.

GAO-05-458: Chemical Regulation: Options Exist to Improve EPA's Ability to Assess Health Risks and Manage Its Chemical Review
Program, June 2005.

GAO-06-1032T: Chemical Regulation: Actions Are Needed to Improve the Effectiveness of EPA's Chemical Review Program, August
2006.

GAO-09-271: High-Risk Series: An Update. Transforming EPA's Processes for Assessing and Controlling Toxic Chemicals, January 2009.
 Record Last Updated: 02/13/2012 01:17:01 PM
  Goal 4                                              Objective 1                                         Measure RA1
  Measure Code : RA1 - Percentage of planned research products completed on time
  by the Human Health Risk Assessment research program.
  Office of Research and Development (ORD)
   1. Measure and DQR Metadata
   Goal Number and Title                          4 - Ensuring the Safety of Chemicals and Preventing Pollution
   Objective Number and Title	1 - Ensure Chemical Safety	
   Sub-Objective Number and Title	
   Strategic Target Code and Title	
   Managing Office                                Office of Program Accountability and Resource Management- Planning
   Performance Measure Term Definitions
A research product is "a deliverable that results from a specific research project or task.  Research products may require translation or
synthesis before integration into an output ready for partner use."

 This secondary performance measure tracks the timely completion of research products.

Sustainability Research Strategy, available from: http://epa.gov/sciencematters/april2011/truenorth.htm

http://www.epa.gov/risk_assessment/health-risk.htm


 2. Data Definition and Source Reporting	
 2a. Original Data Source
 EPA and its partners confirm the schedule for completing research outputs and products that are transformed or synthesized into outputs.
 ORD tracks progress toward delivering the outputs; clients are notified of progress.  Scheduled milestones are compared to actual progress
 on a quarterly basis. At the end of the fiscal year, outputs are either classified as "met" or "not met" to determine the overall percentage of
 planned products that have been met by the research program. The actual product completion date is self-reported.
 2b. Source Data Collection

 Each output is assigned to a Lab or Center representative before the start of the fiscal year.  This individual provides quarterly status
 updates via ORD's Resource Management System. Status reports are reviewed by senior management, including the Lab or Center
 Director and National Program Director. Overall status data is generated and reviewed by ORD's Office of Program Accountability and
 Resource Management.
 2c. Source Data Reporting
Quarterly status updates are provided via ORD's Resource Management System.
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
Internal database or internal tracking system, such as ORD's Resource Management System (RMS).
3b. Data Quality Procedures
EPA and its partners confirm the schedule for completing research outputs and products that are transformed or synthesized into outputs.
ORD tracks progress toward delivering the outputs; clients are notified of progress. Scheduled milestones are compared to actual progress
on a quarterly basis. At the end of the fiscal year, outputs are either classified as "met" or "not met" to determine the overall percentage of
planned products that have been met by the program.	
3c. Data Oversight	

The National Program Director oversees the source data reporting, specifically, the process of establishing agreement with program
stakeholders and senior ORD managers on the list and content of the planned products, and subsequent progress, completion, and delivery
of these products.
3d. Calculation Methodology
At the end of the fiscal year, outputs are either classified as "met" or "not met". An overall percentage of planned products met by the
program is reported.	
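
For illustration only, the end-of-year roll-up described above reduces to a simple proportion. The following minimal sketch (Python) assumes a list of per-product "met"/"not met" classifications; the function and variable names are hypothetical and are not part of ORD's Resource Management System.

    # Minimal sketch of the end-of-year calculation described above.
    # The "met"/"not met" values mirror the classifications in the text;
    # everything else is illustrative.
    def percent_products_met(statuses):
        """Return the percentage of planned products classified as 'met'."""
        if not statuses:
            return 0.0
        met = sum(1 for status in statuses if status == "met")
        return 100.0 * met / len(statuses)

    # Example: 18 of 20 planned products completed on time -> 90.0
    print(percent_products_met(["met"] * 18 + ["not met"] * 2))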
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
 The Office of Program Accountability and Resource Management is responsible for reporting the program's progress toward its target of
 completing 100% of planned program products.

4b. Data Limitations/Qualifications
This measure does not directly capture the quality or impact of the research products.
4c. Third-Party Audits

Not applicable
 Record Last Updated: 02/13/2012 01:16:50 PM

-------
  Goal 4                                               Objective 2                                          Measure 297
  Measure Code : 297 - Metric Tons of Carbon Dioxide Equivalent (MTCO2e)
  reduced, conserved, or offset through pollution  prevention.
  Office of Chemical Strategies and Pollution Prevention (OCSPP)
   1. Measure and DQR Metadata
   Goal Number and Title                          4 - Ensuring the Safety of Chemicals and Preventing Pollution
   Objective Number and Title                      2 - Promote Pollution Prevention
   Sub-Objective Number and Title                  1 - Prevent Pollution and Promote Environmental Stewardship
   Strategic Target Code and Title                 2 - By 2015, reduce 9 million MT of carbon dioxide equivalent (MMTCO2Eq.) through pollution prevention
   Managing Office                                  Office of Pollution Prevention and Toxics
   Performance Measure Term Definitions
Carbon Dioxide Equivalent: A measure expressed in metric units that is used to compare the emissions from various greenhouse gases
based upon their global warming potential (GWP). Carbon dioxide equivalents are commonly expressed as "million metric tons of carbon
dioxide equivalents (MMTCO2Eq)." The carbon dioxide equivalent for a gas is derived by multiplying the tons of the gas by the associated
GWP. The use of carbon equivalents (MMTCE) is declining.
MMTCO2Eq = (million metric tons of a gas) * (GWP of the gas)
See greenhouse gas, global warming potential, metric ton.
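
For illustration, the conversion defined above is a single multiplication per gas. The sketch below (Python) is hypothetical; the GWP values shown are commonly cited 100-year figures used here only as assumptions, not values prescribed by this DQR.

    # Illustrative CO2-equivalent conversion using the formula above.
    # GWP values are assumed 100-year figures, for the sketch only.
    GWP = {"CO2": 1, "CH4": 25, "N2O": 298}

    def mmtco2eq(million_metric_tons, gas):
        """MMTCO2Eq = (million metric tons of a gas) * (GWP of the gas)."""
        return million_metric_tons * GWP[gas]

    # Example: 2 million metric tons of methane -> 50 MMTCO2Eq
    print(mmtco2eq(2, "CH4"))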

Offset: Emission savings or storage that can be considered to cancel out emissions that would otherwise have occurred. For example,
electricity produced from burning landfill gas is considered to replace electricity from the grid, leading to a carbon offset because landfill
gas production and combustion results in lower GHG emissions than grid electricity production from fossil fuels.
http://epa.gov/climatechange/wycd/waste/downloads/warm-definitions-and-acronyms.pdf

P2 Programs related to this measure include:

Green Suppliers Network (GSN) and Energy, Economy, and the Environment (E3) - Green Suppliers Network works with large
manufacturers to engage their small and medium-sized suppliers in low-cost technical reviews that focus on process improvement and waste
minimization.

Energy, Economy, and the Environment is a coordinated federal and local technical assistance initiative to help manufacturers adapt and
thrive in a new business era focused on sustainability. The program provides technical assessments of production processes and training to:
maximize energy efficiency, reduce environmental wastes, identify opportunities for reducing carbon emissions, promote sustainable
manufacturing practices and growth, and reduce business costs.

Environmentally Preferable Products (EPP) - EPA's Environmentally Preferable Purchasing (EPP) Program is helping agencies across the
federal government comply with green purchasing requirements, and in doing so is using the federal government's enormous buying power
to stimulate market demand for green products and services. Results for the Environmentally Preferable Purchasing (EPP) program are
derived from the Federal Electronics Challenge (FEC) and the Electronic Product Environmental Assessment Tool (EPEAT).  The Federal
Electronics Challenge (FEC) is a voluntary partnership program that encourages federal facilities and agencies to purchase greener
electronic products, reduce impacts of electronic products during use, and manage obsolete electronics in an environmentally safe way.

Additional information about the P2 programs listed here can be found at: http://www.epa.gov/p2/pubs/partnerships.htm
 2. Data  Definition and Source Reporting	
 2a. Original Data Source
 Green Suppliers Network (GSN) and Energy, Economy, and the Environment (E3): The source of P2-related data is the technical
 assistance provider reviewing the facility.  This professional provides an estimate of the potential reductions and savings achievable at the
 facility being reviewed. This person is usually an environmental expert from the state environmental agency or its designee.

 Environmentally Preferable Products (EPP):  For the Federal Electronics Challenge, the data source is federal facility and agency
 partners. EPA obtains data on annual sales of Electronic Product Environmental Assessment Tool (EPEAT) registered electronics products
 from the Green Electronics Council, which obtains the data from manufacturers of the products.

 2b. Source Data Collection
 Date/time intervals covered by source data:

 Green Suppliers Network (GSN) and Energy, Economy and the Environment (E3): For GSN, performance data are collected by the
 National Institute of Standards and Technology (NIST) Manufacturing Extension Partnership (MEP) on a monthly, quarterly, and annual
 basis.  Currently, E3 results are reported on an annual basis, but the NIST MEP is working with its regional managers to develop a more
 efficient method of providing E3 results.

 Environmentally Preferable Products (EPP):  Data are collected on an annual basis. FY 2011 data will be collected for the Federal
 Electronics Challenge in January 2012 and for Electronic Product Environmental Assessment Tool (EPEAT) registered products in 2012.

 EPA QA requirements/guidance governing collection:

 Green Suppliers Network (GSN) and Energy, Economy, and the Environment (E3): Data are collected and verified under NIST MEP's
 QA/QC plan, which guides the NIST MEP Centers as grantees to the Department of Commerce. Environmental data are collected under
 the QA/QC requirements of the state environmental agency participating in the GSN and E3 programs. States utilize these data for their own
 purposes as well. An E3 database that will track these same metrics is currently being developed. In the interim, the E3 metrics are
 being captured by NIST regional managers in Excel spreadsheets and shared with EPA.

EPP:  Instructions and guidelines are provided to partners submitting applications to the Federal Electronics Challenge on how to report
data. Manufacturers of EPEAT-registered products sign a Memorandum of Understanding in which they warrant the accuracy of the data
they provide.
2c. Source Data Reporting
Form/mechanism for receiving data and entering into EPA system: Green Suppliers Network (GSN) and Energy, Economy, and the
Environment (E3)  : The NIST MEP Center representative enters data into the Customer Relationship Management (CRM) database, which
the NIST MEP program uses to collect NIST and EPA performance metrics for the MEP/GSN/E3 programs. MEP headquarters enters data
into the CRM on economic and environmental outcomes from technical assistance providers conducting facility reviews.

Timing and frequency of reporting: For GSN, results are provided by NIST MEP on a monthly, quarterly, and annual basis. Currently,
E3 results are reported on an annual basis. However, NIST MEP is working with its regional managers to develop a more efficient method
of providing E3 results while the E3 database is being developed.
3.  Information Systems and Data Quality Procedures	
3a. Information Systems
Green Suppliers Network (GSN) and Energy, Economy, and the Environment (E3) :  EPA uses the National Institute of Standards and
Technology's (NIST's) Customer Relationship Management (CRM) database, which the NIST Manufacturing Extension Partnership
(NIST MEP) program uses to collect NIST and EPA performance metrics for the MEP/GSN/E3 programs. MEP headquarters enters data
into the CRM on economic and environmental outcomes from technical assistance providers conducting facility reviews.

Environmentally Preferable Products (EPP):  Results for Environmentally Preferable Purchasing (EPP) are derived from the Federal
Electronics Challenge (FEC) and the Electronic Product Environmental Assessment Tool (EPEAT).  FEC uses the FEC Administrative
Database for storage and retrieval of annual reporting information from FEC government partners.	
3b. Data Quality Procedures
OPPT: All OPPT programs operate under the Information Quality Guidelines as found at
http://www.epa.gov/quality/informationguidelines, as well as under the Pollution Prevention and Toxics Quality Management Plan (QMP)
("Quality Management Plan for the Office of Pollution Prevention and Toxics; Office of Prevention, Pesticides and  Toxic Substances,"
November 2008), and the programs will ensure that those standards and procedures are applied to this effort. The Quality Management Plan
is for internal use only.

The P2 Program established a Standard Operating Procedures (SOP) report to govern its collection, tracking, analysis, and public reporting
of data on environmental and other performance parameters. These SOPs pertain to the type, format, and quality of data to be submitted to
the Agency by partners, contractors, and program beneficiaries for use in reporting P2 Program performance.


EPP: EPA staff review the submitted forms. FEC data undergo thorough internal technical review before they are run through the
Electronics Environmental Benefits Calculator (EEBC).

The assumptions needed for the EEBC to translate environmental attributes and activities into environmental benefits are relatively
extensive and were reviewed when the EEBC underwent the original peer review process, and again when they were updated during the
development of version 2.0 of the EEBC.
3c. Data Oversight
Branch Chief, Planning and Assessment Branch
3d. Calculation Methodology
Unit of analysis: MTCO2e

Green Suppliers Network (GSN) and Energy, Economy, and the Environment (E3)  : The program assumes that many partner facilities will
choose not to submit any actual P2 outcome data to maintain confidentiality and that facility partners will not accept NIST MEP
headquarters sharing any non-aggregated potential or actual P2 data with EPA.

To accommodate facility preferences for confidentiality, the Program uses an implementation-rate methodology to calculate and report
results. Based on actual results reported in the Michigan multiple-facility projects, the Program assumes the following GSN P2 cost-
savings implementation rates, reflecting the assumption that energy-related savings occur at a higher rate and represent a larger share of
total savings: 2010, 30%; 2011, 32%; 2012, 34%; 2013, 36%; 2014, 38%; and 2015, 40%. Also based on the Michigan project, the Program
assumes the following GSN energy-based (MTCO2e) implementation rates (2010, 35%; 2011, 37%; 2012, 39%; 2013, 41%; 2014, 43%;
and 2015, 45%) and the following implementation rates for other environmental projects, taking into account the economy (2010, 15%;
2011, 17%; 2012, 19%; 2013, 21%; 2014, 23%; and 2015, 25%).

The implementation rates for E3 projects are assumed to be higher for energy-based recommendations because of more highly leveraged
resources for implementation and the higher visibility of E3. Implementation rates used for E3 energy-based recommendations (related to
MTCO2e) are as follows: 2010, 50%; 2011, 52%; 2012, 54%; 2013, 56%; 2014, 58%; and 2015, 60%. Implementation rates used for E3
cost savings are as follows: 2010, 41%; 2011, 44%; 2012, 47%; 2013, 49%; 2014, 52%; and 2015, 55%. Implementation rates used for E3
other environmental projects are as follows: 2010, 15%; 2011, 20%; 2012, 25%; 2013, 30%; 2014, 35%; and 2015, 40%.

EPA counts recurring results from GSN and E3 facility implementation of equipment and process changes that are expected to be observed
for multiple years. EPA is using an average lifetime of equipment or process change as a factor to apply to all GSN and E3 results
achieved. Preliminary benchmarking indicates that a six-year period is an appropriate average lifetime for GSN technology and process
changes.  In the future, EPA may be able to access case-specific data efficiently to determine specific depreciation rates for equipment and
process changes installed.
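
For illustration, the methodology described above amounts to multiplying a facility review's potential reduction by the year-specific implementation rate, and by the average equipment lifetime when recurring results are counted. The sketch below (Python) uses the GSN energy-based rates listed in the text; the function and variable names are hypothetical.

    # Sketch of the implementation-rate methodology described above.
    # Rates are the GSN energy-based (MTCO2e) rates listed in the text.
    GSN_ENERGY_RATES = {2010: 0.35, 2011: 0.37, 2012: 0.39,
                        2013: 0.41, 2014: 0.43, 2015: 0.45}
    AVERAGE_LIFETIME_YEARS = 6  # benchmarked average life of equipment/process changes

    def reported_mtco2e(potential_mtco2e, fiscal_year, recurring=False):
        """Scale a review's potential reduction by the assumed implementation rate."""
        annual = potential_mtco2e * GSN_ENERGY_RATES[fiscal_year]
        return annual * AVERAGE_LIFETIME_YEARS if recurring else annual

    # Example: 10,000 MTCO2e of identified potential in FY2011
    print(reported_mtco2e(10_000, 2011))                  # about 3,700 (annual)
    print(reported_mtco2e(10_000, 2011, recurring=True))  # about 22,200 (6-year life)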

Environmentally Preferable Products (EPP): EPP staff run annually reported data from Federal Electronics Challenge partners and
annual sales data of Electronic Product Environmental Assessment Tool (EPEAT) products through an Electronics Environmental Benefits
  Calculator (EEBC) to calculate pounds of hazardous pollution reduced, units of energy conserved, and costs saved (among other benefits).
  The assumptions needed for the EEBC to translate environmental attributes and activities into environmental benefits are relatively
  extensive and are laid out in the EEBC (e.g., the average lifecycle of a computer, the weight of packaging for a computer, etc.).

  Energy savings per dollar invested in FEC are calculated by comparing energy savings data to FEC program resource data that are housed
  in a central OPPT finance database.

  EPP counts benefit estimates that encompass the purchase, use, and disposal of green electronics products over a five-year product
  life-cycle. As additional electronics products are explored, benefits will be counted according to their respective product life-cycles.
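
For illustration, the two calculations described above (counting an annual benefit estimate over the product life-cycle, and energy savings per program dollar) are simple arithmetic. The sketch below (Python) is hypothetical and stands in for the far more detailed assumptions built into the EEBC.

    # Hypothetical sketch; the EEBC performs the actual attribute-to-benefit translation.
    PRODUCT_LIFECYCLE_YEARS = 5  # purchase, use, and disposal window described above

    def lifecycle_benefit(annual_benefit):
        """Count an annual benefit estimate over the five-year product life-cycle."""
        return annual_benefit * PRODUCT_LIFECYCLE_YEARS

    def energy_savings_per_dollar(energy_savings_kwh, fec_program_dollars):
        """Ratio of reported FEC energy savings to FEC program resources."""
        return energy_savings_kwh / fec_program_dollars

    print(lifecycle_benefit(1_000))                     # 5000
    print(energy_savings_per_dollar(500_000, 250_000))  # 2.0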
  4. Reporting and Oversight
  4a. Oversight and Timing of Results Reporting
Oversight rests with the Planning and Accountability Lead on the Resource Management Staff in the Office of Program Management
Operations. Results are reported semiannually, at mid-year and end-of-year.
  4b. Data Limitations/Qualifications
  Green Suppliers Network (GSN) and Energy, Economy, and the Environment (E3)  : To a degree, EPA assumes that partner facilities report
  actual data accurately to NIST Manufacturing Extension Partnership (NIST MEP) headquarters, that MEP and State technical assistance
  providers make accurate estimates of potential P2 results if projects are implemented, and that NIST MEP headquarters accurately
  aggregates the data before sharing them with EPA.

  The program assumes that many partner facilities will choose not to submit actual P2 outcome data to maintain confidentiality and that
  facility partners will not accept NIST MEP headquarters sharing any non-aggregated potential or actual P2 data with EPA.

  Facilities reviewed by NIST MEP and State technical assistance providers are often reluctant to have their individual facility opportunity
  assessments shared with EPA or to share proprietary information on quantitative benefits with NIST or EPA.  MEP programs can also vary
  in the level of detail they report from the facility-level opportunity assessments (potential results) to MEP Headquarters, where data are
  aggregated and then sent to EPA. To address these limitations, EPA has strengthened the Request for Proposals requirements for the
  grantee MEP centers eligible to perform GSN and E3 reviews.

  EPP: FEC has a built-in reliance on partners for data reporting. EPEAT relies on manufacturers of EPEAT-registered products and the
  Green Electronics Council (GEC) for data reporting.
  4c. Third-Party Audits
  EPP: The Electronics Environmental Benefits Calculator (EEBC) underwent internal and external review during its development phases.
  It was also reviewed and beta-tested during development of version 2.0.	


Record Last Updated: 02/13/2012 01:16:59 PM

-------
  Goal 5                                               Objective 1                                           Measure 400
  Measure Code : 400 - Millions of pounds of air pollutants  reduced, treated, or
  eliminated through concluded enforcement actions.
  Office of Enforcement and Compliance Assurance (OECA)
   1. Measure and DQR Metadata
   Goal Number and Title                          5 - Enforcing Environmental Laws
   Objective Number and Title                      1 - Enforce Environmental Laws
   Sub-Objective Number and Title                  2 - Support Taking Action on Climate Change and Improving Air Quality
   Strategic Target Code and Title                 1 - By 2015, reduce, treat, or eliminate 2,400 million estimated cumulative pounds of air pollutants
   Managing Office	Office of Compliance	
   Performance Measure Term Definitions
Air pollutants:
The Clean Air Act lists the pollutants and sources of pollutants that are to be regulated by EPA. Pollutants include hazardous air pollutants,
criteria pollutants, and chemicals that destroy stratospheric ozone.  Sources of pollutants include stationary sources (e.g., chemical plants,
gas stations, and power plants) and mobile sources (e.g., cars, trucks, and planes).

For more information, see: http://www.epa.gov/air/airpollutants.html


Reduced, Treated or Eliminated: Reduced, treated, or eliminated is the quantity of pollutant(s) that will no longer be released to the
environment as a result of a non-complying facility returning to its allowable permit limits through the successful completion of an
enforcement settlement. Facilities may further reduce, treat or eliminate pollutants by carrying out voluntary Supplemental Environmental
Projects.

Concluded enforcement actions: For purposes of this measure, there are two categories of concluded enforcement actions counted.

The first category is administrative enforcement actions, which are undertaken by EPA through authority granted to it under various federal
environmental statutes, such as CERCLA, RCRA, CAA, CWA, TSCA, and others. Administrative enforcement actions can take several
forms, ranging from EPA issuing an administrative order requiring a facility to implement specific corrective measures to EPA filing an
administrative complaint commencing a formal administrative adjudication. An administrative action is concluded when a written
agreement between the defendant/respondent and EPA resolving the complaint is documented, signed by the Regional Administrator or
designee, and filed with the regional hearing clerk.

The second type of enforcement action is a civil judicial action, which is a formal lawsuit, filed in court, against a person who has
either failed to comply with a statutory or regulatory requirement or an administrative order. Attorneys from the U.S. Department of
Justice prosecute civil judicial cases for EPA. A concluded action occurs when a consent decree is signed by all parties to the action,
filed in the appropriate court, and signed by a judge, or when a judge issues a written ruling or decision after a full trial.
 2. Data Definition and Source Reporting
 2a. Original Data Source
 EPA Regional Enforcement Organizations
 EPA Regional Program Organizations
 EPA Headquarters Enforcement Organizations
 Facility Personnel and Facility Contractors
 DOJ
 2b. Source Data Collection
 EPA calculates the estimated pollutant reductions after case settlement or during discussions with the facility personnel over specific
 plans for compliance. The final enforcement documents often spell out the terms and methodologies the facility must follow to mitigate
 and prevent the future release of pollutants.   These documents serve as the starting point for EPA's calculations.

 Example of consent decree document containing pollutant mitigation instructions to the facility:
 http://www.epa.gov/compliance/resources/cases/civil/caa/essroc.html
 2c. Source Data Reporting
 When a formal administrative or judicial enforcement case is "concluded," enforcement staff enter information into ICIS to document the
 environmental benefits achieved by the concluded enforcement case.  Original source documents may include facility permits, legal
 documents such as consent decrees and administrative orders, inspection reports, case engineer reports and facility reports. For civil
 judicial cases, the information is reported when a consent decree or court order, or judgment is entered (not lodged). For administrative
 cases, information is reported when an administrative order or final agreement is signed.

 Environmental benefits should be reported in the year the case is settled, regardless of when the benefits will occur. Reductions are
 calculated after the judicial consent decree is lodged or entered, or when the administrative compliance order is signed by the region
 designee and filed with the regional hearing clerk.
FY2012 CCDS.docx
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
 The ICIS FE&C data system meets Office of Environmental Information (OEI) Lifecycle Management Guidance, which includes data
validation processes, internal screen audit checks and verification, system and user documents, data quality audit reports, third party testing
reports, and detailed report specifications and data calculation methodology. Reference: Quality Assurance and Quality Control procedures:
Data Quality: Life Cycle Management Policy, (EPA CIO2121, April 7, 2006)

The Integrated Compliance Information System (ICIS) is a three-phase, multi-year modernization project that improves the ability of EPA
and the states to ensure compliance with the nation's environmental laws with the collection of comprehensive enforcement and compliance
information. Phase I, implemented in FY02, replaced several legacy systems, and created an integrated system to support federal
enforcement and compliance tracking, targeting and reporting, including GPRA reporting. Phase II, also called Permit Compliance System
(PCS) Modernization, expands ICIS to include the National Pollutant Discharge Elimination System (NPDES) program and enables
improved management of the complete program (e.g., stormwater) as well as replacing the legacy PCS. PCS is currently identified as an
Agency Federal Managers' Financial Integrity Act (FMFIA) weakness, and the modernization of the system is critical to address the
weakness. Phase II was first implemented in FY06 for 21 states and 11 tribes/territories that use ICIS to directly manage their NPDES
programs. In FY08, seven more states moved to ICIS from the legacy PCS and began electronically flowing their Discharge Monitoring
Report (DMR) data from their state systems via the Exchange Network and CDX to ICIS. In FY09, Phase II continued with
implementation of the National Installation of NetDMR allowing NPDES permittees to electronically submit DMR data from permitted
facility systems via the Exchange Network to ICIS and migrated three additional states. In FY11, OECA implemented Full-Batch Release 1
of Phase II, allowing Batch Flows of permits and facility data from states. Full-Batch Release 2, which enables Batch Flows of inspection
data from states, was implemented early in FY12. The final part of Phase II, which will add the remaining NPDES Batch Flows and migrate
all remaining states, is projected to be completed in FY13. Phase III
will modernize the Air Facility System (AFS) into ICIS.  AFS is used by EPA and States to track Clean Air Act enforcement and
compliance activities. Integration of AFS into ICIS will modernize and replace a legacy system that does not meet current business needs.
Implementation of this phase is projected for FY14.

ICIS contains both source data and transformed data.

OECA's Data System Quality Assurance Plan


Data System Quality Assurance Plan (ICIS).doc	
3b. Data Quality Procedures
Annual Data Certification Process - OECA has instituted a semi-annual data certification process for the collection and reporting of
enforcement and compliance information.  The certification process was set up to ensure all reporting entities are aware of the reporting
deadlines, receive the most up-to-date reporting instructions for select measures, follow best data management practices to assure reporting
accuracy, and have access to the recent methodologies for calculating pounds of pollutants reduced. The air pounds of pollutants reduced
measure is covered by the annual data certification process.

As part of the annual data certification process, regions are provided a checklist to assist them in their data quality procedures.
FY11 Data Quality Check List.pdf
Data System Quality Assurance Plan (ICIS).doc

Quality Management Plan - September 2011
 OC QMP Concurrence Signatures.pdf OC QMP 2011 Final.docx
3c. Data Oversight
 Source Data Reporting Oversight
HQ - Director, Enforcement Targeting and Data Division
Region 1 - Division Director, Office of Environmental Stewardship
Region 2 - Director, Office of Enforcement and Compliance Assistance
Region 3 - Director, Office of Enforcement, Compliance and Environmental Justice
Region 4 - Regional Counsel and Director, Office of Environmental Accountability
Region 5 - Director, Office of Enforcement and Compliance Assurance
Region 6 - Compliance Assurance and Enforcement Division Director
Region 7 - Enforcement Coordinator
Region 8 - Director, Policy, Information Management and Environmental Justice
Region 9 - Enforcement Coordinator
Region 10 - Director, Office of Compliance and Enforcement

Information Systems Oversight Personnel
HQ -  ICIS System Administrator
 Region 1 - ICIS Steward and Data Systems Administrator
Region 2 - ICIS System Administrator
Region 3 - ICIS Data Steward and System Administrator
Region 4 - ICIS System Administrator, Regional Compliance and Enforcement Data Steward
Region 5 - ICIS Data Steward and Systems Administrator
Region 6 - ICIS Data Steward
Region 7 - ICIS Data Steward and Systems Administrator
Region 8 - ICIS System Administrator
Region 9 - ICIS System Administrator
Region 10 - ICIS System Administrator and Data Steward

3d. Calculation Methodology
 The Case Conclusion Data Sheet (CCDS) is a manual data collection tool HQ implemented in FY 1996, updated in FY 2012, to collect
information on concluded federal enforcement cases including the case name and identification number, injunctive relief, environmental
benefits (including environmental benefits from Supplemental Environmental Projects [SEPs]), and assessed penalties.  The CCDS data are
entered into the Integrated Compliance Information System (ICIS). OECA uses data obtained from the CCDS via ICIS to assess the
environmental outcomes of its enforcement program.

The CCDS guidance provides detailed calculation methodologies for estimating the environmental benefits on a variety of environmental
statutes including air, water, waste, toxics and pesticides.  Additionally, the CCDS provides specific instruction on how to enter the
environmental benefits information into ICIS.

To view the CCDS guidance in its entirety, go to:
 CCDS.xps
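
For illustration only, the measure-level roll-up amounts to summing the per-case pollutant-reduction estimates recorded via the CCDS in ICIS and reporting the total in millions of pounds. The record fields in the sketch below (Python) are invented for the example and do not reflect the actual ICIS schema.

    # Hypothetical sketch of the measure-level roll-up; fields are illustrative.
    concluded_cases = [
        {"case_id": "CAA-0001", "pounds_reduced": 1_250_000},
        {"case_id": "CAA-0002", "pounds_reduced": 730_000},
    ]

    total_pounds = sum(case["pounds_reduced"] for case in concluded_cases)
    print(f"{total_pounds / 1_000_000:.2f} million pounds reduced")  # 1.98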
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Oversight of Final Reporting:
The Deputy Regional Administrators, the Office of Civil Enforcement Director, and the Monitoring, Assistance and Program Division
Director all must sign the attached certification form.
Data Certification Form.pdf
Timing of Results Reporting: Semiannually
4b. Data Limitations/Qualifications
Pollutant reductions or eliminations reported in ICIS project an estimate of pollutants to be reduced or eliminated if the defendant carries
out the requirements of the settlement. The estimates use information available at the time  a case settles or an order is issued.  In  some
instances, EPA develops and enters this information on pollutant reduction estimates after the settlement or during  continued discussions
over specific plans for compliance. Due to the time required for EPA to negotiate a settlement agreement with a defendant, there may be a
delay in completing the CCDS. Additionally,  because of unknowns at the time of settlement, different levels of technical proficiency,  or
the nature of a case, OECA's expectation is that the overall amount of pollutants reduced or eliminated is prudently underestimated based
on CCDS information.  EPA also bases the pollutant estimates on the expectation that the defendant/respondent implements the negotiated
settlement agreement.
4c. Third-Party Audits
Inspector General Report on Pounds of Pollutants Reduced Estimates:
Projected Lbs of Pollutants Reduced.pdf
 Record Last Updated: 02/13/2012 01:16:52 PM

-------
  Goal 5                                               Objective 1                                          Measure 402
  Measure Code : 402 - Millions of pounds of water pollutants reduced, treated, or
  eliminated through concluded enforcement actions.
  Office of Enforcement and Compliance Assurance (OECA)
   1. Measure and DQR Metadata
   Goal Number and Title                           5 - Enforcing Environmental Laws
   Objective Number and Title                      1 - Enforce Environmental Laws
   Sub-Objective Number and Title                  3 - Support Protecting America's Waters
   Strategic Target Code and Title                 1 - By 2015, reduce, treat, or eliminate 1,600 million estimated cumulative pounds of water pollutants
   Managing Office                                 Office of Compliance
   Performance Measure Term Definitions
Water pollutants:
EPA divides water pollution sources into two categories: point and non-point. Point sources of water pollution are stationary locations
such as sewage treatment plants, factories and ships. Non-point sources are more diffuse and include agricultural runoff, mining activities
and paved roads. Under the Clean Water Act, the National Pollutant Discharge Elimination System (NPDES) permit program controls water
pollution by regulating point sources that discharge pollutants into waters of the United States. EPA works with state and local authorities
to monitor pollution levels in the nation's waters and provide status and trend information on a representative variety of ecosystems.

The Clean Water Act (CWA) establishes the basic structure for regulating discharges of pollutants into the waters of the United States and
regulating quality standards for surface waters. The basis of the CWA was enacted in 1948 as the Federal Water Pollution Control Act, but
the Act was significantly reorganized and expanded in 1972. "Clean Water Act" became the Act's common name with amendments in 1977.

Under the CWA, EPA has implemented pollution control programs such as setting wastewater standards for industry. EPA has also set
water quality standards for all contaminants in surface waters.

The CWA made it unlawful to discharge any pollutant from a point source into navigable waters unless a permit was obtained. EPA's
National Pollutant Discharge Elimination System (NPDES) permit program controls discharges. Point sources are discrete conveyances
such as pipes or man-made ditches. Individual homes that are connected to a municipal system, use a septic system, or do not have a
surface discharge do not need an NPDES permit; however, industrial, municipal, and other facilities must obtain permits if their discharges
go directly to surface waters.

Nonpoint source (NPS) pollution, or polluted runoff, is the major source and cause of water quality impairment for waters on the state
water quality limited segment lists required under CWA 303(d). Polluted runoff occurs when rain, snowmelt, irrigation water, and other
water sources move across and through land, picking up pollutants and carrying them into lakes, rivers, wetlands, coastal waters and
underground sources of drinking water. Taking a watershed approach to environmental issues provides an excellent opportunity for
communities and agencies to work together to achieve water quality improvements.
Reduced, Treated or Eliminated: Reduced, treated, or eliminated is the quantity of pollutant(s) that will no longer be released to the
environment as a result of a non-complying facility returning to its allowable permit limits through the successful completion of an
enforcement settlement. Facilities may further reduce, treat or eliminate pollutants by carrying out voluntary Supplemental Environmental
Projects.

Concluded enforcement actions: For purposes of this measure, there are two categories of concluded enforcement actions counted.

The first category is administrative enforcement actions, which are undertaken by EPA through authority granted to it under various federal
environmental statutes, such as CERCLA, RCRA, CAA, CWA, TSCA, and others. Administrative enforcement actions can take several
forms, ranging from EPA issuing an administrative order requiring a facility to implement specific corrective measures to EPA filing an
administrative complaint commencing a formal administrative adjudication. An administrative action is concluded when a written
agreement between the defendant/respondent and EPA resolving the complaint is documented, signed by the Regional Administrator or
designee, and filed with the regional hearing clerk.

The second type of enforcement action is a civil judicial action, which is a formal lawsuit, filed in court, against a person who has
either failed to comply with a statutory or regulatory requirement or an administrative order. Attorneys from the U.S. Department of
Justice prosecute civil judicial cases for EPA. A concluded action occurs when a consent decree is signed by all parties to the action,
filed in the appropriate court, and signed by a judge, or when a judge issues a written ruling or decision after a full trial.
2.  Data Definition and Source Reporting
2a. Original Data Source	
EPA Regional Enforcement Organizations
EPA Regional Program Organizations
EPA Headquarters Enforcement Organizations
Facility Personnel and Facility Contractors
DOJ	
2b. Source Data Collection
EPA calculates the  estimated pollutant reductions after case settlement or during discussions with the facility personnel over specific plans
for compliance. The final enforcement documents often spell out the terms and methodologies the facility must follow to mitigate and
prevent the future release of pollutants.  These documents serve as the starting point for EPA's calculations.

Example of consent decree document containing pollutant mitigation instructions to the facility:
http://www.epa.gov/compliance/resources/cases/civil/caa/essroc.html
2c. Source Data Reporting	
When a formal administrative or judicial enforcement case is "concluded," enforcement staff enter information into ICIS to document the
environmental benefits achieved by the concluded enforcement case. Original source documents may include facility permits, legal
documents such as consent decrees and administrative orders, inspection reports, case engineer reports and facility reports. For civil
judicial cases, the information is reported when a consent decree or court order, or judgment is entered (not lodged). For administrative
cases, information is reported when an administrative order or final agreement is signed.

Environmental benefits should be reported in the year the case is settled, regardless of when the benefits will occur. Reductions are
calculated after the judicial consent decree is lodged or entered, or when the administrative compliance order is signed by the region
designee and filed with the regional hearing clerk.
FY2012 CCDS.docx
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
 The ICIS FE&C data system meets Office of Environmental Information (OEI) Lifecycle Management Guidance, which includes data
validation processes, internal screen audit checks and verification, system and user documents, data quality audit reports, third party testing
reports, and detailed report specifications and data calculation methodology. Reference: Quality Assurance and Quality Control procedures:
Data Quality: Life Cycle Management Policy, (EPA CIO2121, April 7, 2006)

The Integrated Compliance Information System (ICIS) is a three-phase, multi-year modernization project that improves the ability of EPA
and the states to ensure compliance with the nation's environmental laws with the collection of comprehensive enforcement and compliance
information. Phase I, implemented in FY02, replaced several legacy systems, and created an integrated system to support federal
enforcement and compliance tracking, targeting and reporting, including GPRA reporting. Phase II, also called Permit Compliance System
(PCS) Modernization,  expands ICIS to include the National Pollutant Discharge Elimination System (NPDES) program and enables
improved management of the complete program (e.g., stormwater) as well as replacing the legacy PCS. PCS is currently identified as an
Agency Federal Managers' Financial Integrity Act (FMFIA) weakness, and the modernization of the system is critical to address the
weakness. Phase II was first implemented in FY06 for 21 states and 11 tribes/territories that use ICIS to directly manage their NPDES
programs. In FY08, seven more states moved to ICIS from the legacy PCS and began electronically flowing their Discharge Monitoring
Report (DMR) data from their state systems via the Exchange Network and CDX to ICIS. In FY09, Phase II continued with
implementation of the  National Installation of NetDMR allowing NPDES permittees to electronically submit DMR data from permitted
facility systems via the Exchange Network to ICIS and migrated three additional states. In FY11, OECA implemented Full-Batch Release 1
of Phase II, allowing Batch Flows of permits and facility data from states. Full-Batch Release 2, which enables Batch Flows of inspection
data from states, was implemented early in FY12. The final part of Phase II, which will add the remaining NPDES Batch Flows and migrate
all remaining states, is projected to be completed in FY13. Phase III
will modernize the Air Facility System (AFS) into ICIS. AFS is used by EPA and States to track Clean Air Act enforcement and
compliance activities. Integration of AFS into ICIS will modernize and replace a legacy system that does not meet current business needs.
Implementation of this phase is projected for FY14.

ICIS contains both source data and transformed data.

OECA's Data System Quality Assurance Plan
Data System Quality Assurance Plan (ICIS).doc	
3b. Data Quality Procedures	
Annual Data Certification Process - OECA has instituted a semi-annual data certification process for the collection and reporting of
enforcement and compliance information.  The certification process was set up to ensure all reporting entities are aware of the reporting
deadlines, receive the most up-to-date reporting instructions for select measures, follow best data management practices to assure reporting
accuracy, and have access to the recent methodologies for calculating pounds of pollutants reduced. The water pounds of pollutants reduced
measure is covered by the annual data certification process.

As part of the annual data certification process, regions are provided a checklist to assist them in their data quality procedures.

FY11 Data Quality Check List.pdf

OECA's Quality Management Plan - September 2011
 OC QMP Concurrence Signatures.pdf OC QMP 2011 Final.docx
3c. Data Oversight
Source Data Reporting Oversight
HQ - Director, Enforcement Targeting and Data Division
Region 1 - Division Director, Office of Environmental Stewardship
Region 2 - Director, Office of Enforcement and Compliance Assistance
Region 3 - Director, Office of Enforcement, Compliance and Environmental Justice
Region 4 - Regional Counsel and Director, Office of Environmental Accountability
Region 5 - Director, Office of Enforcement and Compliance Assurance
Region 6 - Compliance Assurance and Enforcement Division Director
Region 7 - Enforcement Coordinator
Region 8 - Assistant Regional Administrator for Enforcement,  Compliance and Environmental Justice
Region 9 - Enforcement Coordinator
Region 10 - Director, Office of Compliance and Enforcement

Information Systems Oversight Personnel
HQ - ICIS System Administrator
Region 1 - ICIS Steward and Data Systems Administrator
Region 2 - ICIS System Administrator
Region 3 - ICIS Data Steward and System Administrator
Region 4 - ICIS System Administrator, Regional Compliance and Enforcement Data Steward
Region 5 - ICIS Data Steward and Systems Administrator
Region 6 - ICIS Data Steward
Region 7 - ICIS Data Steward and Systems Administrator
Region 8 - ICIS System Administrator
Region 9 - ICIS System Administrator
Region 10 - ICIS System Administrator and Data Steward	
3d. Calculation Methodology
 The Case Conclusion Data Sheet (CCDS) is a manual data collection tool HQ implemented in FY 1996, updated in FY 2012, to collect
information on concluded federal enforcement cases including the case name and identification number, injunctive relief, environmental
benefits (including environmental benefits from Supplemental  Environmental Projects [SEPs]), and assessed penalties. The CCDS data are
entered into the Integrated Compliance Information System (ICIS). OECA uses data obtained from the CCDS via ICIS to assess the
environmental outcomes of its enforcement program.

The CCDS guidance provides detailed calculation methodologies for estimating the environmental benefits on a variety of environmental
statutes including air, water, waste, toxics and pesticides.  Additionally, the CCDS provides specific instruction on how to enter the
environmental benefits information into ICIS.

To view the CCDS guidance in its entirety, go to:
 CCDS.xps
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Oversight of Final Reporting:
The Deputy Regional Administrators, the Office of Civil Enforcement Director, and the Monitoring, Assistance and Program Division
Director all must sign the attached certification form.
Data Certification Form.pdf

Timing of Results Reporting: Semiannually
4b. Data Limitations/Qualifications
Pollutant reductions or eliminations reported in ICIS project an estimate of pollutants to be reduced or eliminated if the defendant carries
out the requirements of the settlement. (Information on expected outcomes of state enforcement is not available.)  The estimates use
information available at the time a case settles or an order is issued.  In some instances, EPA develops and enters this information on
pollutant reduction estimates after the settlement or during continued discussions over specific plans for compliance.  Due to the time
required for EPA to negotiate a settlement agreement with a defendant, there may be a delay in completing the CCDS. Additionally,
because of unknowns at the time of settlement, different levels of technical proficiency, or the nature of a case, OECA's expectation is that
the overall amount of pollutants reduced  or  eliminated is prudently underestimated based  on CCDS information.  EPA also bases the
pollutant estimates on the expectation that the  defendant/respondent implements the negotiated settlement agreement.
4c. Third-Party Audits
Inspector General Report on Pounds of Pollutants Reduced:


Projected Lbs of Pollutants Reduced.pdf
 Record Last Updated: 02/13/2012 01:16:55 PM

-------
  Goal 5                                                Objective 1                                            Measure 404
  Measure Code : 404 - Millions of pounds of toxic and pesticide pollutants reduced,
  treated, or eliminated through concluded enforcement actions.
  Office of Enforcement and Compliance Assurance (OECA)
   1.  Measure and DQR Metadata
   Goal Number and Title                          5 - Enforcing Environmental Laws
   Objective Number and Title                      1 - Enforce Environmental Laws
   Sub-Objective Number and Title                  5 - Support Ensuring the Safety of Chemicals and Preventing Pollution
   Strategic Target Code and Title                 1 - By 2015, reduce, treat, or eliminate 19 million estimated cumulative pounds of toxic and pesticide pollutants
   Managing Office                                   Office of Compliance
   Performance Measure Term Definitions
Toxic and pesticide pollutants:
The Toxic Substances Control Act of 1976 provides EPA with authority to require reporting, record-keeping and testing requirements; and
restrictions relating to chemical substances and/or mixtures; and the production, importation, use, and disposal of specific chemicals,
including lead-based paint (LBP), polychlorinated biphenyls (PCBs), and asbestos. Lead-based paint is particularly dangerous to children:
exposure may cause reduced intelligence, learning disabilities, behavior problems and slowed physical development. Because LBP is found
in pre-1978 buildings, it is more common in communities dominated by older housing, which usually are low-income, minority and
environmental justice (EJ)
communities.  Asbestos in  schools, if not properly managed, can expose  children, teachers and other school staff to harm that may not
manifest for years. PCBs bioaccumulate and thus cause a variety  of adverse health effects. Asbestos and PCBs are also generally found in
older buildings.  Additionally, PCBs are generally found in older transformers, capacitors and some hydraulic equipment and more recently
in recycled and  used oil.  Inappropriate abatement and disposal  of asbestos and PCBs can be  dangerous.  For more information on the
Toxics program go to: http://www.epa.gov/compliance/civil/tsca/tscaenfstatreq.html

The Federal Insecticide, Fungicide and Rodenticide Act (FIFRA) provides EPA the authority to regulate pesticides to prevent unreasonable
adverse effects on the environment. The term "unreasonable adverse effects on the environment" means: "(1) any unreasonable risk to man
or the environment, taking into account the economic, social, and environmental costs and benefits of the use of any pesticide, or (2) a
human dietary risk from residues that result from a use of a pesticide in or on any food inconsistent with the standard under section 408 of
the Federal Food, Drug, and Cosmetic Act." The term pesticide includes many kinds of ingredients in products, such as insect repellants,
weed killers, disinfectants, and swimming pool chemicals which are designed to prevent, destroy, repel or reduce pests  of any sort.
Pesticides are found in nearly every home, business, farm, school, hospital and park in the United States. EPA must evaluate pesticides
thoroughly before they can be marketed and used in the United States to ensure that they will meet federal safety standards to protect human
health and the environment. Pesticides that meet the requirements are granted a license or "registration" which permits their distribution,
sale, and use according to specific use directions and requirements identified on the label. For more information on the pesticide program
go to: http://www.epa.gov/compliance/civil/fifra/fifraenfstatreq.html


Reduced, Treated or Eliminated: Reduced, treated, or eliminated is the quantity of pollutant(s) that will no longer be released to the
environment as a result of a non-complying facility returning to its allowable permit limits through the successful completion of an
enforcement settlement.  Facilities may further reduce, treat or eliminate pollutants by carrying out voluntary Supplemental Environmental
Projects.

Concluded enforcement actions: For purposes of this measure, there are two categories of concluded enforcement actions counted.

The first category is administrative enforcement actions, which are undertaken by EPA through authority granted to it under various federal
environmental statutes, such as CERCLA, RCRA, CAA, CWA, TSCA, and others. Administrative enforcement actions can take several
forms, ranging from EPA issuing an administrative order requiring a facility to implement specific corrective measures to EPA filing an
administrative complaint commencing a formal administrative adjudication. An administrative action is concluded when a written
agreement between the defendant/respondent and EPA resolving the complaint is documented, signed by the Regional Administrator or
designee, and filed with the regional hearing clerk.

The second type of enforcement action is a civil judicial action, which is a formal lawsuit, filed in court, against a person who has
either failed to comply with a statutory or regulatory requirement or an administrative order. Attorneys from the U.S. Department of
Justice prosecute civil judicial cases for EPA. A concluded action occurs when a consent decree is signed by all parties to the action,
filed in the appropriate court, and signed by a judge, or when a judge issues a written ruling or decision after a full trial.


 2. Data Definition and Source Reporting	
 2a. Original Data Source	
 EPA Regional Enforcement Organizations
 EPA Regional Program Organizations
 EPA Headquarters Enforcement Organizations
 Facility Personnel and Facility Contractors
 DOJ	
 2b. Source Data Collection
 EPA calculates the estimated pollutant reductions after case settlement or during discussions with the facility personnel over specific plans
 for compliance.  The final enforcement documents often spell out the terms and methodologies the facility must follow to mitigate and
 prevent the future release of pollutants.  These documents serve as the starting point for EPA's calculations.

 Example of consent decree document containing pollutant mitigation instructions to the facility:
 http://www.epa.gov/compliance/resources/cases/civil/caa/essroc.html
2c. Source Data Reporting
When a formal administrative or judicial enforcement case is "concluded," enforcement staff enter information into ICIS to document the
environmental benefits achieved by the concluded enforcement case. Original source documents may include facility permits, legal
documents such as consent decrees and administrative orders, inspection reports, case engineer reports and facility reports.  For civil
judicial cases, the information is reported when a consent decree or court order, or judgment is entered (not lodged). For administrative
cases, information is reported when an administrative order or final agreement is  signed.

Environmental benefits should be reported in the year the case is settled, regardless of when the benefits will occur. Reductions are
calculated after the judicial consent decree is lodged or entered, or when the administrative compliance order is signed by the region
designee and filed with the regional hearing clerk.
FY2012 CCDS.docx
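
As an illustration of the reporting-year rule above, the following sketch (a hypothetical helper, not part of ICIS; the federal fiscal year runs October 1 through September 30) assigns a settled case to a reporting fiscal year:

    from datetime import date

    def reporting_fiscal_year(settled: date) -> int:
        # Per section 2c, benefits are reported in the fiscal year the case is
        # settled, regardless of when the reductions will actually occur.
        return settled.year + 1 if settled.month >= 10 else settled.year

    # Example: a consent decree entered on November 15, 2011 counts in FY 2012.
    assert reporting_fiscal_year(date(2011, 11, 15)) == 2012
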
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
The ICIS FE&C data system meets Office of Environmental Information (OEI) Lifecycle Management Guidance, which includes data
validation processes, internal screen audit checks and verification, system and user documents, data quality audit reports, third-party testing
reports, detailed report specifications, and data calculation methodology. Reference: Quality Assurance and Quality Control procedures:
Data Quality: Life Cycle Management Policy (EPA CIO 2121, April 7, 2006).

The Integrated Compliance Information System (ICIS) is a three phase multi-year modernization project that improves the ability of EPA
and the states to ensure compliance with the nation's environmental laws with the collection of comprehensive enforcement and compliance
information. Phase I, implemented in FY02, replaced several legacy systems, and created an integrated system to support federal
enforcement and compliance tracking, targeting and reporting, including GPRA reporting. Phase II, also called Permit Compliance System
(PCS) Modernization, expands ICIS to include the National Pollutant Discharge Elimination System (NPDES) program and enables
improved management of the complete program (e.g., stormwater) as well as replacing the legacy PCS. PCS is currently identified as an
Agency Federal Managers' Financial Integrity Act (FMFIA) weakness, and the modernization of the system is critical to address the
weakness. Phase II was first  implemented in FY06 for 21 states  and 11 tribes/territories that use ICIS to directly manage their NPDES
programs. In FY08, seven more states moved to ICIS from the legacy PCS and began electronically flowing their Discharge Monitoring
Report (DMR) data from their state systems via the Exchange Network and CDX to ICIS. In FY09, Phase II continued with
implementation of the National Installation of NetDMR allowing NPDES permittees to electronically submit DMR data from permitted
facility systems via the Exchange Network to ICIS and migrated three additional states. In FY11 OECA implemented Full-Batch Release 1
of Phase II, allowing Batch Flows of permits and facility data from states. Full-Batch Release 2, enabling Batch Flows of inspection data
from states, was implemented early in FY12. The final part of Phase II,
which will add the remaining NPDES Batch Flows and migrate all remaining states, is projected to be completed in FY13. Phase III
will modernize the Air Facility System (AFS) into ICIS. AFS is used by EPA and States to track Clean Air Act enforcement and
compliance activities. Integration of AFS into ICIS will modernize and replace a legacy system that does not meet current business needs.
Implementation of this phase is projected for FY14.

ICIS contains both source data and transformed data.

OECA's Data System Quality Assurance Plan
Data System Quality Assurance Plan (ICIS).doc	
3b. Data Quality Procedures
Annual Data Certification Process - OECA has instituted a semi-annual data certification process for the collection and reporting of
enforcement and compliance information.  The certification process was set up to ensure all reporting entities are aware of the reporting
deadlines, receive the most up-to-date reporting instructions for select measures, follow best data management practices to assure reporting
accuracy, and have access to the recent methodologies for calculating pounds of pollutants reduced.  The toxics and pesticides pounds of
pollutants reduced measure is covered by the annual data certification process.

As part of the annual data certification process, regions are provided a checklist to assist them in their data quality procedures.
FY11 Data Quality Check List.pdf

OECA's QMP - September 2011
PC QMP Concurrence Signatures.pdf PC QMP 2011 Final.docx
3c. Data Oversight
Source Data Reporting Oversight:
HQ - Director, Enforcement Targeting and Data Division
Region 1 - Division Director, Office of Environmental Stewardship
Region 2 - Director, Office of Enforcement and Compliance Assistance
Region 3 - Director, Office of Enforcement, Compliance and Environmental Justice
Region 4 - Regional Counsel and Director, Office of Environmental Accountability
Region 5 - Director, Office of Enforcement and Compliance Assurance
Region 6 - Compliance Assurance and Enforcement Division Director
Region 7 - Enforcement Coordinator
Region 8 - Director, Policy, Information Management and Environmental Justice
Region 9 - Enforcement Coordinator
Region 10 - Director, Office of Compliance and Enforcement

Information Systems Oversight Personnel
HQ - ICIS System Administrator
 Region 1 - ICIS Steward and Data Systems Administrator
Region 2 - ICIS System Administrator
Region 3 - ICIS Data Steward and System Administrator
Region 4 - ICIS System Administrator, Regional Compliance and Enforcement Data Steward
Region 5 - ICIS Data Steward and Systems Administrator
Region 6 - ICIS Data Steward
Region 7 - ICIS Data Steward and Systems Administrator
Region 8 - ICIS System Administrator
Region 9 - ICIS System Administrator
Region 10 - ICIS System Administrator and Data Steward	
3d. Calculation Methodology
 The Case Conclusion Data Sheet (CCDS) is a manual data collection tool HQ implemented in FY 1996, updated in FY 2012, to collect
information on concluded federal enforcement cases including the case name and identification number, injunctive relief, environmental
benefits (including environmental benefits from Supplemental Environmental Projects [SEPs]), and assessed penalties. The CCDS data are
entered into the Integrated Compliance Information System (ICIS). OECA uses data obtained from the CCDS via ICIS to assess the
environmental outcomes of its enforcement program.

The CCDS guidance provides detailed calculation methodologies for estimating the environmental benefits on a variety of environmental
statutes including air, water, waste, toxics and pesticides. Additionally, the CCDS provides specific instruction on how to enter the
environmental benefits information into ICIS.

To view the CCDS guidance in its entirety, go to:
 CCDS.xps
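
To make the data flow concrete, the sketch below models a CCDS entry as a simple record and shows the roll-up implied by the pounds-reduced measures; the field and function names are illustrative assumptions, not the actual CCDS or ICIS schema:

    from dataclasses import dataclass, field

    @dataclass
    class CCDSRecord:
        # Fields named in section 3d; the names used here are hypothetical.
        case_name: str
        case_id: str
        injunctive_relief: str
        assessed_penalty_usd: float
        pollutant_reductions_lbs: dict = field(default_factory=dict)  # pollutant -> pounds
        sep_reductions_lbs: dict = field(default_factory=dict)        # benefits from SEPs

    def millions_of_pounds(records) -> float:
        # Roll-up implied by the pounds reduced/treated/eliminated measures.
        total = sum(sum(r.pollutant_reductions_lbs.values()) +
                    sum(r.sep_reductions_lbs.values()) for r in records)
        return total / 1_000_000
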
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Oversight of Final Reporting: The Deputy Regional Administrators, the Office of Civil Enforcement Director, and the Monitoring,
Assistance and Program Division Director all must sign the attached certification form.
Data Certification Form.pdf

Timing of Results Reporting: Semiannually
4b. Data Limitations/Qualifications
Pollutant reductions or eliminations reported in ICIS project an estimate of pollutants to be reduced or eliminated if the defendant carries
out the requirements of the settlement. (Information on  expected outcomes of state enforcement is not available.)  The estimates use
information available at the time a case settles or an order is issued.   In some instances, EPA develops and enters this  information on
pollutant reduction estimates after the  settlement or during continued discussions over specific plans for compliance.  Due to the time
required for EPA to negotiate a settlement agreement with a defendant, there may be a delay in completing the CCDS.  Additionally,
because of unknowns at the time of settlement, different levels of technical proficiency, or the nature of a case, OECA's expectation is that
the overall amount  of  pollutants reduced or eliminated is prudently underestimated based  on CCDS information.  EPA also bases the
pollutant estimates on the expectation that the defendant/respondent implements the negotiated settlement agreement.
4c. Third-Party Audits
Inspector General Report on Pounds of Pollutants Reduced Estimates:
Projected Lbs of Pollutants Reduced.pdf
 Record Last Updated: 02/13/2012 01:16:57 PM

-------
  Goal 5                                               Objective 1                                           Measure 405
  Measure Code : 405 - Millions of pounds of hazardous waste reduced, treated, or
  eliminated through concluded enforcement actions.
  Office of Enforcement and Compliance Assurance (OECA)
   1. Measure and DQR Metadata
   Goal Number and Title                          5 - Enforcing Environmental Laws
   Objective Number and Title                      1 - Enforce Environmental Laws
   Sub-Objective Number and Title                  4 - Support Cleaning Up Communities and Advancing Sustainable Development
   Strategic Target Code and Title                   1 - By 2015, reduce, treat, or eliminate 32,000 million estimated pounds of hazardous waste
   Managing Office	Office of Compliance	
   Performance Measure Term Definitions
Hazardous waste: Hazardous waste is defined as liquid, solid, contained gas, or sludge wastes with properties that are dangerous or
potentially harmful to human health or the environment.

Hazardous wastes are generally regulated by the Resource Conservation and Recovery Act (RCRA) and cleaned up under the RCRA
Corrective Action Program or CERCLA (Comprehensive Environmental Response, Compensation, and Liability Act; also known as
Superfund). RCRA is comprised of three major programs: Subtitle C (the hazardous waste management program), Subtitle D (the solid
waste program), and Subtitle I (the UST program). Under Subtitle C, EPA has developed a comprehensive program to ensure that all
hazardous waste is safely managed from the time it is generated to its final disposition at a Treatment, Storage, or Disposal (TSD) facility.
The objective of the  "cradle-to-grave" management system is to ensure that hazardous waste is handled in a manner that protects human
health and the environment. To this end, there are Subtitle C regulations for the generation, transportation, and treatment, storage, or
disposal of hazardous wastes.
Through the RCRA Corrective Action Program, EPA requires the investigation and cleanup, or in-situ or ex-situ treatment of hazardous
releases at RCRA facilities. The corrective action program is structured around elements common to most cleanups under other EPA
programs: an initial site assessment, characterization of the contamination, and the evaluation and implementation of cleanup alternatives,
both immediate and long-term. Components of a cleanup action can impact all media types, including releases to the air, surface or
groundwater, and cleanup of contaminated soil.

For more information on the different types of hazardous waste go to: http://www.epa.gov/wastes/hazard/wastetypes/index.htm

Reduced, Treated or Eliminated: Reduced, treated, or eliminated is the quantity of pollutant(s) that will no longer be released to the
environment as a result of a non-complying facility returning to its allowable permit limits through the successful completion of an
enforcement settlement. Facilities may further reduce, treat or eliminate pollutants by carrying out voluntary Supplemental Environmental
Projects.

Concluded enforcement actions:  For purposes of this measure, there are two categories of concluded enforcement actions counted.

The first category is administrative enforcement actions, which are undertaken by EPA through authority granted to it under various federal
environmental statutes, such as CERCLA, RCRA, CAA, CWA, TSCA, and others. Administrative enforcement actions can take several
forms, ranging from EPA issuing an administrative order requiring a facility to implement specific corrective measures to EPA filing an
administrative complaint commencing a formal administrative adjudication. An administrative action is concluded when a written
agreement between the defendant/respondent and EPA resolving the complaint is documented, signed by the Regional Administrator or
designee, and is filed with the regional hearing clerk.

The second category is the civil judicial action: a formal lawsuit, filed in court, against a person who has
either failed to comply with a statutory or regulatory requirement or an administrative order. Attorneys from the U.S.
Department of Justice prosecute civil judicial cases for EPA. A concluded action occurs when a consent decree is signed by all parties to the
action, filed in the appropriate court, and signed by a judge, or when a judge issues a written ruling or decision after a full trial.
 2. Data  Definition and Source Reporting	
 2a. Original Data Source
  EPA Regional Enforcement Organizations
 EPA Regional Program Organizations
 EPA Headquarters Enforcement Organizations
 Facility Personnel and Facility Contractors
 DOJ	
 2b. Source Data Collection
 EPA calculates the  estimated pollutant reductions after case settlement or during discussions with the facility personnel over specific plans
 for compliance. The final enforcement documents often spell out the terms and methodologies the facility must follow to mitigate and
 prevent the future release of pollutants. These documents serve as the starting point for EPA's calculations.

 Example of consent decree document containing pollutant mitigation instructions to the facility:
 http://www.epa.gov/compliance/resources/cases/civil/caa/essroc.html
 2c. Source Data Reporting
When a formal administrative or judicial enforcement case is "concluded," enforcement staff enter information into ICIS to document the
environmental benefits achieved by the concluded enforcement case. Original source documents may include facility permits, legal
documents such as consent decrees and administrative orders, inspection reports, case engineer reports and facility reports. For civil
judicial cases, the information is reported when a consent decree or court order, or judgment is entered (not lodged). For administrative
cases, information is reported when an administrative order or final agreement is signed.

Environmental benefits should be reported in the year the case is settled, regardless of when the benefits will occur. Reductions are
calculated after the judicial consent decree is lodged or entered, or when the administrative compliance order is signed by the region
designee and filed with the regional hearing clerk.
FY2012 CCDS.docx
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
The ICIS FE&C data system meets Office of Environmental Information (OEI) Lifecycle Management Guidance, which includes data
validation processes, internal screen audit checks and verification, system and user documents, data quality audit reports, third-party testing
reports, detailed report specifications, and data calculation methodology. Reference: Quality Assurance and Quality Control procedures:
Data Quality: Life Cycle Management Policy (EPA CIO 2121, April 7, 2006).

The Integrated Compliance Information System (ICIS) is a three phase multi-year modernization project that improves the ability of EPA
and the states to ensure compliance with the nation's environmental laws with the collection of comprehensive enforcement and compliance
information. Phase I, implemented in FY02, replaced several legacy systems, and created an integrated system to support federal
enforcement and compliance tracking, targeting and reporting, including GPRA reporting. Phase II, also called Permit Compliance System
(PCS) Modernization, expands ICIS to include the National Pollutant Discharge Elimination System (NPDES) program and enables
improved management of the complete program (e.g., stormwater) as well as replacing the legacy PCS. PCS is currently identified as an
Agency Federal Managers' Financial Integrity Act (FMFIA) weakness, and the modernization of the system is critical to address the
weakness. Phase II was first  implemented in FY06 for 21 states and 11 tribes/territories that use ICIS to directly manage their NPDES
programs. In FY08, seven more states moved to ICIS from the legacy PCS and began electronically flowing their Discharge Monitoring
Report (DMR) data from their state systems via the Exchange Network and CDX to ICIS. In FY09, Phase II continued with
implementation of the National Installation of NetDMR allowing NPDES permittees to electronically submit DMR data from permitted
facility systems via the Exchange Network to ICIS and migrated three additional states. In FY11 OECA implemented Full-Batch Release 1
of Phase II, allowing Batch Flows of permits and facility data from states. Full-Batch Release 2, enabling Batch Flows of inspection data
from states, was implemented early in FY12. The final part of Phase II, which will add the remaining NPDES Batch Flows and migrate all
remaining states, is projected to be completed in FY13. Phase III
will modernize the Air Facility System (AFS) into ICIS. AFS is used by EPA and  States to track Clean Air Act enforcement and
compliance activities. Integration of AFS into ICIS will modernize and replace a legacy system that does not meet current business needs.
Implementation of this phase is projected for FY14.

ICIS contains both source data and transformed data.

OECA's Data System Quality Assurance Plan
Data System Quality Assurance Plan (ICIS).doc
3b. Data Quality Procedures
Annual Data Certification Process - OECA has instituted a semi-annual data certification process for the collection and reporting of
enforcement and compliance information.  The certification process was set up to ensure all reporting entities are aware of the reporting
deadlines, receive the most up-to-date reporting instructions for select measures, follow best data management practices to assure reporting
accuracy, and have access to the recent methodologies for calculating pounds of pollutants reduced.  The hazardous waste pounds of
pollutants reduced measure is covered by the annual data certification process.

As part of the annual data certification process, regions are provided a checklist to assist them in their data quality procedures.
FY11 Data Quality Check List.pdf

OECA's Quality Management Plan - September 2011
PC QMP Concurrence Signatures.pdf PC QMP 2011 Final.docx
3c. Data Oversight
Source Data Reporting Oversight
HQ - Director, Enforcement Targeting and Data Division
Region 1 - Division Director, Office of Environmental Stewardship
Region 2 - Director, Office of Enforcement and Compliance Assistance
Region 3 - Director, Office of Enforcement, Compliance and Environmental Justice
Region 4 - Regional Counsel and Director, Office of Environmental Accountability
Region 5 - Director, Office of Enforcement and Compliance Assurance
Region 6 - Compliance Assurance and Enforcement Division Director
Region 7 - Enforcement Coordinator
Region 8 - Assistant Regional Administrator for Enforcement,  Compliance and Environmental Justice

Region 9 - Enforcement Coordinator
Region 10 - Director, Office of Compliance and Enforcement

Information Systems Oversight Personnel
HQ - ICIS System Administrator
 Region 1 - ICIS Steward and Data Systems Administrator
Region 2 - ICIS System Administrator
Region 3 - ICIS Data Steward and System Administrator
Region 4 - ICIS System Administrator, Regional Compliance and Enforcement Data Steward
Region 5 - ICIS Data Steward and Systems Administrator
Region 6 - ICIS Data Steward
Region 7 - ICIS Data Steward and Systems Administrator
Region 8 - ICIS System Administrator
Region 9 - ICIS System Administrator
Region 10 - ICIS System Administrator and Data Steward	
3d. Calculation Methodology
 The Case Conclusion Data Sheet (CCDS) is a manual data collection tool HQ implemented in FY 1996, updated in FY 2012, to collect
information on concluded federal enforcement cases including the case name and identification number, injunctive relief, environmental
benefits (including environmental benefits from Supplemental Environmental Projects [SEPs]), and assessed penalties. The CCDS data are
entered into the Integrated Compliance Information System (ICIS). OECA uses data obtained from the CCDS via ICIS to assess the
environmental outcomes of its enforcement program.

The CCDS guidance provides detailed calculation methodologies for estimating the environmental benefits on a variety of environmental
statutes including air, water, waste, toxics and pesticides.  Additionally, the CCDS provides specific instruction on how to enter the
environmental benefits information into ICIS.

To view the CCDS guidance in its entirety, go to:
 CCDS.xps
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Oversight of Final Reporting: The Deputy Regional Administrators, the Office of Civil Enforcement Director, and the Monitoring,
Assistance and Program Division Director all must sign the attached certification form.
Data Certification Form.pdf

Timing of Results Reporting: Semiannually
4b. Data Limitations/Qualifications
Pollutant reductions or eliminations reported in ICIS project an estimate of pollutants to be reduced or eliminated if the defendant carries
out the requirements  of the settlement. (Information on expected outcomes  of state  enforcement is not available.)   The estimates use
information available at the time a case settles or an order is issued.  In some instances, EPA develops and enters  this information on
pollutant reduction estimates after the settlement or during continued discussions over specific plans for compliance. Due to the time
required for EPA to negotiate a settlement agreement with a defendant, there may be a delay in completing the CCDS. Additionally,
because of unknowns at the time of settlement, different levels of technical proficiency, or the nature of a case, OECA's expectation is that
the overall amount of pollutants reduced  or  eliminated is  prudently underestimated based on CCDS information.  EPA also bases the
pollutant estimates on the expectation that the defendant/respondent implements the negotiated settlement agreement.
4c. Third-Party Audits
Inspector General Report on Pounds of Pollutants Reduced estimates:
Projected Lbs of Pollutants Reduced.pdf
 Record Last Updated: 02/13/2012 01:16:59 PM

-------
  Goal 5                                               Objective 1                                           Measure 410
  Measure Code :  410 - Number of civil judicial and  administrative enforcement cases
  initiated.
  Office of Enforcement and Compliance Assurance  (OECA)
   1. Measure and DQR Metadata
   Goal Number and Title                          5 - Enforcing Environmental Laws
   Objective Number and Title                      1 - Enforce Environmental Laws
   Sub-Objective Number and Title                  1 - Maintain Enforcement Presence
   Strategic Target Code and Title                    2 - By 2015, initiate 19,500 civil judicial and administrative enforcement cases
   Managing Office	Office of Compliance	
   Performance Measure Term Definitions
Civil Judicial Enforcement Cases: A civil judicial enforcement case is a formal lawsuit, filed in court, against a person who has either
failed to comply with a statutory or regulatory requirement or an administrative order, or who has contributed to a release. Civil
judicial actions are often employed in situations that present repeated or significant violations or where there are serious environmental
concerns. Attorneys from the U.S. Department of Justice prosecute civil judicial enforcement cases for the Agency.

Civil Administrative Enforcement Cases: A civil administrative enforcement case is an enforcement action taken by EPA under its own
authority. Administrative enforcement cases can take several forms, ranging from EPA issuing an administrative order requiring a facility to
implement specific corrective measures to EPA filing an administrative complaint commencing a formal administrative adjudication.
Administrative actions tend to be resolved quickly and can often be quite effective in bringing the facility into compliance with the
regulations or in remedying a potential threat to human health or the environment.

Initiated: A civil judicial enforcement case is considered initiated when it has been referred to DOJ. A referral is a formal written request
to another agency or unit of government to proceed with judicial enforcement relating to the violation(s) in question.

Civil administrative enforcement cases are considered initiated when an administrative order or an administrative penalty order on consent
has been issued by a Regional Administrator or designee.
 2. Data Definition and Source Reporting
 2a. Original Data Source
 EPA attorneys
EPA regional hearing clerks
DOJ attorneys
Federal and state courts
2b. Source Data Collection
The source data for this measure is found on initiated enforcement documents.  For example, the attached initiated administrative order was
issued by the Region 4 Assistant Administrator. An enforcement record is created in ICIS with the regional administrator's signature date,
which indicates the case has been initiated.

Example of an initiated case document:
Admin Order.pdf
2c. Source Data Reporting
Referral Letters
Administrative Penalty Orders
Administrative Compliance Orders
Unilateral Administrative Orders
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
The ICIS FE&C data system meets Office of Environmental Information (OEI) Lifecycle Management Guidance, which includes data
validation processes, internal screen audit checks and verification, system and user documents, data quality audit reports, third-party testing
reports, detailed report specifications, and data calculation methodology. Reference: Quality Assurance and Quality Control procedures:
Data Quality: Life Cycle Management Policy (EPA CIO 2121, April 7, 2006).

The Integrated Compliance Information System (ICIS) is a three phase multi-year modernization project that improves the ability of EPA
and the states to ensure compliance with the nation's environmental laws with the collection of comprehensive enforcement and compliance
information. Phase I, implemented in FY02, replaced several legacy systems, and created an integrated system to support federal
enforcement and compliance tracking, targeting and reporting, including GPRA reporting. Phase II, also called Permit Compliance System
(PCS) Modernization, expands ICIS to include the National Pollutant Discharge Elimination System (NPDES) program  and enables
improved management of the complete program (e.g., stormwater) as well as replacing the legacy PCS. PCS is currently identified as an
Agency Federal Managers' Financial Integrity Act (FMFIA) weakness, and the modernization of the system is critical to address the
weakness. Phase II was first implemented in FY06 for 21 states and 11 tribes/territories that use ICIS to directly manage their NPDES
programs. In FY08, seven more states moved to ICIS from the legacy PCS and began electronically flowing their Discharge Monitoring
Report (DMR) data from their state systems via the Exchange Network and CDX to ICIS. In FY09, Phase II continued with
implementation of the National Installation of NetDMR allowing NPDES permittees to electronically submit DMR data from permitted
facility systems via the Exchange Network to ICIS and migrated three additional states. In FY11 OECA implemented Full-Batch Release 1
of Phase II, allowing Batch Flows of permits and facility data from states. Full-Batch Release 2, enabling Batch Flows of inspection data
from states, was implemented early in FY12. The final part of Phase II, which will add the remaining NPDES Batch Flows and migrate all
remaining states, is projected to be completed in FY13. Phase III
will modernize the Air Facility System (AFS) into ICIS. AFS is used by EPA and States to track Clean Air Act enforcement and
compliance activities. Integration of AFS into ICIS will modernize and replace a legacy system that does not meet current business needs.
Implementation of this phase is projected for FY14.

ICIS contains both source data and transformed data.

OECA's Data System Quality Assurance Plan
Data System Quality Assurance Plan (ICIS).doc	
3b. Data Quality Procedures
Annual Data Certification Process - OECA has instituted a semi-annual data certification process for the collection and reporting of
enforcement and compliance information. The certification process was set up to ensure all reporting entities are aware of the reporting
deadlines, receive the most up-to-date reporting instructions for select measures, follow best data management practices to assure reporting
accuracy, and have access to the recent methodologies for calculating pounds of pollutants reduced. The cases initiated measure is covered
by the annual data certification process.

As part of the annual data certification process, regions are provided a checklist to assist them in their data quality procedures.
FY11 Data Quality Check List.pdf
OECA's Quality Management Plan - September 2011
PC QMP Concurrence Signatures.pdf PC QMP 2011 Final.docx
3c. Data Oversight
Source Data Reporting Oversight
HQ - Director, Enforcement Targeting and Data Division
Region 1 - Division Director, Office of Environmental Stewardship
Region 2 - Director, Office of Enforcement and Compliance Assistance
Region 3 - Director, Office of Enforcement, Compliance and Environmental Justice
Region 4 - Regional Counsel and Director, Office of Environmental Accountability
Region 5 - Director, Office of Enforcement and Compliance Assurance
Region 6 - Compliance Assurance and Enforcement Division Director
Region 7 - Enforcement Coordinator
Region 8 - Director, Policy, Information Management and Environmental Justice
Region 9 - Enforcement Coordinator
Region 10 - Director, Office of Compliance and Enforcement

Information Systems Oversight Personnel
HQ -  ICIS System Administrator
 Region 1 - ICIS Steward and Data Systems Administrator
Region 2 - ICIS System Administrator
Region 3 - ICIS Data Steward and System Administrator
Region 4 - ICIS System Administrator, Regional  Compliance and Enforcement Data Steward
Region 5 - ICIS Data Steward and Systems Administrator
Region 6 - ICIS Data Steward
Region 7 - ICIS Data Steward and Systems Administrator
Region 8 - ICIS System Administrator
Region 9 - ICIS System Administrator
Region 10 - ICIS System Administrator and Data Steward	
3d. Calculation Methodology	
A case is counted as initiated when one of the following occurs:

Civil judicial enforcement cases are considered initiated when a referral has been made to DOJ.

Civil administrative enforcement cases are considered initiated when an administrative order or an administrative penalty order on consent
has been issued by a Regional Administrator or designee.
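
A minimal sketch of this counting rule, using illustrative event names rather than actual ICIS data elements:

    def is_initiated(case_type: str, event: str) -> bool:
        # Judicial cases count at referral to DOJ; administrative cases count
        # when an administrative order or penalty order on consent is issued.
        if case_type == "civil_judicial":
            return event == "referral_to_doj"
        if case_type == "civil_administrative":
            return event in {"administrative_order_issued",
                             "administrative_penalty_order_on_consent_issued"}
        return False
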
4. Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The Deputy Regional Administrators, the Office of Civil Enforcement Director, and the Monitoring, Assistance and Program Division
Director all must sign the attached certification form.
Data Certification Form.pdf
4b. Data Limitations/Qualifications
The potential always exists that there are facilities, not yet identified as part of the regulated universe, that are subject to an EPA
enforcement action.
4c. Third-Party Audits
None to date.
 Record Last Updated: 02/13/2012 01:17:00 PM

-------
  Goal 5                                               Objective 1                                           Measure 411
  Measure Code : 411 - Number of civil judicial and administrative enforcement cases
  concluded.
  Office of Enforcement  and Compliance Assurance (OECA)
   1. Measure and  DQR Metadata
   Goal Number and Title                           5 - Enforcing Environmental Laws
   Objective Number and Title                       1 - Enforce Environmental Laws
   Sub-Objective Number and Title                   1 - Maintain Enforcement Presence
   Strategic Target Code and Title                    3 - By 2015, conclude 19,000 civil judicial and administrative enforcement cases
   Managing Office                                  Office of Compliance
   Performance Measure Term Definitions
Civil Judicial Enforcement Cases: A civil judicial enforcement case is a formal lawsuit, filed in court, against a person who has either
failed to comply with a statutory or regulatory requirement or an administrative order, or who has contributed to a release. Civil
judicial actions are often employed in situations that present repeated or significant violations or where there are serious environmental
concerns. Attorneys from the U.S. Department of Justice prosecute civil judicial enforcement cases for the Agency.

Civil Administrative Enforcement Cases: A civil administrative enforcement case is an enforcement action taken by EPA under its own
authority. Administrative enforcement cases can take several forms, ranging from EPA issuing an administrative order requiring a facility to
implement specific corrective measures to EPA filing an administrative complaint commencing a formal administrative adjudication.
Administrative actions tend to be resolved quickly and can often be quite effective in bringing the facility into compliance with the
regulations or in remedying a potential threat to human health or the environment.

Concluded: For purposes of this measure, there are two types of concluded enforcement actions counted.

The first category is administrative enforcement actions, which are undertaken by EPA through authority granted to it under various federal
environmental statutes, such as CERCLA, RCRA, CAA, CWA, TSCA, and others. An administrative action is concluded when a written
agreement between the defendant/respondent and EPA resolving the complaint is documented in a Consent Agreement/Final Order
(CA/FO), signed by the Regional Administrator or designee, and filed with the regional hearing clerk.

The second category is the civil judicial action. Attorneys from the U.S. Department of
Justice prosecute civil judicial cases for EPA. A concluded action occurs when a consent decree is signed by all parties to the action, filed in
the appropriate court, and signed by a judge, or when a judge issues a written ruling or decision after a full trial.
2. Data  Definition and Source Reporting
2a. Original Data Source
 EPA attorneys
EPA regional hearing clerks
DOJ attorneys
Federal and state courts
2b. Source Data Collection
The source data for this measure is found on completed enforcement documents. For example, the attached final consent agreement and
final order (CAFO) contains the final date stamp affixed by the regional hearing clerk. An enforcement record is created in ICIS with the
CAFO's final date indicating the case has been concluded.

Example of a concluded enforcement case document:
CAFO.pdf
2c. Source Data Reporting
 Administrative Penalty Orders
Administrative Penalty Orders on Consent
Consent Decrees
Notice of Determination
Unilateral Administrative Orders
3. Information Systems and  Data Quality Procedures
3a.  Information Systems
The ICIS FE&C data system meets Office of Environmental Information (OEI) Lifecycle Management Guidance, which includes data
validation processes, internal screen audit checks and verification, system and user documents, data quality audit reports, third-party testing
reports, detailed report specifications, and data calculation methodology. Reference: Quality Assurance and Quality Control procedures:
Data Quality: Life Cycle Management Policy (EPA CIO 2121, April 7, 2006).

The Integrated Compliance Information System (ICIS) is a three phase  multi-year modernization project that improves the ability of EPA
and the states to ensure compliance with the nation's environmental  laws with the collection of comprehensive enforcement and compliance
information. Phase I, implemented in FY02, replaced several legacy systems, and created an integrated system to support federal
enforcement and compliance tracking, targeting and reporting, including GPRA reporting. Phase II, also called Permit Compliance System
(PCS) Modernization, expands ICIS to include the National Pollutant Discharge Elimination System (NPDES) program and enables
improved management of the complete program (e.g., stormwater) as well as replacing the legacy PCS. PCS is currently identified as an
Agency Federal Managers' Financial Integrity Act (FMFIA) weakness, and the modernization of the system is critical to address the
weakness. Phase II was first implemented in FY06 for 21 states and 11 tribes/territories that use ICIS to directly manage their NPDES
programs. In FY08, seven more states moved to ICIS from the legacy PCS and began electronically flowing their Discharge Monitoring
Report (DMR) data from their state systems via the Exchange Network and CDX to ICIS. In FY09, Phase II continued with
implementation of the National Installation of NetDMR allowing NPDES permittees to electronically submit DMR data from permitted
facility systems via the Exchange Network to ICIS and migrated three additional states. In FY11 OECA implemented Full-Batch Release 1
of Phase II, allowing Batch Flows of permits and facility data from states. Full-Batch Release 2, enabling Batch Flows of inspection data
from states, was implemented early in FY12. The final part of Phase II, which will add the remaining NPDES Batch Flows and migrate all
remaining states, is projected to be completed in FY13. Phase III
will modernize the Air Facility System (AFS)  into ICIS. AFS is used by EPA and States to track Clean Air Act enforcement and
compliance activities. Integration of AFS into  ICIS will modernize and replace a legacy system that does not meet  current business needs.
Implementation of this phase is projected for FY14.

ICIS contains both source data and transformed data.

OECA's Data System Quality Assurance Plan
Data System Quality Assurance Plan (ICIS).doc
3b. Data Quality Procedures
Annual Data Certification Process - OECA has instituted a semi-annual data certification process for the collection and reporting of
enforcement and compliance information. The certification process was set up to ensure all reporting entities are aware of the reporting
deadlines, receive the most up-to-date reporting instructions for select measures, follow best data management practices to assure reporting
accuracy, and have access to the recent methodologies for calculating pounds of pollutants reduced. The cases concluded measure is
covered by the annual data certification process.

As part of the annual data certification process, regions are provided a checklist to assist them in their data quality procedures.
FY11 Data Quality Check List.pdf

OECA's Quality Management Plan - September 2011
 OC QMP Concurrence Signatures.pdf OC QMP 2011 Final.docx
3c. Data Oversight
Source Data Reporting Oversight:
HQ - Director, Enforcement Targeting and Data Division
Region 1 - Division Director, Office of Environmental Stewardship
Region 2 - Director, Office of Enforcement and Compliance Assistance
Region 3 - Director, Office of Enforcement, Compliance and Environmental Justice
Region 4 - Regional Counsel and Director, Office of Environmental Accountability
Region 5 - Director, Office of Enforcement and Compliance Assurance
Region 6 - Compliance Assurance and Enforcement Division Director
Region 7 - Enforcement Coordinator
Region 8 - Director, Policy, Information Management and Environmental Justice
Region 9 - Enforcement Coordinator
Region 10 - Director, Office of Compliance and Enforcement

Information Systems Oversight Personnel
HQ -  ICIS System Administrator
Region 1 - ICIS Steward and Data Systems Administrator
Region 2 - ICIS System Administrator
Region 3 - ICIS Data Steward and System Administrator
Region 4 - ICIS System Administrator, Regional Compliance and Enforcement Data Steward
Region 5 - ICIS Data Steward and Systems Administrator
Region 6 - ICIS Data Steward
Region 7 - ICIS Data Steward and Systems Administrator
Region 8 - ICIS System Administrator
Region 9 - ICIS System Administrator
Region 10 - ICIS System Administrator and Data Steward	
3d. Calculation Methodology	
A case is counted as concluded when one of the following occurs:

An administrative action is concluded when a written agreement between the defendant/respondent and EPA resolving the complaint is
documented in a Consent Agreement/Final Order (CA/FO), signed by the Regional Administrator or designee, and filed with the
regional hearing clerk.

A civil judicial action is concluded when a consent decree is signed by all parties to the action, filed in the appropriate court, and signed
by a judge, or when a judge issues a written ruling or decision after a full trial.
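
The tally below sketches how these rules translate into a count of concluded cases; the event names are illustrative assumptions, not ICIS data elements:

    from collections import Counter

    # The event that concludes each case type, per the rules above.
    CONCLUDING_EVENT = {
        "civil_administrative": "ca_fo_filed_with_hearing_clerk",
        "civil_judicial": "consent_decree_entered_or_ruling_after_trial",
    }

    def count_concluded(cases) -> Counter:
        # `cases` is an iterable of (case_type, event) pairs.
        tally = Counter()
        for case_type, event in cases:
            if CONCLUDING_EVENT.get(case_type) == event:
                tally[case_type] += 1
        return tally
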

4.  Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The Deputy Regional Administrators, the Office of Civil Enforcement Director, and the Monitoring, Assistance and Program Division
Director all must sign the attached certification form.
Data Certification Form.pdf
4b. Data Limitations/Qualifications
The potential always exists that there are facilities, not yet identified as part of the regulated universe, subject to an EPA enforcement
action.
4c. Third-Party Audits
None to date.
 Record Last Updated: 02/13/2012 01:17:00 PM

-------
  Goal 5                                               Objective 1                                           Measure 418
  Measure Code : 418 - Percentage of criminal cases having the most significant
  health, environmental, and deterrence impacts.
  Office of Enforcement  and  Compliance Assurance (OECA)
   1. Measure and  DQR Metadata
   Goal Number and Title                           5 - Enforcing Environmental Laws
   Objective Number and Title                       1 - Enforce Environmental Laws
   Sub-Objective Number and Title                   1 - Maintain Enforcement Presence
   Strategic Target Code and Title                    5 - Each year through 2015, support cleanups and save federal dollars for sites
   Managing Office                                  Office of Criminal Enforcement
   Performance Measure Term Definitions
Criminal Case Docket: A criminal case exists when EPA's criminal enforcement program, specifically special agents in the Criminal
Investigation Division (CID), investigates allegations of criminal violations of environmental law. EPA's active ("open") criminal case
docket consists of cases in all stages of the legal process, from initial investigations to charged cases to convicted cases that are awaiting
sentencing or are on appeal.

Most Significant Health, Environmental, and Deterrence Impacts: The most significant cases are defined by categories of health
effects (e.g., death, serious injury, or exposure), pollutant release and discharge characteristics (e.g., documented exposure, need for
remediation), and defendant profiles (e.g., size of business, compliance history). The cases with the most significant health,
environmental, and deterrence impacts fall into Tier 1 and Tier 2 of the four possible tiers, as calculated by the tiering methodology
(cf. section 3d). The tier designation is used throughout the investigative process, including case selection and prosecution.

For more information about EPA's Criminal Enforcement Program, visit http://www.epa.gov/compliance/criminal/.


 2. Data Definition and Source Reporting	
 2a. Original Data Source
All data used to calculate and classify the "most significant cases" result from evidence collected during the investigative process. The
Criminal Investigation Division (CID) special agent assigned to the case creates an Investigative Activity Report (IAR; cf. 419, 420, 421).
The IAR is the primary means used to document all investigative activities, operational activities, judicial activities, or responses to
investigative tasking or leads. Investigative activities include interviews, surveillance, electronic monitoring, arrests, searches, evidence
handling and disposition, and document reviews. Operational activities include undercover reports and consensual monitoring. Judicial
activities include legal documents such as indictments, criminal informations, criminal complaints, guilty pleas, trials, convictions, and
sentencing hearings and results. Investigative tasking relates to collateral requests from CID headquarters and other offices, as well as
memorializing activity conducted in furtherance of lead inquiries.	
2b. Source Data Collection
Source Data Collection Methods:
Tabulation of records or activities. Information used for the case tiering methodology (cf section 3d) comes from the evidence collected
during the course of the  investigation. Forensic evidence gathering (e.g., environmental sampling and analysis) is conducted by the
National Enforcement Investigations Center (NEIC) or other EPA laboratories or programs in conformity with their established protocols.

The data for case tiering is compiled through the IARs and legal documents, which are collected and entered into the Criminal Case
Reporting System (CCRS, cf section 3a). OCEFT collects data on a variety of case attributes to describe the range, complexity, and quality
of the national docket. Data for selected attributes are being used to categorize the cases into four tiers based on the severity of the crime
associated with the alleged violation.

Date/Time Intervals Covered by Source Data:
Ongoing.

EPA QA requirements/guidance governing collection:
All criminal enforcement special agents receive training on the accurate completion of IAR reports and the entry of criminal case data into
the CCRS.

Geographical Extent of Source Data:
National.
2c. Source Data Reporting
Form/mechanism for receiving data and entering into  EPA system:
After a criminal case is opened, all major data and information is entered into CCRS and is tracked through all subsequent stages of the
criminal enforcement process. All case information and data that will be used for the case tiering methodology is entered into CCRS,
including information about the pollutants involved and the impact on the public and the environment that  result from forensic sampling
and analysis undertaken  as a routine part of the investigation of the alleged violations.

Timing and frequency  of reporting: The status of the case is updated as the legal process proceeds.
3.  Information Systems and Data Quality Procedures
3a.  Information Systems
CCRS stores criminal enforcement data in an enforcement sensitive database which contains historical data on all criminal enforcement
prosecutions as well as information about the pollutants involved and the impact on the public and the environment. CCRS contains a drop
down menu for entering all data used to assign a case to a specific tier. When all required fields are populated, the system automatically
determines the tier for the case. Designating a tier is mandatory for all open criminal cases.

CCRS is an internal EPA database; all public legal documents relating to prosecuted criminal cases (e.g., the indictments, guilty pleas, trial
verdicts, and judges' sentencing decisions) are publicly available through Public Access to Court Electronic Records (PACER), an
electronic public access service that allows users to obtain case and docket information from federal appellate, district, and bankruptcy
courts (http://www.pacer.gov/).


3b. Data Quality Procedures
Environmental and forensic data used to conduct case tiering are supplied by EPA's National Enforcement Investigations Center (NEIC),
national databases, and other EPA programs. These data have undergone QA/QC following the protocols established by those programs. It
should be noted that the data will often serve as evidence in criminal judicial enforcement proceedings, so their quality and sufficiency are
carefully reviewed.
3c. Data Oversight
Initial oversight at the field level is the responsibility of the Special Agent-in-Charge and Assistant Special Agent-in-Charge of the criminal
office managing the case. That information is further reviewed by OCEFT HQ through semi-annual case management reviews conducted
by the Assistant Director for Investigations, CID.

3d. Calculation Methodology
The methodology for the measure "percent of criminal cases with the most significant health, environmental and deterrence impact" used
the FY 2010 criminal enforcement docket to develop the baseline and targets for FY 2011-15. The cases are analyzed and scored on a
variety of case attributes describing the range, complexity, and quality of the criminal enforcement docket. Cases are then entered into one
of four categories ("tiers") depending upon factors such as the human health (e.g., death, serious injury) and environmental impacts, the
nature of the pollutant and its release into the environment, and violator characteristics (e.g., repeat violator, size and location(s) of the
regulated entity).

Many of the data elements used in the tier method are directly linked to the Federal Sentencing Guidelines:

http://www.ussc.gov/guidelines/2010_guidelines/index.cfm

See the two attachments for graphic representations of the criminal case tier methodology and the explanations of the categories. They
indicate the process used to assign a case to one of the four tiers.
tiering.pptx     tieirngmethodology2012.ppt
 Tiering is based upon these decision rules:

Tier 1 (first, or highest): any case involving death or actual serious injury; otherwise, a case that possesses specified attributes in at least
three of the four established categories.

Tier 2 (second): specified attributes in two of the four categories.

Tier 3 (third): specified attributes in one of the four categories.

Tier 4 (fourth): specified attributes in none of the categories.

The "most significant cases" measure, which also serves as the key performance indicator for the criminal enforcement program, is
calculated by adding the number of Tier 1 and Tier 2 cases and dividing by the total number of open cases in the criminal case docket.
The measure reflects only the percentage of cases in the upper two tiers.
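
To make the decision rules and the percentage calculation concrete, the following short Python sketch restates them. This is a minimal
sketch assuming hypothetical case records; the function and field names are illustrative and are not part of CCRS, which computes the
tier automatically from its own required fields.

    # Restatement of the published tiering decision rules and the KPI
    # calculation (illustrative only; not the CCRS implementation).

    def assign_tier(death_or_serious_injury, categories_met):
        # Tier 1: death/actual serious injury, or specified attributes in
        # at least 3 of the 4 categories; Tier 2: 2 of 4; Tier 3: 1 of 4;
        # Tier 4: none.
        if death_or_serious_injury or categories_met >= 3:
            return 1
        if categories_met == 2:
            return 2
        if categories_met == 1:
            return 3
        return 4

    def percent_most_significant(open_cases):
        # Percent of open cases falling in Tier 1 or Tier 2 (upper tiers).
        tiers = [assign_tier(injury, n) for injury, n in open_cases]
        upper = sum(1 for t in tiers if t <= 2)
        return 100.0 * upper / len(tiers)

    # Hypothetical docket of four open cases: one involving serious injury,
    # then cases meeting 2, 1, and 0 categories -> 2 of 4 in upper tiers.
    docket = [(True, 0), (False, 2), (False, 1), (False, 0)]
    print(percent_most_significant(docket))  # 50.0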

Time frame: Updated throughout the fiscal year as the case docket changes. Fiscal Year (October - September); semiannual reporting.

Unit of analysis: Percent.


4.  Reporting and Oversight
4a. Oversight and Timing of Results Reporting
Oversight of Final Reporting: Once initial case tiering has been conducted by the case agent, initial oversight, review and quality
assurance at the field level is the responsibility of the Special Agent-in-Charge and Assistant Special Agent-in-Charge of the criminal
enforcement office managing the case. It receives a second round of review in HQ by CID's Assistant Director for Investigations, who also
conducts a semi-annual review of all cases in the criminal case docket. The review includes discussions of any new evidence or
information that would potentially affect or change the tier to which a case has been assigned. Any decision to categorize a case as a
Tier 4 (lowest level) case must be approved by both the SAC and the Assistant Director for Investigations.  Data is verified on an on-going
basis.

Timing of Results Reporting: Semiannually.
4b. Data Limitations/Qualifications
A case's tier classification may change as cases are investigated and additional information uncovered. Potential data limitations include
inaccurate environmental sampling or mistakes in evidence gathering that can result in improper classification or "tiering" of an individual
case. Data for some characteristics used in tiering may be based upon ranges or estimates (e.g., the extent of documented human
population exposure to a toxic pollutant may be based upon a consensus or "best estimate" of the geographic area surrounding the
release rather than a detailed examination of all people potentially exposed).

4c. Third-Party Audits
N/A
  Record Last Updated: 02/13/2012 01:17:00 PM
  Measure Code : 419 - Percentage of criminal cases with individual  defendants.
  Office of Enforcement and Compliance Assurance (OECA)
   1.  Measure and DQR Metadata	
   Goal Number and Title                           5 - Enforcing Environmental Laws
   Objective Number and Title                       1 - Enforce Environmental Laws
   Sub-Objective Number and Title                   1 - Maintain Enforcement Presence
   Strategic Target Code and Title                    7 - By 2015, maintain an 85 percent conviction rate for criminal defendants
   Managing Office                                  Office of Criminal Enforcement
   Performance Measure Term Definitions
Criminal Cases: A criminal case exists when EPA's criminal enforcement program, specifically special agents in the Criminal
Investigation Division (CID), investigates allegations of criminal violations of environmental law. The EPA active ("open") criminal case
docket consists of  cases in all stages of the legal process - from initial investigations to charged cases to convicted cases that are awaiting
sentencing or are on appeal.

A criminal case with charges filed is one in which, based upon an investigation by the EPA criminal enforcement program, the U.S.
Department of Justice formally files charges against one or more defendants (either a person, company or both) alleging a criminal violation
of one or more of the environmental statutes and/or associated violations of the U.S. Criminal Code in U.S. District Court.

Individual Defendants: An individual defendant is a person, as opposed to a company. Criminal enforcement can be employed against
persons and companies. Individuals, unlike companies, can be sentenced to prison, in addition to paying a monetary fine, for breaking the
criminal law. It is the possibility of incarceration that most distinguishes criminal law from civil law and, therefore, enables criminal law to
provide the most deterrence.

For more information about EPA's Criminal Enforcement Program, visit http://www.epa.gov/compliance/criminal/.


 2. Data Definition and Source Reporting	
2a. Original Data Source
As part of the investigative process, the Criminal Investigation Division (CID) special agent assigned to the case creates an Investigative Activity
 Report (IAR). The IAR is the primary means used to document all investigative activity, operational activities, judicial activities, or
 responses to investigative tasking or leads. Investigative activities include interviews, surveillance, electronic monitoring, arrests,  searches,
 evidence handling and disposition, and document reviews.  Operational activities include undercover reports, and consensual monitoring.
 Judicial activities include indictments, criminal informations, criminal complaints, guilty pleas, trials, convictions, and sentencing hearings
and results. Investigative tasking relates to collateral requests from CID headquarters and other offices, as well as memorializing activity
conducted in furtherance of lead inquiries.

All relevant data is entered into the Criminal Case Reporting System (CCRS; see Section 3a), which tracks a criminal investigation from the
time it is first opened through all stages of the legal process to a conclusion (e.g., when the case is indicted, or when a defendant is found
guilty, sentenced or acquitted). CCRS is used to create the IAR.

Once the defendants are charged, the data used to compile the measure is based upon the legal documents outlining the criminal charges
(either a criminal information or a criminal indictment), which are filed by either the Office of the U.S. Attorney or the Environmental
Crimes Section at DOJ HQ in the U.S. District Court in which the alleged criminal violations occurred. The
charges are part of the case file.

2b. Source Data Collection
Source Data Collection Methods: The measure is based upon enforcement and legal documents which memorialize the status of a
criminal prosecution. As noted above, the data for the measure are formally compiled through the IARs and DOJ legal documents entered
into CCRS. In addition, all public legal documents relating to a charged case (e.g., the indictment or criminal information), including the
names of all defendants, are also entered into, and are publicly available through, Public Access to Court Electronic Records (PACER), an
electronic public access service that allows users to obtain case and docket information from federal appellate, district and bankruptcy
courts (http://www.pacer.gov/).

Date/Time Intervals Covered by Source Data:
Ongoing.

EPA QA Requirements/Guidance Governing Collection:
All criminal enforcement special agents receive training on the accurate completion of IAR reports and the entry of criminal case data into
the CCRS.

Geographical Extent of Source Data:
National.
2c. Source Data Reporting
After DOJ formally charges the defendants, the information is entered into CCRS (e.g., all the violations alleged and all of the defendants
charged, as well as forensic information about the pollutants involved and the impact on the public and the environment). The status of the
case is updated as the legal process proceeds. The case agents update IARs, which highlight changes in the case and all subsequent stages
of the criminal enforcement process (e.g., a case is dismissed, or the defendants are either acquitted or convicted and sentenced), and enter
them into CCRS or submit them to their supervisors.

Timing and frequency of reporting: The status of the case is updated as the legal process proceeds.


3.  Information Systems and  Data Quality Procedures	
3a.  Information Systems
The Criminal Case Reporting System (CCRS) stores criminal enforcement information and data in an enforcement-sensitive database which
contains historical data on all criminal enforcement prosecutions as well as information about the pollutants involved and the impact on the
public and the environment.  CCRS maintains information pertaining to individuals and companies associated with the Criminal
Investigation Division's criminal leads and cases, as well as other information related to the conduct of criminal investigations.

 The data is used to document the progress and results of criminal investigations. The data used for all criminal enforcement performance
measures are in the CCRS database.

The status of the case is updated on CCRS as the legal process proceeds. All legal documents relating to a prosecution are entered into the
system.

3b.  Data Quality Procedures
The Criminal Investigations Division (CID) has a process for document control and records management and has Quality Management Plans in place.
The information on charged cases that is entered into CCRS goes through  several layers of review. Initial verification of the quality and
accuracy of case information is the responsibility of the Special Agent-in-Charge (SAC) of the office that is managing the case. At HQ,
QA/QC is conducted by the System Administrator of CCRS.

3c. Data Oversight
Initial oversight at the field level is the responsibility of the Assistant Special Agent-in-Charge (ASAC) and  Special Agent-in-Charge
(SAC) of the criminal enforcement office managing the case.  That information is further reviewed by OCEFT HQ through semi-annual
case management reviews conducted by the Assistant Director of Investigations, CID, and quarterly reports by the System Administrator of
CCRS. The System Administrator, who creates all statistical and management reports based on information in CCRS, conducts regular
oversight of the  data entered by  the criminal enforcement field offices to ensure that all data entered into CCRS is complete and accurate.

3d.  Calculation Methodology
The methodology for the criminal enforcement measure "Percent of criminal cases with individual defendants" employed a three-year
analysis (FY 2008-2010) to develop the baseline and targets.

The decision rules reflect the legal status of the individuals who are named as charged defendants. The data files relevant to this analysis
include defendant names and type (individual or company), date of charges filed, and the actual statutes (environmental statutes, the U.S.
Criminal Code, or both) listed in the criminal indictment or criminal information.

There are no assumptions or "quantifiers" used in calculating the measure. The measure is based upon the legal status of cases, i.e., whether
the case has at least one individual person charged as a defendant who is being prosecuted. The measure is calculated by dividing the
number of charged cases that have at least one individual defendant during the current Fiscal Year (numerator) by the total number of
 charged criminal cases during the current Fiscal Year (denominator).
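
As a minimal illustration of this arithmetic in Python, assuming hypothetical defendant-type lists standing in for the CCRS data files (the
names below are illustrative, not CCRS fields):

    # Measure 419, illustrative only: percent of charged cases that name
    # at least one individual (natural person) defendant.

    def percent_cases_with_individuals(charged_cases):
        # charged_cases: for each case charged this fiscal year, the list
        # of defendant types ('individual' or 'company').
        with_individual = sum(1 for defendants in charged_cases
                              if 'individual' in defendants)
        return 100.0 * with_individual / len(charged_cases)

    # Four hypothetical charged cases; three name at least one individual.
    cases = [['individual', 'company'], ['company'],
             ['individual'], ['individual', 'individual']]
    print(percent_cases_with_individuals(cases))  # 75.0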

Time frame: Fiscal Year (October - September); semiannual reporting.

 Unit of analysis: Percent.


 4. Reporting and  Oversight	
 4a. Oversight and Timing of Results Reporting
 The System Administrator of the CCRS has the responsibility for compiling and verifying the accuracy of the report on charged defendants.
 Once compiled, data goes  through a second level of verification through the Assistant Director of Investigations, CID. While data is
 verified on an on-going basis, final verification is conducted at the end of the fiscal year.

 Timing of Results Reporting:
 Semiannually.	
 4b. Data Limitations/Qualifications	
N/A, since the measure is based on the legal status of prosecuted individual defendants.

4c. Third-Party Audits
N/A
  Record Last Updated: 02/13/2012 01:16:59 PM
  Measure Code : 420 - Percentage of criminal cases with charges  filed.
  Office of Enforcement and Compliance Assurance (OECA)
   1.  Measure and DQR Metadata	
   Goal Number and Title                          5 - Enforcing Environmental Laws
   Objective Number and Title                      1 - Enforce Environmental Laws
   Sub-Objective Number and Title                   1 - Maintain Enforcement Presence
   Strategic Target Code and Title                    6 - By 2015, increase the number of criminal cases with charges filed
   Managing Office                                  Office of Criminal Enforcement
   Performance Measure Term Definitions
Criminal Cases: A criminal case exists when EPA's criminal enforcement program, specifically special agents in the Criminal
Investigation Division (CID), investigates allegations of criminal violations of environmental law. The EPA active ("open") criminal case
docket consists of  cases in all stages of the legal process - from initial investigations to charged cases to convicted cases that are awaiting
sentencing or are on appeal.

Charges Filed: A criminal case with charges filed is one in which, based upon an investigation by the EPA criminal enforcement program,
the U.S. Department of Justice formally files charges against one or more defendants (either a person, company or both) alleging a criminal
violation of one or more of the environmental statutes and/or associated violations of the U.S. Criminal Code in U.S. District Court.

For more information about EPA's Criminal Enforcement Program, visit http://www.epa.gov/compliance/criminal/.


 2. Data Definition and Source Reporting	
 2a. Original Data Source	
 As part of the investigative process, the Criminal Investigation Division (CID) special agent assigned to the case completes an
 Investigation Activity Report (IAR). The IAR is the primary means used to document all investigative activity, operational activities,
 judicial activities, or responses to investigative tasking or leads. Investigative activities include interviews, surveillance, electronic
 monitoring, arrests, searches, evidence handling and disposition, and document reviews. Operational activities include undercover reports,
 and consensual monitoring.  Judicial activities include indictments, criminal informations,  criminal complaints, guilty pleas, trials,
 convictions, and sentencing hearings and results.  Investigative tasking relates to collateral requests from CID headquarters and other
 offices, as well as memorializing activity conducted in furtherance of lead inquiries.


All relevant data is entered into the Criminal Case Reporting System (CCRS; see Section 3a), which tracks a criminal investigation from the
time it is first opened through all stages of the legal process to a conclusion (e.g., when the case is indicted, or when a defendant is found
guilty, sentenced or acquitted). CCRS is used to create the IAR.

Once the defendants are charged, the data used to compile the measure is based upon the legal documents outlining the criminal charges
(either a criminal information or a criminal indictment), which are filed by either the Office of the U.S. Attorney or the Environmental
Crimes Section at DOJ HQ in the U.S. District Court in which the alleged criminal violations occurred. The
charges are part of the case file.
2b. Source Data Collection
Source Data Collection Methods: The measure is based upon enforcement and legal documents which memorialize the status of a
criminal prosecution. As noted above, the data for the measure are formally compiled through the IARs and DOJ legal documents entered
into CCRS. In addition, all public legal documents relating to a charged case (e.g., the indictment or criminal information), including the
names of all defendants, are also entered into, and are publicly available through, Public Access to Court Electronic Records (PACER), an
electronic public access service that allows users to obtain case and docket information from federal appellate, district and bankruptcy
courts (http://www.pacer.gov/).

Date/time Intervals Covered by Source Data:
Ongoing.

EPA QA Requirements/Guidance Governing Collection:
All criminal enforcement special agents receive training on  the accurate completion of IAR reports and the entry of criminal case data into
the CCRS.

Geographical Extent of Source Data:
National.
2c. Source Data Reporting
After DOJ formally charges the defendants, the information is entered into CCRS (e.g., all the violations alleged and all of the defendants
charged, as well as forensic information about the pollutants involved and the impact on the public and the environment). The status of the
case is updated as the legal process proceeds. The case agents update IARs, which highlight changes in the case and all subsequent stages
of the criminal enforcement process (e.g., a case is dismissed, or the defendants are either acquitted or convicted and sentenced), and enter
them into CCRS or submit them to their supervisors.

Timing and frequency of reporting: The status of the case is updated as the legal process proceeds.
3.  Information Systems and  Data Quality Procedures
3a.  Information Systems
The Criminal Case Reporting System (CCRS) stores criminal enforcement information and data in an enforcement-sensitive database which
contains historical data on all criminal enforcement prosecutions as well as information about the pollutants involved and the impact on the
public and the environment.  CCRS maintains information pertaining to individuals and companies associated with the Criminal
Investigation Division's criminal leads and cases, as well as other information related to the conduct of criminal investigations.

The data is used to document the progress and results of criminal investigations. The data used for all criminal enforcement performance
measures are in the CCRS database.

The status of the case is updated on CCRS as the legal process proceeds. All legal documents relating to a prosecution are entered into the
system.

3b.  Data Quality Procedures
The Criminal Investigations Division (CID) has a process for document control and records management and has Quality Management
Plans in place. The information on charged cases that is entered into CCRS goes through several layers of review. Initial verification of the
quality and accuracy of case information is the responsibility of the Special Agent-in-Charge (SAC) of the office that is managing the case.
At HQ, QA/QC is conducted by the System Administrator of CCRS.
3c. Data Oversight
Initial oversight at the field level is the responsibility of the Assistant Special Agent-in-Charge (ASAC) and  Special Agent-in-Charge
(SAC) of the criminal enforcement office managing the case. That information is further reviewed by OCEFT HQ through semi-annual
case management reviews conducted by the Assistant Director of Investigations, CID, and quarterly reports by the System Administrator of
CCRS. The System Administrator, who creates all statistical and management reports based on information in CCRS, conducts regular
oversight of the data entered by the criminal enforcement field offices to ensure that all data entered into CCRS is complete and accurate.

3d. Calculation Methodology
The methodology for the criminal enforcement measure "Percent of criminal cases with charges filed" employed a five-year analysis
(FY 2006-2010) to develop the baseline and targets. The decision rules reflect the legal status of the defendants charged. The data files
relevant to this analysis include defendant names and type (individual or company), date of charges filed, and the actual statutes
(environmental statutes, the U.S. Criminal Code, or both) listed in the criminal indictment or criminal information.

There are no "assumptions" or "quantifiers" used in calculating the measure. The measure is based upon the legal status of cases, i.e.,
whether the case has been closed without prosecution or is being prosecuted. The measure is calculated by dividing the number of cases
that have been charged (i.e., with an indictment or criminal information) during the current Fiscal Year  (numerator) by the total number of
criminal cases that were closed during the current Fiscal Year (denominator).
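
Expressed as a minimal Python sketch with hypothetical counts (note that, per the definition above, the numerator counts cases charged
during the fiscal year while the denominator counts cases closed during the same year):

    # Measure 420, illustrative only: cases charged (by indictment or
    # criminal information) in the current fiscal year as a percent of
    # all criminal cases closed in that year.

    def percent_cases_charged(charged_in_fy, closed_in_fy):
        return 100.0 * charged_in_fy / closed_in_fy

    # Hypothetical counts: 45 cases charged, 100 cases closed.
    print(percent_cases_charged(45, 100))  # 45.0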


Time frame: Semiannual reporting.

Unit of analysis: Percent.
4.  Reporting and Oversight
4a. Oversight and Timing of Results Reporting
The System Administrator of the OCEFT CCRS has the responsibility for compiling and verifying the accuracy of the report on charged
defendants. Once compiled, data goes through a second level of verification through the Assistant Director of Investigations, CID. While
data is verified on an on-going basis, final verification is conducted at the end of the fiscal year.

Timing of Results Reporting:
Semiannually.
4b. Data Limitations/Qualifications
N/A, since the measure is based upon the legal status of charged cases.

4c. Third-Party Audits
N/A
 Record Last Updated: 02/13/2012 01:16:59 PM
  Measure Code : 421 - Percentage of conviction rate for criminal defendants.
  Office of Enforcement and Compliance Assurance (OECA)
   1.  Measure and DQR Metadata	
   Goal Number and Title                          5 - Enforcing Environmental Laws
   Objective Number and Title                       1 - Enforce Environmental Laws
   Sub-Objective Number and Title                   1 - Maintain Enforcement Presence
   Strategic Target Code and Title                    7 - By 2015, maintain an 85 percent conviction rate for criminal defendants
   Managing Office                                   Office of Criminal Enforcement
   Performance Measure Term Definitions
Criminal Cases: A criminal case exists when EPA's criminal enforcement program, specifically special agents in the Criminal
Investigation Division (CID), investigates allegations of criminal violations of environmental law. The EPA active ("open") criminal case
docket consists of  cases in all stages of the legal process - from initial investigations to charged cases to convicted cases that are awaiting
sentencing or are on appeal.

A criminal case with charges filed is one in which, based upon an investigation by the EPA criminal enforcement program, the U.S.
Department of Justice formally files charges against one or more defendants (either a person, company or both) alleging a criminal violation
of one or more of the environmental statutes and/or associated violations of the U.S. Criminal Code in U.S. District Court.

Conviction: A defendant (either a person or company) who has been previously charged with committing one or more environmental
crimes is found  legally "guilty" of at least one of those crimes. Legal guilt (conviction) occurs either when the defendant pleads guilty or is
convicted following a trial.

For more information about EPA's Criminal Enforcement Program, visit http://www.epa.gov/compliance/criminal/.


 2. Data Definition and Source Reporting	
2a. Original Data Source
As part of the investigative process, the Criminal Investigation Division (CID) special agent assigned to the case completes an Investigation Activity
 Report (IAR). The IAR is the primary means used to document all investigative activity, operational activities, judicial activities, or
 responses to investigative tasking or leads. Investigative activities include interviews, surveillance, electronic monitoring, arrests, searches,
 evidence handling and disposition, and document reviews.  Operational activities include undercover reports, and consensual monitoring.
 Judicial activities  include indictments, criminal informations, criminal complaints, guilty pleas, trials, convictions, and sentencing hearings
 and results.  Investigative tasking relates to collateral requests from CID headquarters and other offices, as well as memorializing activity
conducted in furtherance of lead inquiries.

All relevant data is entered into the Criminal Case Reporting System (CCRS; see Section 3a), which tracks a criminal investigation from the
time it is first opened through all stages of the legal process to a conclusion (e.g., when the case is indicted, or when a defendant is
convicted, sentenced or acquitted). CCRS is used to create the IAR.

The data used to compile the measure is based upon the legal documents filed in the U.S. District Court where the defendant is prosecuted.
Charges can be dismissed after exculpatory evidence in the defendant's favor is entered into the record, or the legal process can result in
either a conviction or an acquittal. A conviction is also reaffirmed at the subsequent sentencing of a convicted defendant, when the judge
imposes the sentence through a legal document known as the Judgment and Commitment Notice (J&C).

2b. Source Data Collection
Source Data Collection Methods:
The measure is based upon enforcement and legal documents which memorialize the status of a criminal prosecution. As noted above, the
data for the measure are formally compiled through the IARs and DOJ legal documents entered into CCRS. In addition, all public legal
documents relating to a charged case, including the conviction, are also entered into, and are publicly available through, Public Access to
Court Electronic Records (PACER), an electronic public access service that allows users to obtain case and docket information from
federal appellate, district and bankruptcy courts (http://www.pacer.gov/).

Date/Time Intervals Covered by Source Data:
Ongoing.

Geographical Extent of Source Data:
National.
2c. Source Data Reporting
The status of the case is updated as the legal process proceeds. The case agents update IARs, which highlight changes in the case and all
subsequent stages of the criminal enforcement process (e.g., a case is dismissed, or the defendants are either acquitted or convicted and
sentenced), and enter them into CCRS or submit them to their supervisors.

Timing and frequency of reporting: The status of the case is updated as the legal process proceeds.
3.  Information Systems and Data Quality Procedures	
3a.  Information Systems
The Criminal Case Reporting System (CCRS) stores criminal enforcement data in an enforcement-sensitive database which contains
historical data on all criminal enforcement prosecutions as well as information about the pollutants involved and the impact on the public
and the environment. CCRS maintains information pertaining to individuals and companies associated with the Criminal Investigation
Division's criminal leads and cases, as well as other information related to the conduct of criminal investigations.

 The data is used to document the progress and results of criminal investigations. The data used for all criminal enforcement performance
measures are in the CCRS database.

The status of the case is updated on CCRS as the legal process proceeds. All legal documents relating to a prosecution are entered into the
system.

3b. Data Quality Procedures
The Criminal Investigations Division (CID) has a process for document control and records management and has Quality Management
Plans in place. The information on defendant dismissals, convictions or acquittals that is entered into CCRS  goes through several layers of
review. Initial verification of the quality and accuracy of case information is the responsibility of the Special Agent-in-Charge (SAC) of
the office that is managing the case. At HQ, QA/QC is conducted by the System Administrator of CCRS.
3c. Data Oversight
Initial oversight, review and quality assurance at the field level is the responsibility of the Special Agent-in-Charge (SAC) and Assistant
Special Agent-in-Charge (ASAC) of the criminal enforcement office managing the case. That information is further reviewed by OCEFT
HQ through semi-annual case management reviews conducted by the Assistant Director of Investigations, CID, and quarterly reports by the
System Administrator of CCRS. The System Administrator, who creates all statistical and management reports based on information in
CCRS,  conducts regular oversight of the data entered by the criminal enforcement field offices to ensure that all data entered into CCRS is
complete and accurate.	
3d. Calculation Methodology
The methodology for the criminal enforcement measure "Conviction rate for criminal defendants" employed a five-year analysis
(FY 2006-2010) to develop the baseline and targets. The decision rules reflect the legal status of the defendants. The data files relevant to
this analysis include defendant names and type (individual or company), date of charges filed, and the results of the prosecution
(convicted, acquitted, or charges dismissed) on each of the charges, whether under environmental law, the general U.S. Criminal Code, or
both. A defendant is defined as having been "convicted" if he or she is found guilty of at least one of the criminal counts with which he or
she has been charged.

There are no "assumptions" or "quantifiers" used in calculating the measure. The measure is based upon the  legal status of cases, i.e.,
whether the  defendant has been convicted, acquitted or had the charges dismissed after exculpatory evidence in their favor was entered into
the record. The measure is calculated by dividing the total number of defendants who have been convicted during the current Fiscal Year
(numerator) by the total number of defendants with a legal result of their case in the current Fiscal Year (denominator). The "legal result"
denominator includes all defendants whose charges were dismissed, who were acquitted or had their charges overturned  on appeal
following  conviction.
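
A minimal Python sketch of this calculation, assuming hypothetical per-defendant outcome codes (the strings below are illustrative, not
CCRS values):

    # Measure 421, illustrative only: conviction rate for defendants whose
    # cases reached a legal result in the current fiscal year.

    def conviction_rate(outcomes):
        # outcomes: one entry per defendant with a legal result this fiscal
        # year: 'convicted', 'acquitted', 'dismissed', or 'overturned'
        # (conviction reversed on appeal).
        convicted = sum(1 for o in outcomes if o == 'convicted')
        return 100.0 * convicted / len(outcomes)

    # Nine convictions and one acquittal, consistent with the historical
    # range noted in section 4b below.
    outcomes = ['convicted'] * 9 + ['acquitted']
    print(conviction_rate(outcomes))  # 90.0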

Time frame: Semiannual reporting.

Unit of analysis: Percent.


 4. Reporting and  Oversight	
 4a. Oversight and Timing of Results Reporting
 Oversight of Final Reporting: The System Administrator of the OCEFT CCRS has the responsibility for compiling and verifying the
 accuracy of the report on the percentage of convicted defendants. Once compiled, data goes through a second level of verification through
the Assistant Director of Investigations, CID. While data is verified on an on-going basis, final verification is conducted at the end of the
fiscal year.

 Timing of Results Reporting:
 Semiannually.	
 4b. Data Limitations/Qualifications	
The only data limitations, which occur infrequently, arise when a defendant who has been initially convicted of one or more
environmental crimes has all of his or her charges overturned on appeal by the U.S. Court of Appeals in a fiscal year later than the one in
which the measure is reported. The conviction rate for charged defendants has historically been in the 90% range and is not materially
affected by post-conviction appeals, so the low incidence of defendants having their convictions eventually overturned does not limit the
suitability of the performance measure.

4c. Third-Party Audits
N/A
  Record Last Updated: 02/13/2012 01:16:59 PM