WELCOME
The Fourth Annual U.S. Environmental Protection Agency (EPA)
Conference on Statistics is sponsored by the Statistical Policy
Advisory Committee (SPAC) and funded by the Statistical Policy
Branch, Office of Policy, Planning and Evaluation. The Conference is
held solely for the benefit and participation of EPA personnel.
Programming and arrangements for the Conference have been
provided by the SPAC Conference Planning Committee, chaired by
John Warren. Listed below are the members of the Conference
Planning Committee for the 1988 EPA Conference on Statistics:
Gerald Akland, EMSL-RTP
R. Clifton Bailey, OW
Joseph Carra, OSWER
Jim Craig, OSWER
John Creason, HERL-RTP
Jim Daley, OPPE
Dennis Ann Daniel, WIC
George Flatman, EMSL-LV
Mark Goldstein, Region III
William Hunt, OAQPS-RTP
Henry Kahn, OW
Kathleen Knox, OPTS
Mel Kollander, OPPE
Elizabeth Margosches, OPTS
Barry Nussbaum, OAR
John Warren, OPPE
-------
TABLE OF CONTENTS
Page
CONFERENCE AGENDA 1
CONFERENCE ATTENDEES 10
ABSTRACTS
Communications Panel — Movin' On: "Was it more fun as a
manager or statistician?" — William Hunt and Barry Nussbaum 19
The "Cheap" Statistician: Where Is S/He? — Henry Kahn 21
Data Quality Objectives (DQOs): New OMB Requirements —
John Warren 22
"Federal Register. OMB Guidelines for Federal Statistical
Activities, Notice" 22a
Biological/Human Sampling
"Pharmacokinetic Modeling Using SAAM — A Tool for
Simulation and Model Fitting" — Bernard Most 23
"Considerations in Measuring Biologicals in Human Populations" --
Cheryl Siegel Scott 24
"Biomarkers: Promise and Practice" -- John Fowle III 25
Graphics
"Expert Systems to Assist in Decisions Concerning Land
Disposal of Hazardous Wastes" — Daniel G. Greathouse 26
Bioassay and Toxicity Measurements
"Sources of Variability in Laboratory Animal
Carcinogenicity Studies" — Joseph K. Baseman 27
The Group Depth (Focus Group) Interview
"The Group Depth Interview: An Unstructured Approach
for Collecting In-Depth Information on Attitudes
and Motivations" -- Alfred E. Goldman 28
Human Exposure Monitoring
"New Directions of Exposure Monitoring" — John D. Spengler 29
Guide to EPA Information Center Services for Statisticians
"Guide to EPA Information Center Services for Statisticians" —
Denny Daniel 30
"In-House Graphics" — Nancy Sneath 31
-------
Spatial Statistics
"Geostatistical Software: Practical Applications" --
Evan J. Englund and George T. Flatman 32
Poster Session
"Forest Effects of Acid Deposition: An Epidemiological
Approach to Data Representation Graphics and Exploratory
Data Integration of Ecological Information" -- Ruth H. Allen,
J. Jacob Wind, and Ronald W. Matheny 33
"Appropriate LC50 Statistics for Effluent Toxicity Analysis" --
M. Bastian, P. Koska, T. Vinson, and C. Young (Presenter:
James E. Stiebing) 34
"Regional Surface Water Quality Characteristics of Nebraska" —
Norman H. Crisp (Presenter: Thomas T. Holloway) 36
"Guide to EPA Information Center Services for Statisticians" --
Denny Daniel 37
"Chlorinated Paraffins: A Report on the Findings from
Two Field Studies at Sugar Creek, Ohio, and Tinkers
Creek, Ohio" — Susan Dillman 38
"Geostatistical Software Demonstration" -- Evan J. Englund 40
"Household Solvent Products: A National Usage Survey" --
Mary Frankenberry, Patrick Kennedy, Cindy Stroup,
Donna Eisenhower, Paul Flyer, and John Rogers 41
"Tools for Presenting Spacial and Temporal Patterns
of Environmental Monitoring and Effects Data" --
L. Thomas Heiderscheit, Wilson B. Riggan, and John Creason 42
"Orientation to Quality Assurance Management" -- Kevin Hull and
Susan A. Santo 43
"Sampling Strategy for Network Design" -- Jerry Jaikanen,
Donald E. Myers, and George T. Flatman 44
"Statistical Evaluation of Water Quality Trends" — Reta Roe 45
"In-House Graphics" -- Nancy Sneath 48
"Control Chart Strategy" — Thomas H. Starks and George T. Flatman 49
Practical Considerations for Agency Surveys
"National Survey of Pesticides in Drinking Water Wells" —
James Boland 50
"Hazardous Waste Surveys: Problems in Development and
Implementation" — Jim Craig 51
Guest Presentation
"Statistics versus Statistics" -- Leo Breiman 52
Royce Hotel Map 53
Williamsburg Dining 54
NOTES
11
-------
AGENDA
FOR THE
FOURTH ANNUAL
EPA CONFERENCE ON STATISTICS
March 15-18, 1988
Royce Hotel
Williamsburg, Virginia
-------
AGENDA
FOURTH ANNUAL EPA CONFERENCE ON STATISTICS
March 15-18, 1988
TUESDAY, MARCH 15
3:30 - 5:30pm Registration and Check-in
(Westminster Ballroom, Foyer)
6:00 - 7:00pm OPENING SESSION
(Westminster Ballroom, Section B)
Introduction: N. Phillip Ross, Chief, Statistical Policy Branch, Office
of Policy, Planning, and Evaluation
Welcoming Remarks: Robert H. Wayland, III, Deputy Assistant
Administrator, Office of Policy, Planning, and Evaluation
Overview of Conference: John Warren, Chair, Conference Planning
Committee, Office of Policy, Planning, and Evaluation
Conference Information: Marcia Gardner, SRA Technologies, Inc.
7:00 - 8:00pm Opening Reception
(Westminster Ballroom, Section A)
WEDNESDAY, MARCH 16
8:00 - 8:30am Continental Coffee
(Westminster Ballroom, Section A)
• Please Note: Sessions will begin and end PROMPTLY at the specified time.
-------
8:30 - 9:45am COMMUNICATIONS PANEL — MOVIN' ON:
"Was it more fun as a manager or statistician?"
(Westminster Ballroom, Section B)
Session Co-chairs:
William Hunt, Chief, Monitoring and Reports Branch, Office of Air
Quality Planning and Standards, Research Triangle Park, and
Barry Nussbaum, Chief, Operations and Compliance Policy Branch,
Office of Air and Radiation
Panel:
Gerald Akland, Director, Monitoring and Assessment Division, Environ-
mental Monitoring Systems Laboratory, Research Triangle Park,
Joseph Carra, Director, Waste Management Division, Office of Solid
Waste,
William Hunt, Chief, Monitoring and Reports Branch, Office of Air
Quality Planning and Standards, Research Triangle Park,
Henry Kahn, Chief, Statistics Section, Office of Water Regulations and
Standards, and
Barry Nussbaum, Chief, Operations and Compliance Policy Branch,
Office of Air and Radiation
9:45 - 10:00am Break
(Westminster Ballroom, Section A)
10:00 - 11:30am COMMUNICATIONS PANEL DISCUSSION SESSION
(Westminster Ballroom, Section B)
11:30am - 1:15pm Lunch Break
12:30 - 1:15pm Poster Session - Participant Meeting
(Warwick Room)
1:15 - 2:45pm THE "CHEAP" STATISTICIAN: Where Is S/He?
(Westminster Ballroom, Section B)
Session Chair: Henry Kahn, Chief, Statistics Section, Office of Water
Regulations and Standards
-------
THE "CHEAP" STATISTICIAN (continued)
Panel:
Thomas Curran, Statistician, Office of Air Quality Planning and
Standards, Research Triangle Park,
Barnes Johnson, Statistician, Statistical Policy Branch, Office of
Policy, Planning, and Evaluation,
Thomas Kelly, Director, Office of Standards and Regulations, Office of
Policy, Planning, and Evaluation, and
Thomas O'Farrell, Acting Director, Industrial Technology Division,
Office of Water Regulations and Standards, Office of Water
2:45 - 3:00pm Break
(Westminster Ballroom, Section A)
3:00 - 4:15pm DATA QUALITY OBJECTIVES (DQOs): NEW OMB
REQUIREMENTS
(Westminster Ballroom, Section B)
Session Chair: John Warren, Statistical Policy Branch, Office of
Policy, Planning, and Evaluation
Panel:
Oscar Morales, Information Policy Branch, Office of Policy, Planning,
and Evaluation,
Dean Neptune, Office of Acid Deposition, Environmental Monitoring,
and Quality Assurance, Office of Research and Development, and
John Warren, Statistical Policy Branch, Office of Policy, Planning, and
Evaluation
4:30 - 6:30pm Walking Tour of Williamsburg (Optional - Reservations
required & limited number of places available)
THURSDAY, MARCH 17
8:00 - 8:30am Continental Coffee
(Berkley Room)
-------
8:30 - 9:45am MINISESSION SERIES I
BIOLOGICAL/HUMAN SAMPLING
(Williamsburg Parlor)
Session Co-chairs:
John Creason, Health Effects Research Laboratory, Research Triangle
Park, Office of Research and Development, and
Kathleen Knox, Benefits and Use Division, Office of Pesticide Programs
Pharmacokinetic Modeling Using SAAM - A Tool for Simulation and
Model Fitting
Presenter: Bernard Most, Northrup Services, Inc.
Considerations in Measuring Biologicals in Human Populations
Presenter: Cheryl Siegel Scott, Exposure Evaluation Division, Office
of Toxic Substances
Biomarkers: Promise and Practice
Presenter: John R. Fowle III, Office of Health Research, Office of
Research and Development
GRAPHICS
(Jamestown Parlor)
Session Chair: Mark Goldstein, Information Resources Branch, Region
III
Expert Systems to Assist in Decisions Concerning Land Disposal of
Hazardous Waste
Presenter: Dan Greathouse, Hazardous Waste Engineering Research
Laboratory, Cincinnati, Office of Research and Development
BIOASSAY AND TOXICITY MEASUREMENTS
(Yorktown Parlor)
Session Chair: Elizabeth Margosches, Exposure Evaluation Division,
Office of Toxic Substances
Sources of Variability in Laboratory Animal Carcinogenicity Studies
Presenter: Joseph Haseman, National Institute of Environmental
Health Sciences
-------
9:45 - 10:00am Break
(Berkley Room)
10:00 - 11:15am MINISESSION SERIES II
THE GROUP DEPTH (FOCUS GROUP) INTERVIEW
(Yorktown Parlor)
Session Chair: Mel Kollander, Statistical Policy Branch, Office of
Policy, Planning, and Evaluation
The Group Depth Interview: An Unstructured Approach for Collecting
In-Depth Survey Information on Attitudes and Motivations
Presenter: Alfred Goldman, Booz Allen and Hamilton
HUMAN EXPOSURE MONITORING
(Williamsburg Parlor)
Session Chair: Gerald Akland, Director, Monitoring and Assessment
Division, Environmental Monitoring Systems Laboratory, Research
Triangle Park, Office of Research and Development
New Directions of Exposure Monitoring
Presenter: John D. Spengler, Harvard School of Public Health
GUIDE TO EPA INFORMATION CENTER SERVICES FOR
STATISTICIANS
(Empire Ballroom, Parlor B)
Session Co-chairs:
Denny Daniel, Director, Washington Information Center, and
Mark Goldstein, Information Resources Branch, Region III
Guide to EPA Information Center Services for Statisticians
Presenter: Denny Daniel, Director, Washington Information Center
In-House Graphics
Presenter: Nancy Sneath, Washington Information Center
11:30am - 1:30pm Group Luncheon (For information check the Registration
Table)
-------
1:30 - 2:45pm MINISESSION SERIES III
BIOLOGICAL/HUMAN SAMPLING
(Williamsburg Parlor)
Session Co-chairs:
John Creason, Health Effects Research Laboratory, Research Triangle
Park, Office of Research and Development, and
Kathleen Knox, Benefits and Use Division, Office of Pesticide Programs
Pharmacokinetic Modeling Using SAAM - A Tool for Simulation and
Model Fitting
Presenter: Bernard Most, Northrup Services, Inc.
Considerations in Measuring Biologicals in Human Populations
Presenter: Cheryl Siegel Scott, Exposure Evaluation Division, Office
of Toxic Substances
Biomarkers: Promise and Practice
Presenter: John R. Fowle III, Office of Health Research, Office of
Research and Development
SPATIAL STATISTICS
(Jamestown Parlor)
Session Co-chairs:
George Flatman, Environmental Monitoring Systems Laboratory, Las
Vegas, Office of Research and Development, and
Mark Goldstein, Information Resources Branch, Region III
Geostatistical Software: Practical Applications
Presenter: Evan Englund, Environmental Monitoring Systems
Laboratory, Las Vegas, Office of Research and Development
-------
THE GROUP DEPTH (FOCUS GROUP) INTERVIEW
(Yorktown Parlor)
Session Chair: Mel Kollander, Statistical Policy Branch, Office of
Policy, Planning, and Evaluation
The Group Depth Interview: An Unstructured Approach for Collecting
In-Depth Survey Information on Attitudes and Motivations
Presenter: Alfred Goldman, Booz Allen and Hamilton
2:45 - 3:00pm Break
(Williamsburg Ballroom, Lobby Area)
3:00 - 4:15pm MINISESSION SERIES IV
GUIDE TO EPA INFORMATION CENTER SERVICES FOR
STATISTICIANS
(Empire Ballroom, Parlor B)
Session Co-chairs: Denny Daniel, Director, Washington Information
Center, and Mark Goldstein, Information Resources Branch, Region III
Guide to EPA Information Center Services for Statisticians
Presenter: Denny Daniel, Director, Washington Information Center
In-House Graphics
Presenter: Nancy Sneath, Washington Information Center
SPATIAL STATISTICS
(Jamestown Parlor)
Session Co-chairs: George Flatman, Environmental Monitoring Systems
Laboratory, Las Vegas, Office of Research and Development, and Mark
Goldstein, Information Resources Branch, Region III
Refreshments will also be available in the lobby area of the Empire
Ballroom.
-------
SPATIAL STATISTICS (continued)
Geostatistical Software: Practical Applications
Presenter: Evan Englund, Environmental Monitoring Systems
Laboratory, Las Vegas, Office of Research and Development
4:30 - 6:30pm POSTER SESSION
(Warwick Room and Empire Ballroom, Parlor C)
Session Chair: R. Clifton Bailey, Statistics Section, Office of Water
Regulations and Standards
6:30 - 7:30pm Guest Speaker Reception
(Empire Ballroom, Lobby Area)
FRIDAY, MARCH 18
8:00 - 8:30am Continental Coffee
(Empire Ballroom, Parlor A)
8:30 - 10:15am PRACTICAL CONSIDERATIONS FOR AGENCY SURVEYS
(Empire Ballroom, Parlors A & B)
Session Co-chairs:
Jim Daley, Information and Regulation Systems Division, Office of
Policy, Planning, and Evaluation, and
Jim Craig, Waste Management Division, Office of Solid Waste
National Survey of Pesticides in Drinking Water Wells
Presenter: James Boland, Hazard Evaluation Division, Office of
Pesticide Programs
Hazardous Waste Surveys: Problems in Development and
Implementation
Presenter: Jim Craig, Waste Management Division, Office of Solid
Waste
10:15 - 10:45am Break
-------
10:45am - 12:45pm GUEST PRESENTATION, CONFERENCE SUMMARY, AND
ADJOURNMENT
(Westminster Ballroom)
Introduction: John Warren, Chair, Conference Planning Committee,
Office of Policy, Planning, and Evaluation
Statistics versus Statistics
Guest Speaker: Leo Breiman, Professor of Statistics, University of
California, Berkeley
Presentation of Awards: Leo Breiman, Professor of Statistics, Univer-
sity of California, Berkeley
Conference Wrap-up: N. Phillip Ross, Chief, Statistical Policy Branch,
Office of Policy, Planning, and Evaluation, and Marcia Gardner, SRA
Technologies, Inc.
1:00pm BUSES LEAVE
-------
ATTENDEE LIST
-------
ATTENDEES
FOURTH ANNUAL EPA CONFERENCE ON STATISTICS
Gerald Akland. Director, Monitoring and Assessment Division, Environmental
Monitoring Systems Laboratory (MD-56), Research Triangle Park, NC 27711,
(919) 541-2346, (FTS) 629-2346.
R. Clifton Bailey. Statistics Section, Office of Water Regulations and
Standards (WH-586), 401 M Street, S.W., Washington, DC 20460, (202) 382-
5411, (FTS) 382-5411.
Jerome Blondell. Exposure Assessment Branch, Office of Pesticide Programs
(TS-769C), 401 M Street, S.W., Washington, DC 20460, (703) 557-0336,
(FTS) 557-0336.
James Boland. Hazard Evaluation Division, Office of Pesticide Programs (TS-
769C), 401 M Street, S.W., Washington, DC 20460, (703) 557-1636, (FTS)
557-1636.
Leo Breiman. Professor of Statistics, University of California, Berkeley, CA
94720.
Paul Britton. Environmental Monitoring and Support Laboratory, 26 West
Martin Luther King Drive, Cincinnati, OH 45268, (513) 569-7325, (FTS)
684-7325.
Martin W. Brossman. Quality Assurance Officer, Monitoring and Data
Support Division, Office of Water Regulations and Standards (WH-553), 401
M Street, S.W., Washington, DC 20460, (202) 382-7040, (FTS) 382-7040.
James Brown. Office of Solid Waste (WH-565E), 401 M Street, S.W.,
Washington, DC 20460, (202) 475-7240, (FTS) 475-7240.
Byron Bunger. Office of Radiation Programs (ANR-461), 401 M Street, S.W.,
Washington, DC 20460, (202) 475-9644, (FTS) 475-9644.
Sharon Campfield. Health Effects Research Laboratory (MD-55A), Research
Triangle Park, NC 27711, (919) 541-3508, (FTS) 629-3508.
Joseph Carra. Waste Management Division, Office of Solid Waste (WH-565),
401 M Street, S.W., Washington, DC 20460, (202) 475-7276, (FTS) 475-7276.
10
-------
James E. Casey. Office of Standards and Regulations, Office of Policy,
Planning, and Evaluation (PM-223), 401 M Street, S.W., Washington, DC
20460, (202) 475-8664, (FTS) 475-8664.
Jim Cogliano. Cancer Assessment Group, Office of Health and Environmental
Assessment (RD-689), 401 M Street, S.W., Washington, DC 20460, (202) 382-
2575, (FTS) 382-2575.
Margaret G. Conomos. Exposure Evaluation Division, Office of Toxic
Substances (TS-798), 401 M Street, S.W., Washington, DC 20460, (202) 382-
3958, (FTS) 382-3958.
Jim Craig. Waste Management Division, Office of Solid Waste (WH-565), 401
M Street, S.W., Washington, DC 20460, (202) 382-3410, (FTS) 382-3410.
John Creason. Health Effects Research Laboratory (MD-55), Research
Triangle Park, NC 27711, (919) 541-2598, (FTS) 629-2598.
Thomas Curran. Office of Air Quality Planning and Standards (MD-14),
Research Triangle Park, NC 27711, (919) 541-5467, (FTS) 629-5467.
James M. Daley. Information and Regulations Systems Division, Office of
Standards and Regulations (PM-223), 401 M Street, S.W., Washington, DC
20460, (202) 382-2743, (FTS) 382-2743.
Dennis Ann Daniel. Manager, Technical Center, Washington Information
Center, 401 M Street, S.W., Washington, DC 20460, (202) 488-5955, (FTS)
488-5955.
John Davidson. Office of Policy Analysis, Office of Policy, Planning, and
Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202) 382-
5484, (FTS) 382-5484.
Linda DeLuise. Registrations Division, Office of Pesticide Programs
(TS-757C), 401 M Street, S.W., Washington, DC 20460, (703) 557-1900,
(FTS) 557-1900.
Susan Dillman. Exposure Evaluation Division, Office of Toxic Substances
(TS-798), 401 M Street, S.W., Washington, DC 20460, (202) 382-5375, (FTS)
382-5375.
Richard G. Eilers. Drinking Water Research Division, Water Engineering
Research Laboratory, 26 West Martin Luther King Drive, Cincinnati, OH
45268, (513) 569-7809, (FTS) 684-7809.
Evan J. Englund. Environmental Monitoring Systems Laboratory, P.O. Box
93478, Las Vegas, NV 89193-3478, (702) 798-2248, (FTS) 545-2248.
11
-------
Gary F. Evans. Environmental Monitoring Systems Laboratory (MD-56),
Research Triangle Park, NC 27711, (919) 541-3124, (FTS) 629-3124.
Robert B. Faoro. Office of Air Quality Planning and Standards (MD-14),
Research Triangle Park, NC 27711, (919) 541-5459, (FTS) 629-5459.
Jerzy A. Filar. Statistical Policy Branch, Office of Policy, Planning, and
Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202) 382-
2680, (FTS) 382-2680.
George T. Flatman. Environmental Monitoring Systems Laboratory, P.O. Box
93478, Las Vegas, NV 89193-3478, (702) 798-2628, (FTS) 545-2628.
John R. Fowle III. Office of Health Research, Office of Research and
Development (RD-683), 401 M Street, S.W., Washington, DC 20460, (202)
382-5895, (FTS) 382-5895.
Neil Frank. Office of Air Quality Planning and Standards (MD-14), Research
Triangle Park, NC 27711, (919) 541-5560, (FTS) 629-5560.
Mary Frankenberry. Exposure Evaluation Division, Office of Toxic Substan-
ces (TS-798), 401 M Street, S.W., Washington, DC 20460, (202) 382-3890,
(FTS) 382-3890.
George Garland. Chief, State Programs Branch, Office of Solid Waste
(WH-563), 401 M Street, S.W., Washington, DC 20460, (202) 382-2210, (FTS)
382-2210.
Alfred E. Goldman. Booz Allen and Hamilton, 400 Market Street, Phila-
delphia, PA 19106, (215) 627-8110.
Mark Goldstein. Information Resources Branch, Region III (3PM50), 841
Chestnut Street, Philadelphia, PA 19107, (215) 597-3604, (FTS) 597-3604.
Daniel G. Greathouse. Hazardous Waste Engineering Research Laboratory, 26
West Martin Luther King Drive, Cincinnati, OH 45268, (513) 569-7859,
(FTS) 684-7859.
Gary Forrest Grindstaff. Office of Toxic Substances (TS-798), 401 M Street,
S.W., Washington, DC 20460, (202) 382-3952, (FTS) 382-3952.
Thomas C. Harris. Benefits and Use Division, Office of Pesticide Programs
(TS-768C), 401 M Street, S.W., Washington, DC 20460, (703) 557-1616,
(FTS) 557-1616.
12
-------
Joseph Haseman. National Institute of Environmental Health Sciences,
P.O. Box 12233, Research Triangle Park, NC 27709, (919) 541-4996, (FTS)
629-4996.
L. Thomas Heiderscheit. Health Effects Research Laboratory (MD-55),
Research Triangle Park, NC 27711, (919) 541-2590, (FTS) 629-2590.
Richard C. Hertzberg. Environmental Criteria and Assessment Office, 26
West Martin Luther King Drive, Cincinnati, OH 45268, (513) 569-7582,
(FTS) 684-7582.
Matthew V. Hnatov. Statistics Section, Office of Water Regulations and
Standards (WH-586), 401 M Street, S.W., Washington, DC 20460, (202) 382-
5412, (FTS) 382-5412.
Karen Hogan. Exposure Evaluation Division, Office of Toxic Substances (TS-
798), 401 M Street, S.W., Washington, DC 20460, (202) 382-3895, (FTS) 382-
3895.
John W. Holley. Office of Mobile Sources (EN-397F), 401 M Street, S.W.,
Washington, DC 20460, (202) 382-2635, (FTS) 382-2635.
Thomas T. Holloway. Chief, Water Monitoring Section, Environmental
Support Division, Region VII, 25 Funston Road, Kansas City, KS 66115,
(913) 236-3884, (FTS) 757-3884.
Howard Howell. Administrative Systems Division, Office of Information
Resources Management (PM-218), 401 M Street, S.W., Washington, DC
20460, (202) 475-8287, (FTS) 475-8287.
Kimberly A. Hummel. Environmental Services Division, Region III (3ES11),
841 Chestnut Building, Philadelphia, PA 19107, (215) 597-3362, (FTS) 597-
3362.
William F. Hunt, Jr. Chief, Monitoring and Reports Branch, Office of Air
Quality Planning and Standards (MD-14), Research Triangle Park, NC
27711, (919) 541-5559, (FTS) 629-5559.
Barnes Johnson. Statistical Policy Branch, Office of Policy, Planning, and
Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202) 382-
2684, (FTS) 382-2684.
Henry D. Kahn. Chief, Statistics Section, Office of Water Regulations and
Standards (WH-586), 401 M Street, S.W., Washington, DC 20460, (202) 382-
5406, (FTS) 382-5406.
13
-------
Thomas E. Kelly. Director, Office of Standards and Regulations, Office of
Policy, Planning, and Evaluation (PM-223), 401 M Street, S.W., Washington,
DC 20460, (202) 382-4001, (FTS) 382-4001.
Kathleen Knox. Benefits and Use Division, Office of Pesticide Programs
(TS-768C), 401 M Street, S.W., Washington, DC 20460, (703) 557-1753,
(FTS) 557-1753.
Mel Kollander. Statistical Policy Branch, Office of Policy, Planning, and
Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202) 382-
2734, (FTS) 382-2734.
Stephen Kroner. Office of Water Regulations and Standards (WH-553), 401 M
Street, S.W., Washington, DC 20460, (202) 382-7051, (FTS) 382-7501.
Herbert Lacayo, Jr. Statistical Policy Branch, Office of Policy, Planning,
and Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202)
382-2714, (FTS) 382-2714.
Richard Levy. Hazard Evaluation Division, Office of Pesticide Programs (TS-
769C), 401 M Street, S.W., Washington, DC 20460, (703) 557-3715, (FTS)
557-3715.
Lloyd Lininger. Statistical Policy Branch, Office of Policy, Planning, and
Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202) 382-
2680, (FTS) 382-2680.
Arthur Lubin. Region V, 536 South Clark Street, Chicago, IL 60605, (312)
886-6226, (FTS) 886-6226.
Bruce Madariaga. Office of Air Quality Planning and Standards (MD-12),
Research Triangle Park, NC 27711, (919) 541-5290, (FTS) 629-5290.
Elizabeth Margosches. Exposure Evaluation Division, Office of Toxic
Substances (TS-798), 401 M Street, S.W., Washington, DC 20460, (202) 382-
3511, (FTS) 382-3511.
Ronald W. Matheny. Office of Research Program Management, Office of
Research and Development (RD-674), 401 M Street, S.W., Washington, DC
20460, (202) 382-7466, (FTS) 382-7466.
Craig J. McCormack. Office of Policy Analysis, Office of Policy, Planning,
and Evaluation (PM-220), 401 M Street, S.W., Washington, DC 20460, (202)
382-5873, (FTS) 382-5873.
14
-------
Robert E. McGaughy. Office of Health and Environmental Assessment, Office
of Research and Development (RD-689), 401 M Street, S.W., Washington, DC
20460, (202) 382-5898, (FTS) 382-5898.
Karen Milne. Exposure Evaluation Division, Office of Toxic Substances (TS-
798), 401 M Street, S.W., Washington, DC 20460, (202) 382-2263, (FTS) 382-
2263.
Lisa E. Moore. Office of Research and Development (MD-235), 26 West
Martin Luther King Drive, Cincinnati, OH 45268, (513) 569-7671, (FTS)
684-7671.
Oscar Morales. Information Policy Branch, Office of Policy, Planning, and
Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202) 382-
2738, (FTS) 382-2738.
Phillip R. Morgan. Pennsylvania Department of Environmental Resources,
Fulton Bank Building, 3rd & Locust Streets, Harrisburg, PA 17100, (717)
787-9641.
Bernard Most. Northrup Services, Inc., P.O. Box 12313, Research Triangle
Park, NC 27709, Health Effects Research Laboratory (MD-55), Research
Triangle Park, NC 27711, (919) 541-2390, (FTS) 629-2390.
Donald E. Myers. Department of Mathematics, University of Arizona,
Tucson, AZ 85721, (602) 621-6859.
Cornelius J. Nelson. Hazard Evaluation Division, Office of Pesticide
Programs (TS-769C), 401 M Street, S.W., Washington, DC 20460, (703) 557-
7398, (FTS) 557-7398.
William Nelson. Environmental Monitoring Systems Laboratory (MD-56),
Research Triangle Park, NC 27711, (919) 541-3184, (FTS) 629-3184.
Dean Neptune. Office of Acid Deposition, Environmental Monitoring, and
Quality Assurance, Office of Research and Development (RD-680), 401 M
Street, S.W., Washington, DC 20460, (202) 475-9464, (FTS) 475-9464.
Barry Nussbaum. Field Operations and Support Division, Office of Mobile
Sources (EN-397F), 401 M Street, S.W., Washington, DC 20460, (202) 382-
2637, (FTS) 382-2637.
Robert O'Brien. Statistical Policy Branch, Office of Policy, Planning, and
Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202) 475-
9659, (FTS) 475-9659.
15
-------
Thomas O'Farrell. Acting Director, Industrial Technology Division, Office of
Water Regulations and Standards (WH-552), 401 M Street, S.W., Washington,
DC 20460, (202) 382-7120, (FTS) 382-7120.
Susan Perlin. Office of Policy Analysis, Office of Policy, Planning, and
Evaluation (PM-220), 401 M Street, S.W., Washington, DC 20460, (202) 382-
5867, (FTS) 382-5867.
Dan Reinhardt. Exposure Evaluation Division, Office of Toxic Substances
(TS-798), 401 M Street, S.W., Washington, DC 20460, (202) 382-3585, (FTS)
382-3585.
Raymond C. (Rocky) Rhodes. Health Effects Research Laboratory (MD-55),
Research Triangle Park, NC 27711.
Lorenz Rhomberg. Office of Health and Environmental Assessment, Office of
Research and Development (RD-689), 401 M Street, S.W., Washington, DC
20460, (202) 382-5723, (FTS) 382-5723.
Wilson Riggan. Health Effects Research Laboratory (MD-55), Research
Triangle Park, NC 27711, (919) 541-7540, (FTS) 629-7540.
Reta Roe. Office of Emergency Planning and Response, Region VII, 25
Funston Road, Kansas City, KS 66115, (913) 236-3881, ext. 215, (FTS) 757-
3881, ext. 215.
Melinda Ronca-Battista. Radon Division, Office of Radiation Programs (ANR-
464), 401 M Street, S.W., Washington, DC 20460, (202) 475-9605, (FTS) 475-
9605.
N. Phillip Ross. Chief, Statistical Policy Branch, Office of Policy, Planning,
and Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202)
382-2680, (FTS) 382-2680.
John G. Schwemberger. Exposure Evaluation Division, Office of Toxic
Substances (TS-798), 401 M Street, S.W., Washington, DC 20460, (202) 382-
7195, (FTS) 382-7195.
Ronald W. Schafer. Office of Air Quality Planning and Standards, Office of
Air and Radiation (EN-341), 401 M Street, S.W., Washington, DC 20460,
(202) 382-2810, (FTS) 382-2810.
Cheryl Siegel Scott. Exposure Evaluation Division, Office of Toxic
Substances (TS-798), 401 M Street, S.W., Washington, DC 20460, (202) 382-
2282, (FTS) 382-2282.
16
-------
Bimal Sinha. Statistical Policy Branch, Office of Policy, Planning, and
Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202) 382-
2680, (FTS) 382-2680.
William P. Smith. Statistical Policy Branch, Office of Policy, Planning, and
Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202) 382-
2697, (FTS) 382-2697.
Nancy Sneath. Washington Information Center, 401 M Street, S.W., Washing-
ton, DC 20460, (202) 488-5921, (FTS) 488-5921.
John D. Spengler. Professor of Environmental Health, Harvard School of
Public Health, Harvard University, 665 Huntington Road, Boston, MA 02115,
(617) 732-1255.
Thomas H. Starks. Environmental Research Center, University of Nevada,
Las Vegas, NV 89154, (702) 739-0826.
James E. Stiebing. Chief, Surveillance Branch, Region VI (6E-S), 1445 Ross
Avenue, Suite 1200, Dallas, TX 75202, (214) 655-2284, (FTS) 255-2284.
Judy A. Stober. Health Effects Research Laboratory, 26 West Martin Luther
King Drive, Cincinnati, OH 45268, (513) 569-7379, (FTS) 684-7379.
Cindy Stroup. Exposure Evaluation Division, Office of Toxic Substances (TS-
798), 401 M Street, S.W., Washington, DC 20460, (202) 382-3886, (FTS) 382-
2886.
Clayton L. Stunkard. Statistical Policy Branch, Office of Policy, Planning,
and Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202)
382-3006, (FTS) 382-3006.
Jack Suggs. Environmental Monitoring Systems Laboratory (MD-77B),
Research Triangle Park, NC 27711, (919) 541-2791, (FTS) 629-2791.
David Svendsgaard. Health Effects Research Laboratory (MD-55), Research
Triangle Park, NC 27711, (919) 541-2468, (FTS) 629-2468.
Donald Thomsen. President, SIMS, 91 Parrish Road South, New Canaan, CT
06840.
Timothy Titus. Director, Chemical and Statistical Policy Division, Office of
Policy, Planning, and Evaluation (PM-223), 401 M Street, S.W., Washington,
DC 20460, (202) 382-4005, (FTS) 382-4005.
17
-------
John Warren. Statistical Policy Branch, Office of Policy, Planning, and
Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202) 382-
2683, (FTS) 382-2683.
Robert H. Wayland. Deputy Assistant Administrator, Office of Policy,
Planning, and Evaluation (PM-219), 401 M Street, S.W., Washington, DC
20460, (202) 382-4335, (FTS) 382-4335.
Dorothy Wellington. Statistical Policy Branch, Office of Policy, Planning,
and Evaluation (PM-223), 401 M Street, S.W., Washington, DC 20460, (202)
475-8204, (FTS) 475-8204.
J. Jacob Wind. American Management Systems, Inc., 1777 North Kent Street,
Arlington, VA 22209, (703) 841-6974.
Conference Participant
Conference Planning Committee Member
18
-------
ABSTRACTS
-------
SESSION: Communications Panel
TITLE: Movin' On: "Was It more fun as a manager or statistician?"
AUTHORS: William Hunt, Chief, Monitoring and Reports Branch, Office of Air
Quality Planning and Standards, Research Triangle Park, and Barry
Nussbaum, Chief, Operations and Compliance Policy Branch, Office of
Air and Radiation
One of the highlights of the 1987 EPA Conference on Statistics was a lively
panel discussion concerning communications between statisticians and high level
Agency managers. Several problem areas were identified for further exploration.
Among these were:
1. Few statisticians in EPA management positions.
2. Lack of growth in the number of statisticians within the Agency.
3. Lack of direct inquiry as to statistical validity of data — lack of feeling
that a statistician's views might be useful, not to mention essential.
4. Need for statisticians to join the "Tower of Babel," along with the other
professionals, so their concerns/views can be expressed to management. Is
management listening?
This year's communications panel will extend this discussion by centering on
EPA's statisticians who have attained various levels of management. They now
have both statistical and non-statistical responsibilities. While discussing the
above areas, they will address the following specific questions:
1. As you have progressed as a manager, have you been forced to put
statistical rigor in a lesser light?
2. Was it more fun as a manager or statistician?
3. How do you balance your statistical and programmatic responsibilities?
What do you tell your statistician subordinate when she or he insists on
more sampling and you can't get the answer to your boss on time?
4. How do you utilize your statistical background in your managerial role?
5. Now what do you think is the most effective role of a statistician?
6. How would you recommend that other statisticians pursue supervisory or
managerial positions?
19
-------
7. Has anything changed in the past year since three DAAs discussed their
perception of statistics?
8. Have you noticed a new quest for statistical validity?
The panel will include:
• Gerald Akland, Director, Monitoring and Assessment Division, Environ-
mental Monitoring Systems Laboratory, Research Triangle Park
• Joseph Carra, Director, Waste Management Division, Office of Solid
Waste
• William Hunt, Chief, Monitoring and Reports Branch, Office of Air
Quality Planning and Standards, Research Triangle Park
• Henry Kahn, Chief, Statistics Section, Office of Water Regulations and
Standards
• Barry Nussbaum, Chief, Operations and Compliance Policy Branch, Office
of Air and Radiation
Each panelist will have an opportunity to make opening remarks and then to
rebut remarks made by other panelists. This will be followed by discussion from
the audience.
20
-------
SESSION: The "Cheap" Statistician
TITLE: The "Cheap" Statistician: Where is S/He?
AUTHOR: Henry Kahn, Chief, Statistics Section, Office of Water Regulations and
Standards
The cost of statistical work is usually the focal point of the interaction
between managers and statisticians. This panel discussion between managers and
statisticians will deal with the costs of doing statistical work, the responsibilities
of statisticians and managers to see that appropriate cost and technical con-
siderations are incorporated into planning and design, the willingness of manage-
ment to pay for statistical work, doing statistical work cheaply, cost savings
through effective use of statistical design and analysis, and getting what you pay
for. The panel will include two managers and two statisticians. Each panel
member will make a presentation followed by panel discussion and open discus-
sion from the floor.
The panel will include:
• Thomas Curran, Statistician, Office of Air Quality Planning and Stan-
dards, Research Triangle Park
• Barnes Johnson, Statistician, Statistical Policy Branch, Office of Policy,
Planning, and Evaluation
• Thomas Kelly, Director, Office of Standards and Regulations, Office of
Policy, Planning, and Evaluation
• Thomas O'Farrell, Acting Director, Industrial Technology Division, Office
of Water Regulations and Standards, Office of Water
21
-------
SESSION: Data Quality Objectives
TITLE: Data Quality Objectives (DQOs): New OMB Requirements
AUTHORS: Oscar Morales, Information Policy Branch, Office of Policy, Planning,
and Evaluation, Dean Neptune, Office of Acid Deposition, Environ-
mental Monitoring, and Quality Assurance, Office of Research and
Development, and John Warren, Statistical Policy Branch, Office of
Policy, Planning, and Evaluation
The Office of Management and Budget (OMB) published in the Federal
Register, January 20, 1988, notice of Guidelines for Federal Statistical Activities:
"The Office of Management and Budget (OMB) is soliciting public
comment on a draft OMB circular that would revise government-wide
guidance for planning and conducting statistical surveys, publishing
statistical data, documenting statistical methods and procedures, and
using standard statistical classifications, definitions, and data sources.
The guidance, which applies to all Federal agencies subject to the
Paperwork Reduction Act of 1980, is intended to assure that the results of
statistical surveys and studies sponsored by the Federal government
are as reliable and useful as possible and that statistical activities are
conducted as efficiently as possible."
These guidelines will affect all statisticians in the Agency because:
"The circular would for the first time establish guidelines for
documenting all methods, procedures, and models used to produce
statistical estimates and would revise and strengthen existing guidance
on planning of statistical surveys, treatment of respondents, publica-
tion of statistical data, and use of standard statistical classifications,
definitions, and data sources."
Fortunately, the Agency has several mechanisms that will enable us to meet
(and in many cases exceed) the requirements outlined in the OMB guidelines.
Oscar Morales will explain the Information Clearance Request process as it
affects statisticians; and Dean Neptune will outline the Data Quality Objectives
program and where statisticians will be called upon to make their contribution.
There will be a discussion period at the end of the presentation and comments
received will be forwarded to OMB as part of the Agency's Official Comment.
Attached is a copy of the Federal Register containing OMB's "Guidelines for
Federal Statistical Activities."
22
-------
Wednesday
January 20, 1988
Part ir
Office of Management and Budget
Guidelines for Federal Statistical
Activities; Notice
22a
-------
1542
Federal Register / Vol. 53, No. 12 / Wednesday, January 20, 1988 / Notices
OFFICE OF MANAGEMENT AND
BUDGET
Guidelines for Federal Statistical
Activities
AGENCY: Office of Management and
Budget.
ACTION: Notice of a draft circular
establishing guidelines for Federal
statistical activities.
SUMMARY: The Office of Management
and Budget (OMB) is soliciting public
comment on a draft OMB Circular that
would revise government-wide guidance
for planning and conducting statistical
surveys, publishing statistical data,
documenting statistical methods and
procedures, and using standard
statistical classifications, definitions,
and data sources. The guidance, which
applies to all Federal agencies subject to
the Paperwork Reduction Act of 1980, is
intended to assure that the results of
statistical surveys and studies
sponsored by the Federal government
are as reliable and useful as possible
and that statistical activities are
conducted as efficiently as possible.
DATE: Comments must be received on or
before April 19, 1988.
ADDRESS: Comments are invited on any
aspect of the Circular. They should be
made in writing and sent to Dorothy M.
Tella, Office of Management and
Budget, Room 3001, New Executive
Office Building, Washington, DC 20503.
The comments will be available for
public examination at this address.
FOR FURTHER INFORMATION CONTACT:
Dorothy M. Tella, (202) 395-3093.
SUPPLEMENTARY INFORMATION: Statistics
collected and published by the Federal
government constitute a large part of the
available information about the United
States economy, population, natural
resources, environment, and public and
private institutions. These data are used
by the Federal government and others
as the basis for actions that affect
people's lives and well-being. It is
essential that they be collected,
processed, and published in a manner
that guarantees and inspires confidence
in their reliability. The statistical
programs of the Federal government are
decentralized among seventy or more
agencies or separate departmental units.
It is therefore also essential that, to the
extent permitted by law, there be
sufficient government-wide uniformity
in statistical methods and practices to
ensure the maximum usefulness of the
statistics produced.
The Paperwork Reduction Act of 1980,
as amended (44 U.S.C. 3504), gives the
Director of OMB broad responsibility for
improving the usefulness of information
collected, maintained, and disseminated
by the Federal government and for
reducing the Federal government's
reporting burden on the public. Among
the Director's functions under the Act
are statistical policy and coordination
functions, which include the
development and implementation of
policies, principles, standards, and
guidelines concerning statistical
collection procedures and methods,
statistical data classification, statistical
information presentation and
dissemination, and such statistical data
sources as may be required for the
administration of Federal programs.
This Circular provides revised
guidance for designing, conducting, and
publishing statistical surveys and
studies sponsored by Federal agencies.
The guidelines are intended to ensure
that such surveys and studies are
designed to produce reliable data as
efficiently as possible and that methods
are documented and results presented in
a manner that makes the data as
accessible and useful as possible. The
Circular would also establish guidelines
for the use of standard classifications,
definitions, and data sources. The
Circular would rescind and replace
guidance on the conduct of Federal
statistical activities currently contained
in 19 statistical policy directives.
(Section 2 of the Circular lists the
rescinded directives.) The Circular
would, however, leave in place Directive
No. 19. "Reports of the Department of
Commerce on International
Transactions" (43 FR 19272, May 4.
19781.
The Circular would for the first time
establish guidelines for documenting all
methods, procedures, and models used
to produce statistical estimates and
would revise and strengthen existing
guidance on planning of statistical
surveys, treatment of respondents,
publication of statistical data, and use of
standard statistical classifications,
definitions, and data sources. The
Circular would discontinue certain
classifications and definitions as
government-wide statistical standards,
as indicated below.
The attachments to the Circular
address the following subjects:
Planning Statistical Surveys. The
OMB paperwork regulation (5 CFR Part
1320) requires that when agencies seek
OMB approval to collect information,
they demonstrate that they have taken
reasonable steps to ensure that the
information is useful and that the cost
and burden of collecting it have been
minimized. Attachment A of the Circular
specifies the documentation that the
sponsoring agency shall include in its
request for OMB approval of a
statistical survey to demonstrate that
the survey is designed efficiently and
will produce reliable, useful results.
Treatment of Respondents. The
guidelines in Attachment B are intended
to reassure respondents to statistical
surveys that the Federal government is
dealing with them honestly and
forthrightly and that their interests are
being protected. For this purpose,
agencies that sponsor statistical surveys
should provide certain information to
potential respondents about the purpose
of each survey and the planned use of
the survey data, including any matching
or combination of individual respondent
data with data from administrative
sources. Any such match or combination
should meet the conditions specified in
section 3.c. of the attachment. Matches
for statistical purposes, to which the
guidance in Attachment B applies, are
not covered by the OMB Guidelines for
conducting computerized matching
programs (47 FR 21656, May 19, 1982).
Agencies should take the specific steps
outlined in Attachment B to ensure the
protection from public disclosure of
information collected under a pledge of
confidentiality. Attachment B would
also establish certain design guidelines
for statistical surveys conducted under
mandatory reporting authority, in order
to minimize the burden of such surveys
on individual respondents.
Statistical Publications. Attachment C
contains guidelines for the presentation
and documentation of the results of
statistical surveys and studies. To
ensure that other agencies and the
public have an opportunity to verify and
use the results of all Federally-
sponsored statistical surveys and
studies, the sponsoring agency should
either publish the results or maintain
them in an accessible data base such
that requests for summary data or
tabulations can be met within 90 days.
This guidance is based on the
presumption that data collected for
statistical purposes have practical
utility, as defined in 5 CFR Part 1320,
only to the extent that they are
accessible to potential users within and
outside the Federal government. In
deciding whether to publish the results
or else to maintain them in an accessible
data base, agencies will be expected to
conform to the policies on the
dissemination of government
information products and services
established in section 8.a.(9) of OMB
Circular A-130, Management of Federal
Information Resources (50 FR 52730,
December 24, 1985).
Documentation of Methods and
Procedures. Attachment D contains
guidelines under which agencies would
maintain publicly available
documentation of all statistical methods,
procedures, and models used to produce
statistical data and estimates, thereby
enabling users to make informed,
independent judgments about the
quality of data and estimates and to
verify that they have been produced by
sound, replicable methods.
Compilation, Release, and Evaluation
of Principal Federal Economic
Indicators. Attachment E contains
guidelines for the compilation, release,
and evaluation of data series that have
been designated as principal economic
indicators by the Director of OMB. The
provisions in this attachment are the
same as those in Statistical Policy
Directive No. 3 (50 FR 38932, September
25, 1985), except for the addition, in
Section 8, of the provision that agencies
inform the public of the uncertainty or
probable range of error in preliminary
estimates of economic indicators.
Use of Standard Classifications, Data
Sources, and Definitions. Attachment F
establishes, and prescribes the uses of,
certain standard statistical
classifications, data sources, and
definitions, which have all been
previously established in OMB
Statistical Policy Directives. Five
classifications or definitions would be
discontinued as government-wide
statistical standards either because the
current standards have not proven
useful for the statistical purposes
intended (standard Federal
administrative regions, the standard
industrial classification of enterprises,
and the standard reference base period
for Federal government general-purpose
index numbers) or because they are
used by only one or two agencies to
collect and publish statistics (the
standard classification of fields of
science and engineering and the
standard gas pressure base). In the
latter case, OMB believes it is more
practical for the principal user agencies
to maintain the standards. The Circular
makes it clear that the definitions of
Metropolitan Statistical Areas (MSAs),
the Standard Industrial Classification
(SIC), and the Standard Occupational
Classification (SOC) are established and
maintained by OMB solely for statistical
purposes and that agencies that use
these statistical standards in
nonstatistical programs bear the
responsibility for assuring that the
standard definitions or classifications
are appropriate for those uses.
Provision of Statistical Data to
International Organizations.
Attachment G provides guidance on the
implementation of Executive Order
10033.
In several cases, statistical activities
covered by this Circular are also
covered by other OMB Circulars. In such
cases, this Circular supplements or
clarifies the guidance in the other
Circulars as it relates to statistical
collections and publications. The
specific instances of each are noted in
the appropriate portions of the Circular.
Wendy L. Gramm,
Administrator for Information and Regulatory
Affairs.
OMB Circular No. A-
To the Heads of Executive Departments
and Establishments.
Subject: Guidelines for Federal
Statistical Activities.
1. Purpose. This Circular revises
government-wide guidance for planning
and conducting statistical surveys,
publishing statistical data, and
documenting statistical methods and
procedures. These guidelines are
intended to assure that all statistical
surveys and studies sponsored by the
Federal government produce as accurate
and useful information as possible,
serve their purposes as efficiently as
possible, and impose no unnecessary
burden on respondents. The Circular
also prescribes four standards—a
standard data source for population
estimates, a standard data source for
labor force and unemployment
estimates, standard categories for
reporting race and ethnic background,
and a standard definition of poverty—to
be used in the administration of Federal
programs, consistent with statutory
requirements. It also clarifies the
responsibilities of agencies that use the
Standard Industrial Classification (SIC),
the Standard Occupational
Classification (SOC), and the standard
definitions of Metropolitan Statistical
Areas (MSAs) for nonstatistical
purposes.
2. Rescissions. This Circular rescinds
and replaces Statistical Policy
Directives Nos. 1-2 and 5-18 (43 FR
19260, May 4, 1978); Statistical Policy
Directive No. 3 (46 FR 3253, January 14,
1981), as revised (50 FR 38932,
September 25, 1985); and the directive
entitled "Comparability of Statistics on
Business Size" (47 FR 21382, May 18,
1982).
3. Authority. The Paperwork
Reduction Act of 1980, as amended (44
U.S.C. 3504); the Budget and Accounting
Procedures Act of 1950, as amended (31
U.S.C. 1104); Executive Order 10253 of
June 11, 1951, as amended (see 31 USCA
1104); and Executive Order 10033 of
February 8, 1949, as amended (see 22
USCA 286F).
4. Background. The Paperwork
Reduction Act grants the Director of the
Office of Management and Budget
(OMB) broad authority to develop and
implement government-wide policies,
principles, standards, and guidelines
concerning statistical collection
procedures and methods, statistical data
classifications, statistical information
presentation and dissemination, and
statistical data sources required for the
administration of Federal programs. The
Act also reassigns to the Director the
authority in the Budget and Accounting
Procedures Act to develop programs and
issue regulations and orders for the
improved gathering, compiling,
analyzing, publishing, and disseminating
of statistical information for any
purpose by the various agencies of the
executive branch of the Federal
government. Since the Paperwork
Reduction Act took effect in 1981, OMB
has provided government-wide guidance
on collecting and publishing statistical
data in 19 statistical policy directives:
Directives Nos. 1-2 and 5-19 (43 FR
19260, May 4, 1978); Directive No. 3 (46
FR 3253, January 14, 1981), as revised (50
FR 38932, September 25, 1985); and a
directive establishing standard business
size categories for statistical purposes,
"Comparability of Statistics on Business
Size" (47 FR 21382, May 18, 1982). This
Circular rescinds and replaces all of
these directives except Directive No. 19,
"Reports of the Department of
Commerce on International
Transactions" (43 FR 19272, May 4,
1978).
OMB information collection reviews
and other evaluations of statistical
activities and publications have pointed
to a need for more explicit guidance to
agencies on (1) planning statistical
surveys so that they meet the
requirements set forth in 5 CFR Part 1320
for OMB approval under the Paperwork
Reduction Act; (2) treating respondents
to statistical surveys in a manner that
ensures the public's continued
willingness to provide accurate, timely
information to the Federal government
for statistical purposes; and (3)
documenting statistical surveys and
studies in such a way as to make their
results as useful as possible, minimize
the risk that statistics may be
misinterpreted, and ensure the public
that estimates have been produced by
sound, replicable methods. The Circular
establishes guidelines on these aspects
of statistical work, as well as on the use
of standard classifications, data sources,
and definitions for statistical and
administrative purposes; on the
compilation, release, and evaluation of
principal Federal economic indicators;
and on the provision of statistical data
to international organizations,
consistent with the requirements of
Executive Order 10033.
5. Coverage. The provisions of the
Circular apply to all Federal agencies
subject to the Paperwork Reduction Act
of 1980 (see 5 CFR 1320.7(a)), to all
statistical surveys and studies
sponsored by those agencies, and to all
publications resulting from such surveys
and studies.
6. Agency Implementation. This
Circular provides policy guidance on
statistical issues. This guidance should
be applied to the extent permitted by the
laws governing the agency's actions.
7. Definitions. For the purposes of this
Circular:
a. "Benchmarking" is the
reconciliation of one estimate with
another that is thought to be more
accurate.
b. "Data" is used interchangeably
• with "information", as defined in 5 CFR
1320.7(k), when referring to information
collected in or resulting from statistical
surveys and studies.
c. An "estimate" is a numerical value
or relationship that is derived from a
survey, model, other statistical or
mathematical procedures, or
professional judgment.
d. "Imputation" means assigning
estimated values to fill in missing data
on individual statistical records or to
replace data supplied by respondents
that an agency believes to be in error.
Changes to correct obvious coding or
recording errors made by the agency are
not imputations. Filling in missing data
with data that the same respondent has
supplied on another statistical or
administrative record is not defined as
imputation if the data were supplied in
response to the same question for the
same time period.
e. "Publication" means any release of
statistical data or results of a statistical
study for distribution or sale to the
public on paper, computer tape, disk, or
any other semipermanent medium, or by
means of electronic data bases.
f. A Federal agency is the "sponsor"
of a statistical survey or study if:
(1) That agency conducts the survey
or study using funds appropriated to it
or available for discretionary use
through other means;
(2) That agency provides funds
appropriated to it to another Federal
agency, other government organization,
or a private contractor to conduct the
survey or study; or
(3) The survey or study is conducted
under a grant from or cooperative
agreement with that agency and:
(A) The grantee or cooperating party
is conducting the survey or study at the
specific request of the agency for the
planning, operation, or evaluation of its
programs; or
(B) The terms and conditions of the
grant or agreement provide for prior or
ongoing agency approval of the
collection of information or the
procedures to be used in the survey or
study.
g. "Statistics are the quantitative
results of a survey or study. Statistics
include both aggregate estimates and
the elements of individual data records.
h. Information collected for
"statistical purposes" is information
collected for the purpose of reporting
population characteristics, developing
statistical procedures, or constructing
sampling frames. Information collected
for any other purpose is for
"nonstatistical purposes".
i. "Statistical study" means any study
that makes use of a survey, model, or
statistical or mathematical procedure or
of estimates derived by these means.
j. "Statistical survey" means a
collection of data from or about any
population or group for the purpose of
studying characteristics of the
population or group. Both collections of
data gathered directly from respondents
in a census or sample of the study
population and compilations of data
from administrative records for
statistical purposes are defined as
statistical surveys.
8. Contents. The guidance provided by
this Circular is set forth in the
attachments:
Attachment A—Planning of statistical
surveys
Attachment B—Treatment of
respondents
Attachment C—Statistical publications
Attachment D—Documentation of
methods and procedures
Attachment E—Compilation, release,
and evaluation of principal Federal
economic indicators
Attachment F—Use of standard
classifications, data sources, and
definitions
Attachment G—Provision of statistical
information to international
organizations
9. Submission of Agency Plan. Within
120 days of the publication of this
Circular, the head of each agency shall
submit to the Director a plan, including
completion dates, for bringing all the
agency's programs into compliance with
the guidelines in the Circular. Where
appropriate, these plans shall be
coordinated with the anticipated
schedule for paperwork clearance
reviews.
10. Judicial Review. This Circular is
not intended to create any right or
benefit, substantive or procedural,
enforceable at law by a party against
the United States, its agencies, its
officers, or any person.
11. Information Contact. Chief,
Statistical Policy Office, Office of
Information and Regulatory Affairs.
Telephone: (202) 395-3093.
ATTACHMENT A—Circular No. A-
Planning of Statistical Surveys
1. This attachment outlines the
documentation that is to accompany
statistical surveys when they are
submitted to OMB for approval under
the Paperwork Reduction Act, in order to
demonstrate that the surveys meet the
relevant requirements of the Act and
its implementing regulation, 5 CFR Part
1320. It replaces guidance on the
planning of statistical surveys
previously contained in Statistical
Policy Directive No. 1, "Standards for
Statistical Surveys," which is rescinded
by this Circular.
2. With every request for OMB
approval of a statistical survey under
the Paperwork Reduction Act, the
sponsoring agency shall submit
supporting documentation to
demonstrate that the survey has a useful
purpose and is designed to accomplish
that purpose as efficiently as possible.
That documentation shall include the
following information:
a. The analytical purpose of the
survey. The documentation shall state
specifically the analytical problem or
research question the survey is expected
to solve or answer and shall explain the
role the survey is expected to play in the
analysis. If the analysis calls for any
individual match or combination of data
collected directly from respondents in
the survey with data from
administrative sources, the
documentation shall fully describe the
purpose of and plans for the match or
combination. Any such match or
combination should comply with the
provisions in Attachment B. Section 3.c.
When requesting OMB approval of a
pretest, the sponsoring agency shall
specify the aspect of the survey that is
to be pretested, the design features of
the actual survey that will be
determined by the results of the pretest,
and the agency's plans and timetable for
evaluating the pretest results.
b. The statistical objectives of the
survey. The documentation shall specify
the population to be investigated, the
variables for which data are to be
gathered, the parameters to be
estimated on the basis of survey data,
and the accuracy requirements for these
estimates. If the survey is to investigate
relationships among variables,
differences among populations, or
changes over time, the documentation
shall indicate the necessary accuracy of
the estimates of these relationships,
differences, or changes.
c. The need for a survey. For any
survey that involves a new collection of
data from the public, the sponsoring
agency shall describe its efforts to
obtain suitable data from other sources,
including existing surveys,
administrative records, and model-
based estimates, and explain why such
alternatives were rejected.
d. The feasibility of the survey. The
sponsoring agency shall submit
documentation to demonstrate that the
desired information exists and that it
can be obtained through a survey in a
sufficiently timely manner and at a
sufficient level of accuracy and detail to
serve the analytical purposes of the
survey.
e. Justification of the proposed
frequency of the survey. If the survey is
to be recurring, the sponsoring agency
shall submit documentation to
demonstrate that:
(1) Measurable changes in the
phenomena being studied are expected
to occur in the proposed interval
between data collections and that such
changes need to be estimated to fulfill
the analytical purpose of the survey;
(2) Without a survey at the proposed
frequency, changes during this interval
could not be estimated with an
acceptable degree of accuracy; and
(3) The sponsoring agency is able and
intends to collect, process, and publish
data promptly if collection is at the
frequency proposed.
f. The survey design. The sponsoring
agency shall submit documentation to
demonstrate that:
(1) The survey will collect all data not
available from other sources that are
necessary for the analyses the survey is
intended to support.
(2) The survey is designed to satisfy
the accuracy requirements of the
analyses it is intended to support. The
documentation shall include estimates
of the variance of key parameters,
including composites and projections, to
demonstrate that they are likely to be
within acceptable limits. If the survey is
intended to measure relationships
among variables, differences among
populations, or changes over time, the
documentation shall also include
calculations to demonstrate that the
relationships, differences, or changes of
interest can probably be measured with
the precision required. If the sample
design calls for nonprobability sampling,
the documentation shall explain the
basis for statements about the accuracy
of the estimates.
(3) The survey design is operationally
feasible. The sponsoring agency shall
demonstrate that it is able to carry out
the survey as designed.
(4) Adequate steps have been taken to
minimize the impact of nonsampling
error on the estimates to be derived
from survey data. The documentation
shall identify the potential sources of
error and give the sponsoring agency's
projection of the size of such errors and
their impact, individually and in the
aggregate, on the estimates. All potential
sources of nonsampling error shall be
analyzed, including:
(A) Any differences between the
target population (the universe of study)
and the sampling frame;
(B) Any conceptual differences
between the parameters to be estimated
on the basis of survey data and the
parameters desired for the planned
analyses;
(C) In statistical studies that involve
control groups, dissimilarities between
the study group and the control group;
and
(D) The expected extent and impact of
overall and item nonresponse. The
documentation shall include estimates
of response rates based on the
sponsoring agency's prior experience (or
the experience of the organization
conducting the survey) with the same
survey or similar surveys. If the agency
anticipates significantly different rates
of nonresponse for different subgroups
of the sampled population or for
different questions in the survey, it shall
provide separate estimates for all such
subgroups and questions. If timeliness of
response is important (for example, if
the sponsoring agency has specified cut-
off dates for publishing or using data), it
shall indicate the expected response
rates by the relevant cut-off dates.
Under 5 CFR 1320.8(8), statistical
surveys must be designed to produce
results that can be generalized to the
universe of study. Accordingly, the
sponsoring agency shall submit
documentation to demonstrate either
that nonsampling error is sufficiently
small that it is unlikely to bias estimates
derived from survey data, or that
effective methods have been developed
to adjust for such error. All
benchmarking, imputation, and other
adjustment methods shall be described
in the documentation.
(5) The proposed design satisfies the
survey objectives in a way that
minimizes the burden on respondents,
consistent with sound administrative
practices and reasonable cost to the
government. In estimating the burden on
respondents, agencies shall take into
account the amount of time it will take
to provide proper responses and the cost
of the time of the particular individuals
who will be asked to respond, including
any time that will be required by agency
follow-up. Before submitting a survey to
OMB for approval, the sponsoring
agency shall analyze the benefits and
costs of a range of possible design
options, including alternative sample
sizes and data collection techniques.
The documentation supporting the
request for approval shall summarize
the results of this analysis.
(6) The survey meets the conditions
set forth in Attachment B, Section 4,
whenever response to the survey is
mandatory. The documentation for
surveys that use mandatory reporting
authority shall also provide a
justification of the use of such authority
in terms of its effectiveness in meeting
the survey's objectives. The
documentation shall include the
sponsoring agency's plans for enforcing
the penalties for nonresponse.
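Paragraph f.(2) above asks for calculations showing that the relationships or differences of interest can be measured with the required precision under the proposed design. As an illustration only, and assuming a simple random sample with entirely hypothetical design figures, the Python sketch below compares the 95 percent margin of error of an estimated difference between two subpopulation proportions with a stated accuracy requirement. Designs with clustering or unequal weights would need design-based variance formulas instead of the simple binomial ones used here.

# Illustrative sketch only (hypothetical design figures): checking whether a
# proposed simple random sample is likely to meet a stated accuracy
# requirement for an estimated difference between two subpopulations.
import math

p1, p2 = 0.30, 0.25        # anticipated proportions in the two subpopulations
n1, n2 = 1200, 1200        # proposed sample sizes
required_margin = 0.05     # required half-width of a 95 percent interval

se_diff = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
margin_95 = 1.96 * se_diff

print(f"standard error of difference: {se_diff:.4f}")
print(f"95 percent margin of error:   {margin_95:.4f}")
print("meets requirement" if margin_95 <= required_margin else "enlarge sample")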
g. Performance and Quality Measures.
The documentation shall specify what
measures the sponsoring agency will use
to evaluate the performance of the
survey and the quality of the data
collected. It shall list the performance
indicators that will be calculated and
available after completion of the survey,
such as:
(1) Nonresponse rates;
(2) Rates of edit failure;
(3) Percentage of cases requiring
follow-up or reinterview; and
(4) Timeliness measures, such as the
number of days required to collect data
from respondents and the number of
days between the reference date of the
survey and the date survey results are
published.
The documentation shall also include
a description of the information
validation techniques and quality
control procedures that will be used to
verify that data in publications or in
final data bases are equivalent to the
data actually collected.
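As an illustration only, the Python sketch below computes the kinds of performance indicators listed in paragraph g. from basic survey tallies; all counts and dates are hypothetical.

# Illustrative sketch only (hypothetical counts and dates).
from datetime import date

sample_size          = 2500    # eligible units selected for the survey
completed_cases      = 2125    # usable responses received
records_edited       = 2125
records_failing_edit = 170
cases_followed_up    = 300

reference_date   = date(1988, 1, 1)     # reference period of the survey
publication_date = date(1988, 4, 15)    # date results are published

nonresponse_rate    = 1 - completed_cases / sample_size
edit_failure_rate   = records_failing_edit / records_edited
followup_percentage = 100 * cases_followed_up / sample_size
days_to_publication = (publication_date - reference_date).days

print(f"nonresponse rate:          {nonresponse_rate:.1%}")
print(f"edit failure rate:         {edit_failure_rate:.1%}")
print(f"cases requiring follow-up: {followup_percentage:.1f}%")
print(f"days from reference date to publication: {days_to_publication}")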
h. Disclosure control techniques. The
documentation shall state what
disclosure control techniques are to be
used in each kind of release of survey
data (e.g., summary tabulations and
microdata products). If variables or
table cells are to be suppressed, the
documentation shall indicate what
variables will be suppressed and what
cross classifications are likely to be lost.
The description of disclosure control
techniques shall be general enough so
that it cannot be used to breach the
protection against disclosure. The
documentation shall report the
sponsoring agency's efforts to minimize
the impact of its disclosure control
techniques on the usefulness of the
survey data.
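One of the simpler disclosure control techniques contemplated in paragraph h. is the suppression of table cells with too few respondents. The Python sketch below is illustrative only; the threshold and the tabulation are hypothetical. A production system would also apply complementary suppression so that suppressed cells cannot be recovered by subtraction from published totals.

# Illustrative sketch only (hypothetical tabulation): primary suppression of
# table cells whose respondent counts fall below a minimum threshold.
MIN_CELL_COUNT = 3   # hypothetical suppression threshold

table = {
    ("Region I",  "small establishments"): 42,
    ("Region I",  "large establishments"): 2,    # too few respondents
    ("Region II", "small establishments"): 17,
    ("Region II", "large establishments"): 9,
}

published = {
    cell: (count if count >= MIN_CELL_COUNT else None)   # None printed as "(D)"
    for cell, count in table.items()
}

for cell, count in published.items():
    print(cell, "(D)" if count is None else count)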
i. Quality standards for publishing
data. The documentation shall specify
what quality standards the sponsoring
agency will use to determine whether
data from the survey are publishable or
should be suppressed.
j. Processing and reporting of survey
data. The documentation shall describe:
(1) The editing and imputation
procedures to be used in processing the
survey data;
(2) The procedures to be used in
preparing estimates based on survey
data, including benchmarking and
seasonal adjustment methods and
methods for creating composite
variables; and
(3) The contents of all planned
products of the survey, including
tabulations to be published, analytical
reports to be prepared by the sponsoring
agency, and public use data products,
with the dates that the sponsoring
agency intends to release these
products. If the data to be gathered in
the survey are likely to be of substantial
interest for research purposes, the
sponsoring agency should plan to
produce a public-use data file. The data
file should be offered on a cost-recovery
basis, i.e., with the fees for purchase or
use of the file set so as to recover the
agency's incremental cost of preparing,
maintaining, and distributing the file, as
provided in OMB Circular A-25. The
agency's documentation shall indicate
whether the agency plans to release a
public-use data file in conformity with
the provisions in Attachment C, Section
2.d. The documentation shall also
provide a schedule for such release.
k. The basis for any pledge of
confidentiality. If the sponsoring agency
intends to make a pledge of
confidentiality covering information
collected in the survey, the
documentation shall include the
statement required by Attachment B,
Section 3.b.(1) of this Circular, and shall
indicate whether the agency has
complied with all the other provisions of
Attachment B, Section 3.a. and 3.b.
ATTACHMENT B—Circular No. A-
Treatment of Respondents
1. This attachment sets forth
guidelines for the treatment of
respondents to statistical surveys, and
replaces guidance on this subject
previously contained in section 8 of
Statistical Policy Directive No. 1. The
guidelines in this attachment are
independent of obligations imposed on
agencies by the Privacy Act and apply
regardless of whether the record
systems supporting an agency's
statistical programs are exempt from
certain provisions of that Act under 5
U.S.C. 552a(k). Moreover, these
guidelines are to be applied in addition
to all relevant provisions of 5 CFR Part
1320. The guidelines for matching or
combining statistical and administrative
information (Sections 2.d.(3) and 3.c.)
apply only to matches for statistical
purposes and therefore do not overlap
the OMB Guidelines for conducting
computerized matching programs (47 FR
21656, May 19, 1982), which specifically
exclude from their coverage matches for
statistical purposes.
2. Informing respondents. Agencies
that sponsor statistical surveys should
ensure that potential respondents are
provided the following information at
the time they are asked to participate in
such surveys:
a. The names of all sponsors of the
survey, including organizations other
than Federal agencies that are providing
funding for the survey;
b. The subjects about which
respondents will be asked to supply
information, in sufficient detail to alert
respondents to any sensitive topics in
the survey;
c. For surveys in which information is
to be collected periodically from the
same respondents, the period during
which and the frequency with which the
respondent will be asked to supply
information;
d. The uses that are intended to be
made of information from the survey
(e.g., to publish statistics on a stated
subject or to carry out a particular
study). The sponsoring agency should
explicitly state:
(1) Whether it intends to use the
information exclusively for statistical
purposes. In cases for which that is the
intent, the sponsoring agency should
explicitly state the measures it has
taken and its legal authority to prevent
nonstatistical uses;
(2) Whether a pledge of
confidentiality, made in accordance
with the requirements of Section 3.b. of
this attachment, covers the information
collected and the fact of the
respondent's participation in the survey;
and
(3) Whether it plans to match or
combine survey data with information
about survey respondents from
administrative sources. If any individual
match or combination of survey data
and administrative information is
intended, the sponsoring agency should
explain its nature and purpose to
potential respondents; and
e. If the survey uses mandatory
reporting authority, a full quotation of
the relevant text of the statute(s) and/or
regulation(s) that establish the
mandatory reporting authority and the
penalties for nonresponse.
3. Protection of confidentiality and
privacy.
a. A Federal agency that collects
information for statistical purposes
under a pledge of confidentiality should
protect that information from public
disclosure to the extent permitted by
law. Measures to provide this protection
should include at a minimum:
(1) Written policies that proscribe the
use of such information for
nonstatistical purposes;
(2) Written policies and procedures
for ensuring the physical security of the
information while in the possession of
the agency, its contractors, or its
grantees. Where appropriate, these
should include policies implementing the
provisions of Appendix III of OMB
Circular A-130 (50 FR 52730, December
24, 1985), on the Security of Automated
Information Systems;
(3) A program to ensure that the
agency's employees, contractors, and
grantees are fully aware of their
responsibilities for the safeguarding of
confidential information; and
(4) Such other policies, programs, or
agency rules as are available and
necessary to protect from public
disclosure respondent information that
the agency believes qualifies for
exemption from disclosure under the
Freedom of Information Act (5 U.S.C.
552).
Policies, programs, or rules
established for this purpose should
eliminate any legal or administrative
discretion otherwise retained by the
agency to make public disclosures that
are not consistent with the provisions of
this section. Before submitting to OMB
for review any proposed information
collection for statistical purposes that is
to be covered by a pledge of
confidentiality, the sponsoring agency
shall have resolved any legal questions
affecting its ability to prevent public
disclosure of the information.
b. To ensure consistent and accurate
public understanding of pledges of
statistical confidentiality, no agency
should make such a pledge of
confidentiality unless:
(1) It has provided a statement to
OMB that it believes the information
covered by the pledge is exempt from
disclosure under the Freedom of
Information Act, indicating upon what
exemption it relies and, if the exemption
is by statute, citing the relevant statute;
(2) It has met the requirements of
Section 3.a. of this attachment for
protecting confidentiality; and
(3) It includes in the information
collection the following uniform pledge
of confidentiality:
This information collection conforms to
legal and administrative standards
established by the Federal government to
assure confidential treatment of statistical
information. The information you provide will
be used only for statistical purposes and will
not be published or released in any form that
would reveal specific information reported by
any individually identifiable respondent. The
[name of sponsoring agency or department]
has determined that the information you
provide, as well as the fact that you have
participated in this survey, is exempt from
public disclosure under the Freedom of
Information Act.
In any case where potential respondents
might misinterpret agency statements
concerning statistical use or" confidential
treatment, OMB may require the
sponsoring agency to display on
information collection forms, or
otherwise convey to potential
respondents, an explicit statement of
noncompliance with the confidentiality
provisions of this Circular.
c. If information is collected for
statistical purposes, as stated to
respondents pursuant to Section 2.d.(1)
of this attachment, any individual match
or combination of that information with
information from administrative records
should meet the following conditions:
(1) The match or combination is
exclusively for statistical purposes;
(2) The sponsor of each statistical
survey involved in the match or
combination has informed all
respondents as required in Section
2.d.(3) of this attachment; and
(3) Any match or combination
involving a statistical survey is
conducted as described in the
sponsoring agency's request for OMB
approval of the survey, submitted in
accordance with Attachment A, Section
2.a. of this Circular.
4. Designing and conducting surveys
that use mandatory reporting authority.
Unless there is a statutory
requirement that the survey be designed
otherwise, any statistical survey that
uses mandatory reporting authority
should meet the following conditions:
a. The survey is designed so that all
units in the population of study, or
within each designated sampling
stratum of the population, have an equal
probability of being included in the
survey;
b. If the survey requires periodic
responses by the same respondents, the
sample is redrawn at regular intervals
not to exceed 3 years; and
c. The information that respondents
are required to provide includes only
factual information that can be obtained
from and verified by respondents'
records.
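Condition 4.a. above requires equal selection probabilities within each designated stratum. A minimal Python sketch of such a selection, using a hypothetical frame and hypothetical stratum sample sizes, is given below for illustration only.

# Illustrative sketch only (hypothetical frame): drawing a simple random
# sample within each designated stratum so that every unit in a stratum has
# the same probability of selection.
import random

random.seed(0)   # fixed seed so the sketch is reproducible

frame = {
    "stratum 1": [f"unit-1-{i}" for i in range(200)],
    "stratum 2": [f"unit-2-{i}" for i in range(50)],
}
sample_sizes = {"stratum 1": 20, "stratum 2": 10}

sample = {
    stratum: random.sample(units, sample_sizes[stratum])   # equal probability
    for stratum, units in frame.items()                    # within the stratum
}

for stratum, units in sample.items():
    prob = sample_sizes[stratum] / len(frame[stratum])
    print(stratum, f"selection probability {prob:.2f},", len(units), "units drawn")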
ATTACHMENT C—Circular No. A-
Statistical Publications
1. This attachment sets forth
guidelines for publishing the results of
statistical surveys and studies. It
replaces guidance previously contained
in Statistical Policy Directive No. 2,
"Standards for the Publication of
Statistics," which is rescinded by this
Circular.
2. Obligation to publish.
a. Agencies that sponsor statistical
surveys and studies should either
publish the results or else make them
available upon request in a form that
does not reveal information about any
individually-identifiable respondent.
Nonpublished results should be
maintained in a data base that is
sufficiently accessible that requests by
other agencies or the public for
tabulations or summaries can be met
within 90 days.
b. Except as provided in Attachment E
of this Circular for principal Federal
economic indicators, no data collected
in a statistical survey should be
suppressed or withheld from release
unless:
(1) Suppression is necessary to protect
the confidentiality of information
provided by individual respondents, or
(2) The data fail to meet the quality
standards for publication that the
sponsoring agency has specified in its
request for OMB approval of the data
collection, as required in Attachment A,
Section 2.i. of this Circular. If it is
necessary to suppress data because they
fail to meet these standards, the
sponsoring agency shall report to OMB
the data items to be suppressed and the
basis for suppression, as soon as the
agency has that information, and should:
(A) Notify potential users that the
data will not be published or otherwise
released; and
(B) Make no use of the suppressed
data in any published reports or
estimates.
c. Agencies should annually publish,
either in the Federal Register or in an
agency publication, a catalog of the
statistical data in their files, indicating
the form in which they are available to
the public (e.g., hardcopy reports, public
use tapes, online, tabulations upon
request), the period of time for which
they will remain available in each form,
and the name, address, and telephone
number of the office within the issuing
agency to which inquiries may be
directed.
d. Agencies should provide for the
prompt release of public-use data files
that enable analysts outside the
sponsoring agency to reproduce and
verify the sponsoring agency's results,
test alternative hypotheses, and develop
alternative interpretations. If survey
plans submitted to OMB included the
preparation of public-use data files (see
Attachment A, Section 2.j.(3)), the
sponsoring agency should release these
files no later than 90 days after the
sponsoring agency's first published
reports (except for preliminary summary
tabulations) based on survey data. If
such files are not available when the
first reports are issued, the reports
should contain a notice regarding their
forthcoming availability.
3. Use of standard statistical
classifications and definitions. Agencies
should use the standard classifications
and definitions set forth in Attachment F
when publishing statistics for which
there are such standards. When
reporting statistics for which there are
no such standards, the classifications
and definitions used should support the
broadest possible range of analytical
uses of the data. Publications should
explain all standard classifications and
definitions that are used and provide
appropriate references.
4. Presentation of statistics. The
following guidelines apply to the
presentation of statistical data in
publications.
a. Data tables, charts, and graphs.
Tables, charts, and graphs should be
designed and labeled so that their
meaning is clear and unambiguous. The
publication should include an
explanation of all technical terms and
the definitions of all categorizations or
appropriate references for them. Any
term whose usage differs from common
usage or might otherwise be
misconstrued should be clearly defined.
Labels should be properly aligned,
groups clearly separated, and all group
totals included. Labels for subgroups
should be sufficiently indented to ensure
that relationships among the categories
are clear. If all subcategories are not
displayed, there should be an
explanation of what has been omitted. If
tables contain percentages, the
population totals that are needed to
reproduce the numbers on which the
percentages are based should be
reported.
b. Complete presentation.
(1) Any publication that contains data
tables based on sample surveys should
also report estimates of standard
deviations or variances. Variances and
standard deviations should be directly
measured from the sample whenever
possible. If generalized variances or
other approximations to the variances
are used in the publication, the
publication should clearly inform users
that the variance estimates are
approximations and that statistics relying
on these variances may be inaccurate.
The publication should fully describe
the methods used to estimate variances.
(2) To enable users to evaluate further
the reliability of estimates, the agency
issuing the publication should publish,
or offer and be prepared to provide upon
request, the actual number of
observations in each published table
cell.
(3) To enable users to discern the
extent and impact of imputation,
agencies should identify imputed data
items with separate codes when they
publish unsummarized data (as in
public-use data files). When they
publish summarized data, agencies
should report the percentage of
observations that are imputed for each
variable included in the summary and
should indicate (by code or other
means) the percentage of data items that
are imputed in each published table cell.
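For illustration only, the Python sketch below computes a variance measured directly from the sample and the imputation percentage that paragraphs b.(1) and b.(3) ask publications to report. The values are hypothetical and the formulas assume an unweighted simple random sample; complex designs would require design-based variance estimators instead.

# Illustrative sketch only (hypothetical survey values).
import math

values  = [3.1, 2.8, 3.6, 2.9, 3.3, 3.0, 2.7, 3.4]   # reported item values
imputed = [False, False, True, False, False, True, False, False]

n = len(values)
mean = sum(values) / n
variance = sum((x - mean) ** 2 for x in values) / (n - 1)   # sample variance
std_error = math.sqrt(variance / n)                          # SE of the mean

imputation_rate = 100 * sum(imputed) / n

print(f"estimate {mean:.2f}, standard error {std_error:.3f}")
print(f"{imputation_rate:.0f}% of the items in this cell were imputed")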
c. Discussion of findings. The
evidence to support conclusions and
statements about causes and effects
should be presented in full in the
publication. Inferences about
differences and changes should be
based upon generally accepted
statistical techniques, which should be
identified and described in the
publication.
d. Preliminary and revised estimates.
Any data or reports that are released
prior to final editing, compilation, or
correction should be clearly labeled as
preliminary. The release should explain
the limitations of the preliminary
estimates, describe the nature of future
corrections, and provide the scheduled
dates for the publication of revisions.
When revised data are published, the
quantitative difference between the
preliminary and revised data should be
shown, the revision process described,
and the effects of the revisions on the
interpretation of the data explained.
5. Documentation. All publications of
results of a statistical survey or study
should include documentation that
describes the purpose of and the
procedures for conducting the survey or
study and the quality and limitations of
the results. For reports that are part of a
regular series, the required
documentation may be published
separately but should be kept
completely current and accurate. For
one-time surveys or studies and for the
first publication in a series, the
documentation should be available at
the same time as the publication. Public-
use data files should provide the
documentation both in paper form and
as part of the file. The documentation
should cover:
a. The purpose of the study.
b. A description of the study design.
The documentation should explain the
rationale for the study design and how
the study was conducted. The
description should be sufficiently
detailed to serve most needs of the
principal users of the publication and to
ensure that all users are alerted to
aspects of the study design that affect
the interpretation of results. It should
include a brief account of the outcome
of the quality control procedures used.
The publication should refer the user to
the complete documentation of methods
and procedures that the sponsoring
agency maintains in accordance with
Attachment D of this Circular.
c. The sources of all statistical data
used in the study. The documentation
should clearly identify data that were
collected through different surveys or
from different administrative record
systems and describe how such data
were changed through editing and
imputation. The documentation should
clearly identify constructed and
estimated variables and describe the
methods used to construct composite
variables and the models or statistical
procedures used to develop estimates. If
data have been subjected to aggregate
adjustments, such as benchmarking or
seasonal adjustment, this should be
clearly noted and the adjustment
methods described.
d. A discussion of the limitations of
the survey or study results. Sufficient
information should be presented to
enable the user to judge the accuracy of
the reported results of the study and the
extent to which the results can be
extrapolated or generalized to other
populations or circumstances.
6. Review of publications. Before
releasing any publication, the agency
should review it to ensure that it clearly
and correctly presents information and
that it complies with the standards in
this Circular. Agencies that regularly
issue statistical publications should
establish a formal review process with
written procedures and specific
assignments of responsibility. The
process should provide for independent
review of each publication by at least
one competent professional who was
not involved in the preparation of the
publication or the survey or study on
which it is based.
7. Assistance to users. All
publications should contain the name,
address, and telephone number of an
office within the agency issuing the
publication that may be contacted for
further information or assistance. The
documentation of public-use files should
indicate what user services are
available for the file and what period of
time such services will remain available.
ATTACHMENT D—Circular No. A-
Documentation of Methods and
Procedures
1. Agencies that regularly publish
statistics should maintain complete and
current documentation of all methods
and procedures used to produce these
estimates. This documentation should
be publicly available in its entirety and
therefore should not contain any
information that the agency considers to
be exempt from disclosure under the
Freedom of Information Act (5 U.S.C.
552). The documentation should include,
but is not limited to, the areas detailed
in this attachment.
2. Surveys. Agencies that sponsor
statistical surveys should maintain for
each survey a file documenting the
survey design and the operations used
to collect, process, and publish data
from the survey. The file should be
maintained until the survey is no longer
being conducted and demand for the
data no longer exists.
The documentation file should
include:
a. Descriptions of: the target
population; the sampling frame; the
correspondence between the target
population and the sampling frame; the
sample design; collection methods,
locations, and dates; follow-up
procedures; the methods used to edit
and tabulate survey data; any
imputation and adjustment procedures
applied to survey data; and the
procedures used to control the quality of
survey operations.
b. Copies of the forms used in the
survey, including the survey
questionnaire and the instructions to
both respondents and interviewers;
c. The performance statistics, such as
those presented in Attachment A,
Section 2.g. of this Circular, that the
sponsoring agency, its contractors, or its
grantees have used for management and
evaluation of the survey; and
d. The results of any evaluations of
survey operations and data, including
quality control audits.
3. All imputation procedures, coding
procedures, and procedures used to
adjust data acquired through surveys or
administrative records. The
documentation of such procedures,
including benchmarking, revision, and
seasonal adjustment procedures, should
be complete enough to enable a
competent professional from outside the
agency to duplicate the procedures and
results.
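Item 3 asks that adjustment procedures such as seasonal adjustment be documented completely enough for an outside professional to duplicate them. As an illustration of that level of specification only, and not as a recommended method, the Python sketch below writes out a simple ratio-to-moving-average seasonal adjustment of a hypothetical quarterly series step by step.

# Illustrative sketch only (hypothetical quarterly series): a fully specified
# ratio-to-moving-average seasonal adjustment that an outside analyst could
# reproduce exactly from this listing.
series = [110, 132, 95, 125, 116, 140, 101, 133, 122, 147, 106, 140]  # 3 years, quarterly

# Centered 4-quarter moving average (defined for interior points only).
centered = {}
for t in range(2, len(series) - 2):
    centered[t] = (0.5 * series[t - 2] + series[t - 1] + series[t]
                   + series[t + 1] + 0.5 * series[t + 2]) / 4.0

# Seasonal ratios, averaged by quarter and normalized to average 1.0.
ratios = {q: [] for q in range(4)}
for t, trend in centered.items():
    ratios[t % 4].append(series[t] / trend)
factors = {q: sum(r) / len(r) for q, r in ratios.items()}
norm = sum(factors.values()) / 4.0
factors = {q: f / norm for q, f in factors.items()}

adjusted = [series[t] / factors[t % 4] for t in range(len(series))]
print([round(x, 1) for x in adjusted])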
4. Models. The documentation of
models used to generate estimates
should be complete enough to enable a
competent professional from outside the
agency to use the model and duplicate
the sponsoring agency's results. Model
documentation should include the
following: a model specification; a
summary of the purpose of the model,
including its principles, structure, and
assumptions; a complete mathematical
statement of the model; a description of
any data base used with the model; a
description of the validation,
verification, and audit record associated
with the model; and the results of using
the model, including both the raw
outputs and analysis based on those
outputs.
If the documentation is for a computer
model, it should also include a user's
guide explaining how to run the model.
5. The procedures used to generate
statistical estimates that are required
by statute or by this Circular to be used
by Federal agencies in making
determinations about the benefits,
obligations, privileges, or rights of
specific individuals or entities.
Complete, current documentation of all
procedures, including specific
assumptions and decision rules, should
be available in published form at the
time such estimates are released. All
statistical estimates used in making such
determinations should themselves be
published.
ATTACHMENT E—Circular No. A-
Compilation, Release, and Evaluation of
Principal Federal Economic Indicators
1. This attachment replaces Statistical
Policy Directive No. 3. "Compilation.
Release and Evaluabon of Principal
Federal Economic Indicators." which '
this Circular rescinds.
2. Agencies that publish statistical
estimates that have been designated by
the Director of OMB as principal Federal
economic indicators should follow the
procedures prescribed in this
attachment for the compilation, release,
and evaluation of these estimates.
3. Designation of Principal Indicators.
The Director of OMB shall determine,
after consultation with the affected
Federal agencies, the statistics and
estimates to be designated as principal
Federal economic indicators and
covered by this attachment to the
Circular. At the beginning of each
calendar year, OMB will publish the list
of indicators covered and the scheduled
dates for release of each indicator
during the year.
4. Prompt Release. The interval
between the end of the period to which
the statistics refer and the date when
the data or estimates are released to the
public should be as short as practicable.
Agencies should compile and release
series that are issued quarterly or more
frequently within 22 working days of the
end of the reference period.
5. Release Schedule. The releasing
agency is responsible for ensuring that
the interested public is aware of the
release time and date. The last report of
each calendar year should contain the
time and date of all reports in the
upcoming year. In addition, each release
should include an announcement of the
time and date of the next release. The
releasing agency shall provide a
schedule of releases for the upcoming
calendar year to the Statistical Policy
Office. Office of Information and
Regulatory Affairs, OMB, by November
30th of each year. Changes in the release
schedule may occur only if special,
unforeseen circumstances arise. The
releasing agency should announce and
fully explain any schedule changes as
soon as it has determined they are
unavoidable.
There should be one office in the
agency that can provide the release
schedule of all the agency's principal
economic indicators. The name, address,
and telephone number of this office
should be readily available to the
public. Agencies should establish and
maintain no more than two specific
times of day for the release of their
principal economic indicators and
should only release indicators at such
designated times.
6. Announcement of Changes.
Agencies should announce any planned
change in data collection, analysis, or
estimation methods that may affect the
interpretation of a principal economic
indicator as far in advance of the change
as possible. The agency should include
the announcement in a regular report of
the economic indicator. When possible,
a period for public comment should be
provided between the announcement of
an intended change and its
implementation. At a minimum, for
quarterly and monthly series, the agency
should announce the change at least
three reports before the first report
affected by the change. For weekly and
annual series, the announcement should
precede the first report affected by the
change by at least three months. In the
first report affected by the change, the
agency should include a complete
description of the change and its impact.
Agencies should fully explain
unforeseeable changes due to special
circumstances as soon as they are
known and in the first report affected by
the change.
7. Release Procedure. The statistical
agency that produces each principal
economic indicator should issue it in a
press release or other printed report.
The agency should issue a press release
where this will significantly speed up
the dissemination of data to the public.
Each statistical agency is responsible
for establishing procedures to ensure
that there is no premature release of
information or data estimates during the
time required for preparation of the
public report. This includes the
protection of public-use data banks,
which should not receive any data or
estimates until they are officially
released. As soon as copies of materials
for public release have been prepared,
the agency should physically secure
them.
Except for the authorized distribution
described in this section, agencies
should ensure that no information or
data estimates are released before the
official release time.
The agency shall provide prerelease
information to the President, through the
Chairman of the Council of Economic
Advisers, as soon as it is available. The
agency should grant prerelease access
to others only under the following
conditions:
a. The agency head has established
whatever security arrangements, and
imposed whatever conditions on the
granting of access, that are necessary to
prevent unauthorized dissemination or
use.
b. The agency head will ensure that
any person granted access has been
fully informed of, and has agreed to,
these conditions.
c. Any prerelease of information
under an embargo will not precede the
official release time by more than 30
minutes.
d. In all cases, prerelease access will
precede the official release time only to
the extent necessary for an orderly
review of the data.
All employees of the Executive
Branch who receive prerelease
distribution of information and data
estimates as authorized above are
responsible for ensuring that no release
occurs prior to the official release time.
Except for members of the staff of the
agency issuing the principal economic
indicator who have been designated by
the agency head to provide technical
explanations of the data, employees of
the Executive Branch should not
comment publicly on the data until at
least one hour after the official release
time.
8. Preliminary Estimates and
Revisions. Deciding when to release a
principal economic indicator requires
the balancing of accuracy and
timeliness. Agencies should not
withhold information needed to evaluate
current economic conditions by
imposing unnecessarily stringent
accuracy requirements on preliminary
estimates. They should, however, fully
inform the public of the degree of
inaccuracy that is accepted to
accommodate timely release.
In the case of estimates based on
probability samples, agencies should
publish measures of uncertainty based
on the sampling variance and
consideration of nonsampling errors,
e.g., a root-mean-square error estimate
or its equivalent. In cases where the
confidence interval about any single
point estimate cannot be estimated,
results should be presented only in the
form of an interval estimate. Methods
used to estimate upper and lower bound
values that define the interval estimate
should be designed to meet three
objectives: (1) Preliminary interval
estimates should include the final point
estimate with high probability; (2) the
width of the interval should be
consistent with the error history of the
indicator being estimated; and (3) the
bound values should be consistent with
any confidence intervals that can be
estimated for components of the
indicator.
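For illustration only, the Python sketch below derives a root-mean-square revision error from a hypothetical history of preliminary and final estimates and uses it to form a preliminary interval estimate; the coverage factor of 2 is an arbitrary choice for the sketch, not a prescribed value.

# Illustrative sketch only (hypothetical revision history).
import math

preliminary = [2.1, 1.8, 2.5, 2.0, 2.3, 1.9]   # past preliminary estimates
final       = [2.3, 1.7, 2.6, 2.2, 2.2, 2.1]   # corresponding final estimates

revisions = [f - p for p, f in zip(preliminary, final)]
rmse = math.sqrt(sum(r * r for r in revisions) / len(revisions))

current_preliminary = 2.4
# Interval intended to cover the eventual final estimate with high probability;
# the factor of 2 is an arbitrary choice for this sketch.
low, high = current_preliminary - 2 * rmse, current_preliminary + 2 * rmse
print(f"RMSE of past revisions: {rmse:.2f}")
print(f"preliminary interval estimate: {low:.2f} to {high:.2f}")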
For either point or interval estimates,
agencies shall apply the following
guidelines when issuing and evaluating
preliminary data and revisions:
a. Agencies should clearly identify
estimates as preliminary or revised;
b. If the difference between
preliminary and final aggregate
estimates, or the width of a preliminary
interval estimate, is large relative to
average period-to-period differences, the
agency should either take steps to
improve the accuracy of preliminary
estimates or delay the release of
estimates until a reliable estimate can
be made;
c. If preliminary estimates show signs
of a consistent bias (for example, if
revisions are consistently in the same
direction), the agency should take steps
to eliminate this bias;
d. Revisions occurring for routine
reasons, such as benchmarking and
updating of seasonality factors, should
be consolidated and released
simultaneously;
e. Agencies should release routine
revisions of a principal economic
indicator only as part of the regular
reporting schedule; and
f. Revisions occurring for other than
routine reasons should be fully
explained and should be released as
soon as adjustments can be completed.
9. Granting of Exceptions. Prior to
taking any action that may be contrary
to the provisions of Attachment E, the
head of a releasing agency shall consult
with the Director of OMB. If the Director
determines that the action is contrary to
the provisions of this attachment, the
head of the agency may apply for an
exception. Any agency requesting an
exception shall demonstrate that the
proposed exception is necessary and
consistent with the purposes of the
Circular.
10. Performance Evaluation. Each
agency that issues a principal Federal
economic indicator shall submit a
performance evaluation of that indicator
to the Statistical Policy Office, Office of
Information and Regulatory Affairs,
OMB, every three years. A schedule for
the performance evaluation of data
series or estimates designated as
principal Federal economic indicators
will be prepared by the Statistical Policy
Office. The evaluation shall address the
following issues:
a. The accuracy and reliability of the
series, e.g., the magnitude and direction
of all revisions, the performance of the
series relative to established
benchmarks, and the proportion and
effect of nonresponses or responses
received after the publication of
preliminary estimates;
b. The accuracy, completeness, and
accessibility of documentation
describing the methods used in
compiling and revising the indicator;
c. The agency's performance in
meeting the designated release schedule
and the prompt release objective of this
Circular;
d. The agency's ability to avoid
disclosure prior to the scheduled release
time; and
e. Any additional issues that the
Director may specify in writing to the
agency at least 6 months in advance of
the scheduled submission date.
The Director will review the
evaluation to determine whether the
indicator is prepared and published in
conformity with all OMB statistical
policies, standards, and guidelines. OMB
will include a summary of the year's
evaluations and their reviews in the
annual report to Congress required by 44
U.S.C. 3514.
ATTACHMENT F—Circular No. A-
Use of Standard Classifications, Data
Sources, and Definitions
1. This attachment replaces guidance
on the use of standard classifications,
data sources, and definitions previously
contained in Statistical Policy Directives
5-17 and in the directive on
"Comparability of Statistics on Business
Size" (47 FR 21382, May 18.1982). all of
which this Circular rescinds. Nine of the
standards established in these directives
are retained in the Circular. Five others
have been discontinued as government-
wide statistical standards, either
because they have not proven useful for
the statistical purposes intended or
because they are used by only one or
two agencies in collecting and
publishing statistics and can effectively
be maintained by these agencies. The
discontinued standards are a standard
reference base period for Federal
government general-purpose index
numbers; standard Federal
administrative regions; the standard
industrial classification of enterprises;
the standard classification of fields of
science and engineering; and the
standard gas pressure base.
2. Agencies should use standard
statistical classifications, data sources,
and definitions for the purposes and in
the manner specified in this section.
a. Metropolitan Statistical Areas
All agencies that conduct statistical
programs to collect and publish data for
Metropolitan Statistical Areas (MSAs)
should use the most recent definitions of
Metropolitan Statistical Areas
established by the Office of
Management and Budget.
OMB establishes and maintains the
definitions of Metropolitan Statistical
Areas solely for statistical purposes. In
periodically reviewing and revising the
MSA definitions, OMB does not take
into account or attempt to anticipate any
nonstatistical uses that may be made of
the definitions, nor will OMB modify the
definitions to meet the requirements of
any nonstatistical program.
Therefore, if an agency uses the MSA
definitions in a nonstatistical program, it
is that agency's responsibility to ensure
that the definitions are appropriate for
such use. In cases where an agency is
publishing for comment a proposed
regulation that would use the MSA
definitions for a nonstatistical purpose,
the agency should seek public comment
on the proposed use of the MSA
definitions. Agencies that use the MSA
definitions in a nonstatistical program
may modify the MSA definitions,
exclusively for the purposes of that
program. However, in order to avoid
confusion with the standard statistical
definitions, all such modifications
should be clearly identified as
deviations from the OMB standard
definitions of Metropolitan Statistical
Areas.
b. Standard Industrial Classification
All agencies that conduct statistical
programs to collect and publish
establishment data by industry type
should use the Standard Industrial
Classification (SIC), as published in the
most recent edition of the Standard
Industrial Classification Manual.
OMB establishes and maintains the
Standard Industrial Classification solely
for statistical purposes. In periodically
reviewing and revising the SIC, OMB
does not take into account or attempt to
anticipate any nonstatistical uses that
may be made of the classification, nor
will OMB modify the classification to
meet the requirements of any
nonstatistical program.
Therefore, if an agency uses the SIC in
a nonstatistical program, it is that
agency's responsibility to ensure that
the classification is appropriate for such
use. In cases where an agency is
publishing for comment a proposed
regulation that would use the SIC for a
nonstatistical purpose, the agency
should seek public comment on the
proposed use of the SIC. Agencies that
use the SIC in a nonstatistical program
may modify the SIC, exclusively for the
purposes of that program. However, in
order to avoid confusion with the
standard statistical classification, all
such modifications should be clearly
identified as deviations from the
Standard Industrial Classification.
Any agency requesting or requiring an
establishment to provide its SIC code as
part of an information collection shall
clearly identify within the information
collection instrument or its directions
the name, address, and telephone
number of a unit within that agency that
will assist respondents in determining
their appropriate SIC code. When
submitting any such information
collection request to OMB for clearance,
the agency shall disclose any
modifications it has made in the SIC for
the purposes of any nonstatistical
program of which the information
collection is a part and shall
demonstrate that it has sufficient and
appropriately-trained personnel to assist
its respondents in determining their SIC
codes.
c. Standard Occupational
Classification
All agencies that conduct statistical
programs to collect and publish data by
occupation should use the Standard
Occupational Classification (SOC), as
published in the most recent edition of
the Standard Occupational
Classification Manual.
OMB establishes and maintains the
Standard Occupational Classification
solely for statistical purposes. In
periodically reviewing and revising the
SOC, OMB does not take into account or
attempt to anticipate any nonstatistical
uses that may be made of the
classification, nor will OMB modify the
classification to meet the requirements
of any nonstatistical program.
Therefore, if an agency uses the SOC
in a nonstatistical program, it is that
agency's responsibility to ensure that
the classification is appropriate for such
use. In cases where an agency is
publishing for comment a proposed
regulation that would use the SOC for a
nonstatistical purpose, the agency
should seek public comment on the
proposed use of the SOC. Agencies that
use the SOC in a nonstatistical program
may modify the SOC, exclusively for the
purposes of that program. However, in
order to avoid confusion with the
standard statistical classification, all
such modifications should be clearly
identified as deviations from the
Standard Occupational Classification.
Any agency requesting or requiring
respondents to provide SOC codes as
part of an information collection shall
clearly identify within the information
collection instrument or its directions
the name, address, and telephone
number of a unit within that agency that
will assist respondents in determining
their appropriate SOC codes. When
submitting any such information
collection request to OMB for clearance,
the agency shall disclose any
modifications it has made in the SOC for
the purposes of any nonstatistical
program of which the information
collection is a part and shall
demonstrate that it has sufficient and
appropriately-trained personnel to assist
its respondents in determining their SOC
codes.
d. Standard Business Size Categories
When publishing statistics, agencies
should use the size categories in the
table below to classify reporting
businesses by number of employees,
revenues, or assets. Tabulations based
on these categories should be
accompanied by precise definitions of
the variables used to measure size and
of the type of reporting unit tabulated.
BUSINESS SIZE CATEGORIES
[Revenues or assets (dollars)]
Under $25,000
$25,000 under $50,000
$50,000 under $100,000
$100,000 under $250,000
$250,000 under $500,000
$500,000 under $1 million
$1 million under $2.5 million
$2.5 million under $5 million
$5 million under $10 million
$10 million under $25 million
$25 million under $50 million
$50 million under $100 million
$100 million under $250 million
$250 million under $500 million
$500 million under $1 billion
$1 billion under $2.5 billion
$2.5 billion under $5 billion
$5 billion or more
BUSINESS SIZE CATEGORIES
[Employment (number of employees)]
g. Standard Source of Labor Force and
Unemployment Estimates for Use in the
Administration of Federal Programs
Agencies that are required by law to
allocate Federal funds or determine
eligibility for participation in a Federal
program on the basis of the employment,
unemployment, or labor force
participation levels or rates in the
population or any subgroups of the
population of State, county, or local
units of government, should use—to the
extent permitted by law—the most
current estimates published by the
Bureau of Labor Statistics that are
available for all units of the relevant
levels of government.
h. Definition of Poverty
Agencies that are required by law to
use the definition of poverty established
by the Office of Management and
Budget, and agencies that publish
statistical estimates of the number of
persons, families, or households in
poverty, should continue to use as the
definition of poverty the annual income
thresholds that for 1986 and prior years
have been published by the Bureau of
the Census in its Current Population
Reports, P-60 series. For 1987 and
subsequent years, the definition of
poverty shall be the 1986 thresholds
adjusted annually by the year-over-year
change in the Consumer Price Index for
all urban consumers.
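The adjustment described above is a simple ratio calculation. The Python sketch below illustrates it with hypothetical threshold and index values, which are not the official figures.

# Illustrative sketch only (hypothetical threshold and index values): carrying
# a 1986 poverty threshold forward by the year-over-year change in the CPI
# for all urban consumers.
threshold_1986 = 11200.00          # hypothetical 1986 threshold for a family of four
cpi = {1986: 109.6, 1987: 113.6}   # hypothetical annual average index values

threshold_1987 = threshold_1986 * (cpi[1987] / cpi[1986])
print(f"1987 threshold: {threshold_1987:,.0f}")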
i. Racial and Ethnic Categories
To the extent permitted by law,
agencies should use the following
categories for all purposes that require
classifying people by racial and/or
ethnic background. A person's racial
and/or ethnic background is determined
by the way in which the person chooses
to be identified in his/her community.
Racial Categories:
American Indian or Alaska Native
Asian or Pacific Islander
Black
White
In establishing reporting systems and
collecting data, agencies should permit
individuals to identify themselves as
"other" if they believe they do not fall
into any of the categories listed above.
Ethnic Categories:
Hispanic
Not of Hispanic origin
Agencies may use other, more
detailed categories as long as such
categories can be aggregated into the
basic categories listed in this section.
ATTACHMENT G—Circular No. A-
Provision of Statistical Information to
International Organizations
1. This attachment replaces Statistical
Policy Directive No. 18. "Providing of
Statistical Information to International
Organizations." which this Circular
rescinds.
2. In accordance with Section 1 of
Executive Order 10033 of February 8,
1949, as amended (see 22 USCA 286f),
the Director of OMB will determine,
with the concurrence of the Secretary of
State, what statistical information shall
be provided in response to official
requests received by the United States
Government from any international
organization of which the United States
is a member, and will determine which
agency or agencies shall prepare the
statistical information to be provided.
Agencies that have not been previously
designated by the Director to prepare
such information shall notify the
Director and receive his concurrence
before compiling or providing any
statistical information for publication or
use by any international organization of
which the United States is a member.
[FR Doc. B8-fl96 Filed 1-19-88, 8.45 am]
BILLING CODE 1110-01-M
-------
SESSION: Biological/Human Sampling
TITLE: Pharmacokinetic Modeling Using SAAM — A Tool for Simulation and
Model Fitting
AUTHOR: Bernard Most, Northrup Services, Inc.
Directly from its definition, pharmacokinetics (PK) is of interest to the
researcher studying health effects related to environmental exposures. Compart-
mental modeling is a means of studying approximate PK by considering a finite
number of macroscopic sub-systems deemed relevant to elucidate the health
effect under study. SAAM is a computer tool for:
analyzing compartmental models given the model parameters and
estimating model parameters given experimental data.
The session will introduce SAAM and present an illustrative example, from
practice, of each use:
simulation of a proposed model for the effect of transient CO exposure
on blood COHb levels and
modeling absorption of nickel in drinking water.
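By way of illustration only, and not as a description of SAAM itself, the short Python sketch below simulates the simplest possible compartmental model, a single compartment with constant uptake during a transient exposure and first-order elimination, using hypothetical rate constants.

# Illustrative sketch only, not SAAM: a one-compartment model with first-order
# elimination, integrated by a simple Euler scheme; all constants hypothetical.
k_in  = 0.8    # uptake rate during exposure (amount per hour)
k_out = 0.4    # first-order elimination rate constant (per hour)
dt    = 0.01   # integration step (hours)

level = 0.0    # compartment content at time zero
history = []
for step in range(int(8 / dt)):                 # simulate 8 hours
    t = step * dt
    exposure = k_in if t < 2.0 else 0.0         # transient 2-hour exposure
    level += dt * (exposure - k_out * level)    # Euler step for dC/dt
    history.append((t, level))

for t, c in history[::100]:                     # print roughly hourly values
    print(f"t = {t:4.1f} h   compartment level = {c:.3f}")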
23
-------
SESSION: Biological/Human Sampling
TITLE: Considerations In Measuring Biologicals In Human Populations
AUTHOR: Cheryl Siegel Scott, Exposure Evaluation Division, Office of Toxic
Substances
Abstract not available.
24
-------
SESSION: Biological/Human Sampling
TITLE: Biomarkers: Promise and Practice
AUTHOR: John Fowle III, Office of Health Research, Office of Research and
Development
Biomarkers offer great promise for linking exposure to dose, and dose to
disease due to their capabilities for specificity, sensitivity, rapid analysis, and
low cost. This promise has only been met for a few markers, such as those for
lead, carbon monoxide, and organophosphate pesticides. In these cases, extensive
characterization work was required to validate the markers with respect to
specificity, sensitivity, population variation, etc., before they could be applied
with confidence for regulatory purposes. To avoid the possibility that data from
poorly characterized markers may be applied to Agency decision-making, with
adverse consequences such as those which occurred at Love Canal, ORD has
initiated a Biomarker Program to define issues surrounding the use of biomarkers,
to coordinate efforts across the Agency and between EPA and other Agencies,
and to develop science policies for their application at EPA. The current and
future efforts of this program will be discussed.
25
-------
SESSION: Graphics
TITLE: Expert Systems to Assist in Decisions Concerning Land Disposal of
Hazardous Wastes
AUTHOR: Daniel G. Greathouse, Hazardous Waste Engineering Research
Laboratory, Cincinnati, Office of Research and Development
In FY 1984, the Hazardous Waste Engineering Research Laboratory success-
fully developed a small proof-of-concept expert system to assist in interpretation
of chemical immersion test (EPA Method 9090) data for PVC liner materials.
This was the beginning of an orderly progression of efforts to assess the
feasibility of using expert systems to assist in permit reviews for hazardous
waste land disposal sites. Permit review decision areas amenable to expert
system applications have been identified and several systems are in various
stages of development and testing. The rationale for this approach to providing
decision support aids for permit review includes the complexity of the required
engineering evaluations; availability of extensive relevant research results and
known subject-specific specialists (experts); concern that permit reviewers do not
have all the required expertise and that they have little, if any, access to
subject-specific regulatory policy and research information; and concern that
decisions may not be consistent among reviewers or with EPA regulations and
policies. The decision areas selected for expert system development and the
progress on the ongoing development efforts will be presented.
26
-------
SESSION: Bioassay and Toxicity Measurements
TITLE: Sources of Variability in Laboratory Animal Carcinogenicity Studies
AUTHOR: Joseph K. Haseman, National Institute of Environmental Health
Sciences
In the absence of adequate epidemiological data, laboratory animal car-
cinogenicity studies, such as those carried out by the National Toxicology
Program (NTP), remain the most definitive means of assessing the carcinogenic
potential of chemicals for humans. The NTP studies also provide a large data
base of similarly-designed experiments, which can be examined retrospectively to
evaluate possible sources of variability in tumor incidence, and the impact that
these factors may have on the interpretation of study results. Various sources
of variability are examined, including factors related to intra-study variability
(e.g., animal room environment, littermates, gross tissue examination and
preparation of slides, histopathology diagnosis, food consumption/weight gain) as
well as those associated with inter-study variability (e.g., lab-to-lab variability,
time-related trends, animal supplies, dietary effects, genetic drift). Recommenda-
tions are given regarding how the possible confounding effects of certain of
these sources of variability can be reduced or eliminated.
27
-------
SESSION: The Group Depth (Focus Group) Interview
TITLE: The Group Depth Interview: An Unstructured Approach for Collecting
In-Depth Survey Information on Attitudes and Motivations
AUTHOR: Alfred E. Goldman, National Analysts Division, Booz Allen &
Hamilton Inc.
This introduction to the Group Depth, or Focus Group, Interview will center
on its appropriate use in the armamentarium of the survey research specialist.
Topics covered will be when and why it is used, its assets and liabilities, its
theoretical rationale, and selected moderating techniques.
28
-------
SESSION: Human Exposure Monitoring
TITLE: New Directions of Exposure Monitoring
AUTHOR: John D. Spengler, Professor of Environmental Health, Harvard School
of Public Health
Abstract not available.
29
-------
SESSION: Guide to EPA Information Center Services for Statisticians
TITLE: Guide to EPA Information Center Services for Statisticians
AUTHOR: Denny Daniel, Manager, Technical Center, Washington Information
Center
This session will provide an explanation of the Washington Information
Center Services using a video presentation. There will also be a discussion of
the Information Center Services available in the regional offices and some of the
labs and the contact points there.
30
-------
SESSION: Guide to EPA Information Center Services for Statisticians
TITLE: In-House Graphics
AUTHOR: Nancy Sneath, Washington Information Center Consultant
Graphics are playing an increasingly important role in EPA's assimilation of
statistical data. Organizing facts and figures in an easy-to-understand manner is
not an easy task. The Washington Information Center (WIC), however, is
available to help statisticians understand and select graphics hardware and
software to create their own charts or presentation materials.
The focus of this session will be "In-House Graphics": what graphics you can
generate sitting at your desk. Topics will include: tips for selecting
software programs, output devices, speed considerations, color, costs, and
services available from WIC.
31
-------
SESSION: Spatial Statistics
TITLE: Geostatistical Software: Practical Applications
AUTHORS: Evan J. Englund and George Flatman, Environmental Monitoring
Systems Laboratory, Las Vegas
The "practical application of geostatistics to environmental problems" often
boils down to "making a better contour map." A step-by-step geostatistical
analysis of a sample data set illustrates the use of recently developed geostatis-
tical software for variogram computation and modeling, kriging, and contouring.
The sample data set used in the example was drawn from a completely
known 2-D data set, which allows us to compare the kriged estimates to the true
values. We can also look at estimators other than kriging (e.g., a simple
unweighted moving average) and ask whether kriging is in fact the better
estimator it is claimed to be. This leads us to the question of what criteria to
use to measure the quality of a spatial estimator. Several criteria proposed for
ongoing comparison studies will be presented and discussed.
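As a concrete picture of the kind of comparison described above, the following
Python sketch scores ordinary kriging against a simple unweighted moving average
on a synthetic, completely known 2-D surface, using mean absolute error as the
quality criterion. The surface, the exponential variogram parameters, and the
search radius are invented; this is not the EMSL-LV software.

# Sketch: ordinary kriging vs. an unweighted moving average on a known surface.
import numpy as np

rng = np.random.default_rng(0)

def true_surface(x, y):
    """Completely known 2-D 'reality' used to score the estimators."""
    return np.sin(x) + np.cos(y)

def variogram(h, sill=1.0, rng_par=2.0):
    """Exponential variogram model (assumed here, not fitted)."""
    return sill * (1.0 - np.exp(-h / rng_par))

def ordinary_krige(xs, zs, x0):
    """Ordinary kriging estimate at x0 from sample coordinates xs and values zs."""
    n = len(zs)
    d = np.linalg.norm(xs[:, None, :] - xs[None, :, :], axis=2)
    a = np.ones((n + 1, n + 1))
    a[:n, :n] = variogram(d)
    a[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xs - x0, axis=1))
    w = np.linalg.solve(a, b)[:n]      # kriging weights (Lagrange term dropped)
    return float(w @ zs)

def moving_average(xs, zs, x0, radius=1.5):
    """Unweighted mean of all samples within a fixed search radius."""
    near = np.linalg.norm(xs - x0, axis=1) <= radius
    return float(zs[near].mean()) if near.any() else float(zs.mean())

# 60 random samples of the known surface over a 6 x 6 area.
samples = rng.uniform(0.0, 6.0, size=(60, 2))
values = true_surface(samples[:, 0], samples[:, 1])

# Score both estimators on a regular grid of target points.
gx, gy = np.meshgrid(np.linspace(0.5, 5.5, 11), np.linspace(0.5, 5.5, 11))
targets = np.column_stack([gx.ravel(), gy.ravel()])
truth = true_surface(targets[:, 0], targets[:, 1])

krige_est = np.array([ordinary_krige(samples, values, t) for t in targets])
ma_est = np.array([moving_average(samples, values, t) for t in targets])

print("MAE, ordinary kriging :", np.mean(np.abs(krige_est - truth)))
print("MAE, moving average   :", np.mean(np.abs(ma_est - truth)))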
NOTICE
Although the research described in this article has been supported by the
United States Environmental Protection Agency, it has not been subjected to
Agency review and, therefore, does not necessarily reflect the views of the
Agency and no official endorsement should be inferred.
32
-------
SESSION: Poster Session
TITLE: Forest Effects of Acid Deposition: An Epidemiological Approach to
Data Representation Graphics and Exploratory Data Integration of
Ecological Information
AUTHORS: Ruth H. Allen, Office of Acid Deposition, Environmental Monitoring,
and Quality Assurance, Office of Research and Development, J. Jacob
Wind, American Management Systems, and Ronald W. Matheny, Office
of Research Program Management, Office of Research and Development
This poster illustrates innovative data representation graphics we used to
explore and analyze data collected under the Acid Deposition and Atmospheric
Research Division program on Forest Responses to Anthropogenic Stress (FORAST)
(Oak Ridge National Laboratory, 1985). Text slides cover a retrospective view of
the history of the project, the original structure of the data, and the final
structure of the data base. Graphs illustrate innovative uses of the data.
Lessons drawn from peer reviews, case studies, and hindsight form the basis for
conclusions about the role of statistics and a systems approach in the explor-
atory data integration of ecological information.
33
-------
SESSION: Poster Session
TITLE: Appropriate LC50 Statistics for Effluent Toxicity Analysis
AUTHORS: M. Bastian, P. Koska, T. Vinson, and C. Young, Surveillance Branch,
Region VI (Presenter: Jim Stiebing, Chief, Surveillance Branch, Region
VI)
Whole effluent toxicity testing is a primary monitoring tool for water
quality-based permit compliance. Compliance test methodologies must be clear
and specific so that evaluation of permit compliance is not clouded by issues of
data interpretation. In this light, the number of statistical procedures (six)
discussed in Methods for Measuring the Acute Toxicity of Freshwater and Marine
Organisms is a problem because it requires choice among statistical procedures
which may or may not be comparable.
The purpose of this study was to evaluate the practical utility of some of
the common LC50 statistics recommended in the acute toxicity testing manual.
We chose the Moving Average Angle (MAA), Probit, and Binomial procedures
because programs for these statistics are available on floppy disk from the
Environmental Monitoring and Support Laboratory in Cincinnati (MOVING.BAS,
LCVALUES.BAS, TOXDAT). The specific study objectives were to:
(1) describe the data characteristics of the effluent toxicity tests conducted
in the Regional laboratory,
(2) determine if the data requirements of the statistical programs were well
matched with the data characteristics of the tests, and
(3) determine if LC50s calculated by the different methods varied by a
significant amount.
Results of this study reaffirmed previous observations that effluent toxicity
is frequently an "all or nothing" response. The data requirements for the MAA
method were often met while those for Probit analysis were not, although this
issue is confounded by some differences in the data requirements among the
34
-------
statistical programs. The LC50s calculated for a single data set by the three
statistics were similar; in 20 tests the maximum difference was 11% effluent and
in 10 of those tests the LC50 varied by less than 4%. Based on the results of
this study, it is recommended that the MAA method with one set of data
requirements be identified as the preferred LC50 statistic for permits compliance
evaluation. This recommendation is limited to consideration of statistics
discussed in the acute manual and those that can be calculated by readily
available personal computer programs.
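For readers unfamiliar with how an LC50 statistic is produced, the Python sketch
below fits one of the procedures named above, a probit model, by maximum
likelihood and reads off the concentration giving a 50% predicted response. It is
not the EMSL-Cincinnati programs (MOVING.BAS, LCVALUES.BAS, TOXDAT), and the
test data are invented.

# Sketch: probit LC50 by maximum likelihood on a hypothetical acute test.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical test: percent effluent, organisms exposed, organisms dead.
conc = np.array([6.25, 12.5, 25.0, 50.0, 100.0])
n_exposed = np.array([20, 20, 20, 20, 20])
n_dead = np.array([0, 2, 9, 17, 20])

def neg_log_lik(params):
    """Binomial negative log-likelihood of a probit model on log10 concentration."""
    a, b = params
    p = norm.cdf(a + b * np.log10(conc))
    p = np.clip(p, 1e-9, 1.0 - 1e-9)     # guard against log(0)
    return -np.sum(n_dead * np.log(p) + (n_exposed - n_dead) * np.log(1.0 - p))

fit = minimize(neg_log_lik, x0=[0.0, 1.0], method="Nelder-Mead")
a_hat, b_hat = fit.x
lc50 = 10.0 ** (-a_hat / b_hat)          # probit equals zero at the LC50
print(f"Estimated LC50: {lc50:.1f}% effluent")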
35
-------
SESSION: Poster Session
TITLE: Regional Surface Water Quality Characteristics of Nebraska
AUTHOR: Norman H. Crisp, Environmental Services Division, Region VII
(Presenter: Thomas T. Holloway, Chief, Water Monitoring Section,
Environmental Services Division, Region VII)
The ability to identify regional differences in water quality characteristics
provides resource managers with an important tool. This information can be
used to develop cost-effective monitoring networks, set regional water quality
standards, and identify areas of non-conformity where remedial actions may be
needed.
Water quality data from 69 ambient monitoring stations in Nebraska were
evaluated for regional patterns using the multivariate procedures of principal
component analysis and cluster analysis.
Principal component analysis reduced the data set of nine parameters to
three components: ionic strength, runoff, and nitrogen. These three principal
components explained 80 percent of the variation in the original data set.
Clustering of the median values of the principal components resulted in good
spatial correspondence between the water quality characteristics, as measured by
the principal components, and the Soil Conservation Service Land Resource Areas.
Management implications of these defined regional characteristics are
significant. The results of the principal component analysis suggest that
parameter coverage, and hence cost, can be reduced without loss of the informa-
tion content at the monitoring locations. Based on the cluster analysis, decisions
on the extent of the monitoring network that is needed, and on directing water
pollution control activities toward areas with atypical water quality character-
istics, can be made logically.
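The two-step approach described above (principal component reduction followed by
clustering of station-level component scores) can be illustrated with the short
Python sketch below. The station data are random placeholders standing in for the
nine-parameter records, and retaining three components and four clusters are
assumptions made only for the example.

# Sketch: PCA to three components, then clustering of station scores.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Placeholder data: 69 stations x 9 water quality parameters
# (real station medians over time would replace this).
x = rng.normal(size=(69, 9))

scaled = StandardScaler().fit_transform(x)
pca = PCA(n_components=3).fit(scaled)
scores = pca.transform(scaled)

# Group stations into candidate regions by clustering their component scores.
regions = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)

print("Variance explained by three components:",
      round(float(pca.explained_variance_ratio_.sum()), 3))
print("Stations per region:", np.bincount(regions))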
36
-------
SESSION: Poster Session
TITLE: Guide to EPA Information Center Services for Statisticians
AUTHOR: Denny Daniel, Manager, Technical Center, Washington Information
Center
With the influx of Personal Computers (PCs) in the Agency, there is an
increased need for assistance and support on this equipment. Most regional
offices and many labs have set up Information Centers to assist the PC user
with installation, training, and support. This exhibit will illustrate these support
services available from local Information Centers. There will be people available
to answer your specific questions about the computer hardware and software or
general questions about automating your office.
37
-------
SESSION: Poster Session
TITLE: Chlorinated Paraffins: A Report on the Findings from Two Field
Studies at Sugar Creek, Ohio, and Tinkers Creek, Ohio
AUTHOR: Susan Dillman, Exposure Evaluation Division, Office of Toxic
Substances
This poster presents the results of two field studies conducted in 1986 by
the Environmental Protection Agency to measure chlorinated paraffins (CPs) in
segments of two watersheds: Sugar Creek, Ohio and Tinkers Creek, Ohio. The
objective of these field studies was to collect environmental information that
would help EPA determine if chlorinated paraffins exist in these watersheds and
at what concentrations. These watersheds were selected for study because of
their association with the known CP manufacturer (Sugar Creek) and a user of
lubricating oils which commonly contain CPs (Tinkers Creek).
Sugar Creek, Ohio
Analysis of the first of three sets of environmental samples collected from
this study site shows that chlorinated paraffins, represented in this study by
three technical mixtures [short-chain C10-12 (50-60% Cl), medium-chain C14-17
(50-60% Cl), and long-chain C20-30 (40-50% Cl) CPs], are generally present at
quantifiable concentrations in the parts-per-billion to parts-per-million range in
both the discharge from the CP manufacturing plant and in Sugar Creek down-
stream from the discharge.
The highest CP concentrations were found in the surface impoundment
lagoon which sequesters the manufacturing plant effluent before allowing it to
discharge to Sugar Creek. Here, quantifiable concentrations as high as 170,000
ug/kg were found in the lagoon sediments. Measurements made in the ditch
which carries the lagoon drainage to Sugar Creek showed concentrations as high
as 3,600 ug/kg in the sediments. Concentrations were also recorded in Sugar
Creek downstream from the drainage ditch confluence ranging from trace levels
to 21 ug/kg.
38
-------
Tinkers Creek, Ohio
Analysis of the first of three sets of environmental samples collected from
this study site failed to detect CPs in any of the samples collected near the
outfall of the lubricating oil user or in the drainage network carrying its
discharge to Tinkers Creek. Most of the samples analyzed from this site,
especially the sediment samples, contained a variety of organic constituents
which would have masked the presence of any CPs. Chlorinated paraffins,
however, were measured in the low parts-per-billion range in one sample
collected from the process waste stream of the lubricating oil user located at
this site.
39
-------
SESSION: Poster Session
TITLE: Geostatistical Software Demonstration
AUTHOR: Evan J. Englund, Environmental Monitoring Systems Laboratory, Las
Vegas, Office of Research and Development
A PC-based geostatistical software package recently developed at EMSL-LV
will be demonstrated with several environmental data sets, including atmospheric
NOx from the Los Angeles area, cadmium concentrations in soils from the
Palmerton, Pennsylvania smelter site, and lead in soils from Dallas, Texas.
The software package provides the investigator with the capabilities
required for a complete 2-D spatial data analysis, from simple statistics through
variogram computation and modeling, to kriging and contouring. The programs
all use screen menus and a standard command structure for ease of use. While a
'black box' approach is strictly avoided, the software uses default options
wherever appropriate. Thus, in addition to providing a practical set of tools for
the site investigator, the package is expected to be very useful for introductory
geostatistical training and tutorials.
The software is written in Fortran 77, and is in the public domain. It
requires a relatively powerful PC configuration with 640 kilobytes of memory —
ideally an AT with a hard disk drive, math co-processor, and EGA graphics.
40
-------
SESSION: Poster Session
TITLE: Household Solvent Products: A National Usage Survey
AUTHORS: Mary Frankenberry, Patrick Kennedy, and Cindy Stroup, Exposure
Evaluation Division, Office of Toxic Substances, and Donna
Eisenhower, Paul Flyer, and John Rogers, Westat, Inc.
This study was conducted to provide usage information on 32 categories of
common household and automotive products which were thought to contain
methylene chloride or its potential substitutes. Respondents were selected using
a random digit dialing procedure, were contacted by telephone to obtain consent
and address, and were then sent a mail questionnaire which included product
pictures. Nonrespondents were followed up with a telephone interview. The
objective was to acquire usage statistics for each product that could be used to
calculate exposure assessments. These usage statistics included frequency of use,
duration and amount of use, location of use, brand used, and protective measures
undertaken while using the product. In general, respondents used an average of
seven of the thirty-two products in their lifetime and five during the last year.
Contact cements, superglues, and spray adhesives were used most frequently,
while particular automotive products were used least frequently. Duration of use
was longest for painting products (e.g., paint removers/strippers, adhesive
removers, and wood stains), varnishes, and finishes. Most respondents reported
having a window or door open but did not have the fan on while using products,
and most reported that they read directions on the product labels before use.
Finally, usage of the products tended to decrease with increasing age.
41
-------
SESSION: Poster Session
TITLE: Tools for Presenting Spatial and Temporal Patterns of Environmental
Monitoring and Effects Data
AUTHORS: L. Thomas Heiderscheit, Wilson B. Riggan, and John Creason, Health
Effects Research Laboratory, Research Triangle Park, Office of
Research and Development
The Health Effects Research Laboratory has developed this data presenta-
tion tool for use with a variety of types of data which may contain spatial and
temporal patterns of interest.
The technology links mainframe computing power to the new generation of
"desk-top publishing" hardware and software to produce publication-quality maps
and tables.
The Data Management System documentation, along with the cancer
mortality data used from U.S. Cancer Mortality Rates and Trends 1950-1979:
Volume IV Maps, will be available soon on tape from NTIS.
Plans are now being made for future work, including the publication of an
update to U.S. Cancer Mortality Rates and Trends 1950-1979: Volumes I-IV,
which will include data from 1980-85. Potential analysis of these data is being
explored. Suggestions are welcome.
42
-------
SESSION: Poster Session
TITLE: Orientation to Quality Assurance Management
AUTHORS: Kevin Hull, Office of Acid Deposition, Environmental Monitoring, and
Quality Assurance, Office of Research and Development, and Susan A.
Santo, JWK International Corporation
This poster will outline the Quality Assurance Management Staff's six-hour
workshop covering basic management issues associated with EPA's quality
assurance (QA) program. Through lecture, discussion, videotape, and class
exercises, participants learn how a QA program ensures that environmental data
collected can support EPA's decision-making needs. Topics include how QA
differs from QC (quality control), how to develop QA program and project plans,
the purpose of data quality objectives, and why management systems reviews are
crucial to the success of a QA program. For more information, call the QA
Management Staff at (202) 382-5780.
43
-------
SESSION: Poster Session
TITLE: Sampling Strategy for Network Design
AUTHORS: Jerry Jalkanen and Donald E. Myers, Department of Math, University
of Arizona; and George T. Flatman, Environmental Monitoring Systems
Laboratory, Las Vegas, Office of Research and Development
A sampling strategy is presented to attack the problem of expanding a field
data collection network for spatially correlated random processes that are known
to occur in hydrology.
The expansion of the collection network is formulated as an unconstrained
nonlinear minimization problem with the decision variables being the spatial
locations of the additional sample locations. Previous attempts have formulated
it as a combinatorial problem with new locations restricted to a pre-determined
grid.
The estimation error is a function of the additional sample locations
through the usage of a geostatistical model that assumes the functional forms of
both the mean function and the variogram function are known. It is assumed
that the first stage of sampling has allowed estimation of parameters associated
with the variogram model.
Examples are given to illustrate the major factors that influence the
selection of the new sample locations. A hypothetical case study is presented
that consists of expanding a water well network in the Wolfcamp aquifer of the
Palo Duro Basin, Texas.
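The formulation described above can be made concrete with the short Python
sketch below: given an existing network and an assumed exponential variogram, the
coordinates of one additional sample are chosen, as a continuous decision
variable, to minimize the average ordinary-kriging variance over a set of
prediction points. The network, variogram, and grid are invented; this is not the
authors' code, and for simplicity there is no guard against the new point landing
exactly on an existing well.

# Sketch: pick one new sample location by minimizing mean kriging variance.
import numpy as np
from scipy.optimize import minimize

def variogram(h, sill=1.0, rng_par=3.0):
    """Assumed exponential variogram (first-stage parameter estimation not shown)."""
    return sill * (1.0 - np.exp(-h / rng_par))

def ok_variance(xs, x0):
    """Ordinary-kriging variance at x0 given sample coordinates xs."""
    n = len(xs)
    a = np.ones((n + 1, n + 1))
    a[:n, :n] = variogram(np.linalg.norm(xs[:, None, :] - xs[None, :, :], axis=2))
    a[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(np.linalg.norm(xs - x0, axis=1))
    return float(np.linalg.solve(a, b) @ b)   # weights . gamma0 + Lagrange term

existing = np.array([[1.0, 1.0], [1.5, 8.0], [8.0, 2.0], [9.0, 9.0]])  # current wells
grid = np.column_stack([g.ravel() for g in np.meshgrid(np.linspace(0, 10, 11),
                                                       np.linspace(0, 10, 11))])

def objective(new_xy):
    """Mean kriging variance over the prediction grid with the new well added."""
    network = np.vstack([existing, new_xy])
    return np.mean([ok_variance(network, p) for p in grid])

result = minimize(objective, x0=np.array([5.0, 5.0]), method="Nelder-Mead")
print("Suggested new sample location:", np.round(result.x, 2))
print("Mean kriging variance:", round(result.fun, 4))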
NOTICE
Although the research described has been supported by the United States
Environmental Protection Agency through Cooperative Agreement CR811938 to
the University of Arizona, it has not been subjected to Agency review and,
therefore, does not necessarily reflect the views of the Agency and no official
endorsement should be inferred.
44
-------
SESSION: Poster Session
TITLE: Statistical Evaluation of Water Quality Trends
AUTHOR: Reta Roe, Environmental Services Division, Region VII
Trend evaluation determines whether water quality is improving or
degrading, and the rate of change. Because the parameters used to determine
water quality frequently depend on the season, flow, and/or sediment, it may be
necessary to adjust data for these constituents.
Region VII has compiled an extensive Standard Operating Procedure (SOP) and
software for conducting trend evaluations. This poster describes the capabilities
and options of that SOP.
Extrapolation of the statistical trends in historical data can aid in deter-
mining if and when beneficial uses of the water may change.
Step trend analysis can help determine if a change such as a new treatment
facility has had an impact on water quality.
The SOP and associated computer programs have been designed for ease of
usage and include simplified discussions of the statistics generated. Featured are
discussions of general procedures followed in the programs and an explanation of
the naming scheme. Each program includes detailed editing instructions, output
descriptions, and input and output examples. Necessary tables are also contained
in the package. A full length version of each program is included with
comments and echo print for debugging purposes, as well as an edited version.
Computer programs retrieve parameter data from the STORET data base and
place it into a SAS data set. The data set retrieval programs can retrieve a
single parameter along with flow or sediment, or can retrieve grouped parameters
(such as metals) along with flow or sediment. Various SAS procedures are then
used for analysis.
45
-------
PLOT.STATS uses fourteen different relationships between the chosen para-
meter as the dependent variable and flow or sediment as the independent
variable to establish and plot a graphical representation. Models used are linear,
log-linear, log-log, inverse, quadratic, log-quadratic, and eight different hyper-
bolics. Discussions in this section focus on mean and standard deviation, curve
modeling, normal distributions, t and F distributions and hypothesis testing. SAS
procedures used are MEANS, PLOT, and SYSREG.
FLOW.STATS performs additional statistical analyses, including residual
analysis on the model chosen in PLOT.STATS. Included are sections on univari-
ate analysis, first through fourth moments, and nonparametric testing. Nonpara-
metric tests used are the Kolmogorov-Smirnov and Shapiro-Wilk statistics.
Univariate plots used are stem-and-leaf, box, and normal probability plots. SAS procedures
used are MEANS, REG, PLOT, and UNIVARIATE.
TIME.STATS again uses the chosen model and examines the data for trends.
Plots are generated for flow, concentration of the parameter, and flow-adjusted
concentrations of the parameter against time (or sediment against time).
Discussions include autocorrelation and the Durbin-Watson statistic.
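The SOP itself is a set of SAS programs; the Python sketch below only illustrates
the general logic behind PLOT.STATS and TIME.STATS: fit a concentration-flow model
(here the log-log form, one of the SOP's candidate relationships), treat the
residuals as flow-adjusted concentrations, and test them for a monotone trend over
time with Kendall's tau. The monitoring record is simulated, not STORET data.

# Sketch: flow adjustment by log-log regression, then a Kendall tau trend test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Simulated monthly record: flow, and a concentration that depends on flow
# plus a slow downward trend and noise.
n = 120
time = np.arange(n)
flow = np.exp(rng.normal(3.0, 0.5, n))
conc = np.exp(1.0 - 0.3 * np.log(flow) - 0.002 * time + rng.normal(0.0, 0.2, n))

# Step 1: log-log concentration-flow model; residuals are flow-adjusted values.
slope, intercept, r, p, se = stats.linregress(np.log(flow), np.log(conc))
flow_adjusted = np.log(conc) - (intercept + slope * np.log(flow))

# Step 2: Kendall's tau of the flow-adjusted concentrations against time.
tau, p_trend = stats.kendalltau(time, flow_adjusted)
print(f"log-log slope: {slope:.3f}   Kendall tau vs. time: {tau:.3f}   p = {p_trend:.4f}")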
Frequently, flow and sediment will have a linear relationship. However,
they may not, and adjusting for both flow and sediment may be desirable. With
MATRIX.STATS, flow and sediment can be entered as the independent variables,
and the influences of both are examined. Using SAS/GRAPH, a three-dimensional
plot can be generated. Explanations in this section focus on the Pearson
product-moment correlation, limitations of regression analysis, covariance, the
correlation matrix, collinearity, model specification testing, parameter estimate
statistics, and influence. SAS procedures used are MEANS, CORR, and REG.
Flow-adjusted (or sediment-adjusted) data from the previous programs can
be further adjusted by seasonality, and non-parametric measures of correlation
used to analyze the data.
SEASON.STATS does seasonal adjustment on the data and recomputes
parameter estimates and residuals to reflect this adjustment. Discussions include
46
-------
the nonparametric measures of correlation: Spearman's rho, Kendall's tau, and
Hoeffding's D statistic. SAS procedures used are MEANS, REG, and CORR.
STEP.STATS adjusts data seasonally, analyzes the data when a change has
been made, and determines whether the change has had a positive or negative
impact. The nonparametric analyses of variance used include the Wilcoxon, the
median, the Van der Waerden, and the Savage tests. Also used is the Kruskal-Wallis
Chi-square approximation. SAS procedures used are MEANS, REG, CORR, and
NPAR1WAY.
Another section gives directions, explanations, and keys for using SAS/GRAPH
with all of the programs for enhanced output and plots. Printing enhancement
options include titles, fonts, and colors. Plotting enhancement options
additionally include patterns and symbols.
A special feature is a three-dimensional plot for the MATRIX.STATS
program. Additional SAS procedures used with the enhanced program are
G3GRID and G3D.
47
-------
SESSION: Poster Session
TITLE: In-House Graphics
AUTHOR: Nancy Sneath, Washington Information Center Consultant
Graphics are playing an increasingly important role in EPA's assimilation of
statistical data. Organizing facts and figures in an easy-to-understand manner is
not an easy task. The Washington Information Center (WIC), however, is
available to help statisticians understand and select graphics hardware and
software to create their own charts or presentation materials.
The focus of this session will be "In-House Graphics": what graphics you can
generate sitting at your desk. Topics will include: tips for selecting
software programs, output devices, speed considerations, color, costs, and
services available from WIC. The most popular graphics packages will be
demonstrated.
48
-------
SESSION: Poster Session
TITLE: Control Chart Strategy
AUTHORS: Thomas H. Starks, Environmental Research Center, University of
Nevada, Las Vegas, and George T. Flatman, Environmental Monitoring
Systems Laboratory, Las Vegas
Monitoring the groundwater around a waste impoundment site presents a
statistical problem in testing for compliance or leakage. Two types of errors are
present in statistical tests, and both are undesirable. If the test is designed for
early detection, there may be too many false positives. This means an out-of-
control state is declared when it does not exist and unnecessary increased
sampling is started. In contrast, if the test is designed for conservative
detections there may be too many false negatives. That means an in-control
state is declared when an out-of-control state exists and pollution takes place
undetected.
One commonly used statistical procedure that suggests an answer for this
type of problem is control chart methodology. This poster explains control chart
decision logic and applies two of the more promising chart types to actual data.
The Shewhart chart is designed to distinguish a large, on-or-off (binary) leak
from seasonal or random fluctuations. The CUSUM chart is designed to detect a
growing leak. The
optimum monitoring strategy uses a combination of these two to minimize false
positives and negatives.
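The Python sketch below shows the two chart rules applied to a simulated quarterly
monitoring record with a shift beginning at sample 20. The baseline mean and
standard deviation, and the decision limits (3-sigma Shewhart; k = 0.5, h = 5
CUSUM), are conventional textbook choices for illustration, not values from the
poster.

# Sketch: Shewhart and one-sided CUSUM rules on a simulated monitoring record.
import numpy as np

rng = np.random.default_rng(3)

baseline_mean, baseline_sd = 10.0, 1.0
data = rng.normal(baseline_mean, baseline_sd, 40)
data[20:] += 1.5 * baseline_sd                 # a modest upward shift (a "leak")

z = (data - baseline_mean) / baseline_sd       # standardized observations

# Shewhart rule: flag any single observation beyond 3 standard deviations.
shewhart_alarms = np.where(np.abs(z) > 3.0)[0]

# Upper CUSUM: accumulate exceedances over a slack of k = 0.5 sd and flag
# the first time the cumulative sum passes h = 5.
k, h = 0.5, 5.0
cusum, cusum_alarm = 0.0, None
for i, zi in enumerate(z):
    cusum = max(0.0, cusum + zi - k)
    if cusum > h:
        cusum_alarm = i
        break

print("Shewhart alarms at samples:", shewhart_alarms.tolist())
print("First CUSUM alarm at sample:", cusum_alarm)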
NOTICE
Although the research described in this article has been supported by the
United States Environmental Protection Agency, it has not been subjected to
Agency review and, therefore, does not necessarily reflect the views of the
Agency and no official endorsement should be inferred.
49
-------
SESSION: Practical Considerations for Agency Surveys
TITLE: National Survey of Pesticides In Drinking Water Wells
AUTHOR: James Boland, Hazard Evaluation Division, Office of Pesticide Programs
The National Pesticide Survey (NPS) is a stratified random survey of both
community water system (CWS) and rural domestic drinking water wells (DWS).
The domestic well portion of the Survey is a three-stage design with stratifi-
cation at the first two stages. The CWS component is a two-stage design with
stratification at the first stage. The Survey objectives include:
(1) to develop accurate estimates of pesticide contamination of drinking water
wells, both nationally and within domains of interest (e.g., subcounty
areas of high ground-water vulnerability and/or pesticide usage), as
illustrated in the sketch following this list; and
(2) to begin to explore potential associations between specific pesticide
occurrence in wells and factors such as high vulnerability, specific crop
types, co-occurrence of nitrate contamination, and pesticide use.
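To make objective (1) concrete, the Python sketch below computes a simple
design-weighted (stratified) estimate of the proportion of wells with a pesticide
detection. The strata, well counts, and sample results are invented placeholders,
not NPS data, and the variance formula is the elementary stratified-proportion
form; the Survey's actual multistage design would require a more elaborate
estimator.

# Sketch: stratified estimate of the proportion of wells with a detection.
import math

# Stratum definitions: population wells N, sampled wells n, detections d.
strata = {
    "high vulnerability / high use": {"N": 40_000,  "n": 120, "d": 9},
    "high vulnerability / low use":  {"N": 160_000, "n": 150, "d": 6},
    "low vulnerability / high use":  {"N": 300_000, "n": 130, "d": 3},
    "low vulnerability / low use":   {"N": 500_000, "n": 100, "d": 1},
}

N_total = sum(s["N"] for s in strata.values())
p_hat, var_hat = 0.0, 0.0
for s in strata.values():
    w = s["N"] / N_total                 # stratum weight
    p_h = s["d"] / s["n"]                # stratum detection proportion
    p_hat += w * p_h
    var_hat += w**2 * p_h * (1 - p_h) / (s["n"] - 1)   # finite population correction ignored

se = math.sqrt(var_hat)
print(f"Estimated proportion of wells with a detection: {p_hat:.4f} (SE {se:.4f})")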
This presentation will focus on several communication/outreach issues which
have proven to be both critically important to the success of the project and
very resource intensive. Specific issues to be discussed will include: needs for
outreach; need to keep Regional, State, and local units of government well
informed; need for involving statisticians early in the project planning to
properly articulate project objectives (i.e., do not "oversell" the Survey); and the
need to translate statistical details (a.k.a., "mumbo-jumbo") to the nonstatis-
ticians. The successful role of a pilot study, conducted in three States, will also
be discussed. Other issues, such as sample frame problems (differing needs for
accuracy for the program vs. a statistical survey) and multifaceted statistical
support needs for a major survey, will also be discussed, if time permits.
50
-------
SESSION: Practical Considerations for Agency Surveys
TITLE: Hazardous Waste Surveys: Problems In Development and
Implementation
AUTHOR: Jim Craig, Water Management Division, Office of Solid Waste
The Office of Solid Waste (OSW) is undertaking a major effort to improve
the quality of information used in regulatory development. Two major hazardous
waste surveys — the Treatment, Storage, Disposal, and Recycling Facility (TSDR)
Survey and the Hazardous Waste Generator Survey -- are the cornerstone of this
effort. The surveys cover hazardous waste generation and management of
facilities regulated under the Resource Conservation and Recovery Act of 1976.
Together they will provide comprehensive information for use in regulatory
development.
Many problems complicated the development and implementation of these
surveys. The major problems were developing the sample frame before con-
ducting the survey and communications throughout the survey process. Commun-
ications problems included determining and describing OSW's data needs, develop-
ing questions to meet these needs, ensuring that respondents interpret the
questions as intended, and ensuring that analysts using data do not misinterpret
or misrepresent the data. This presentation covers some of these problems and a
strategy for overcoming them.
51
-------
SESSION: Guest Presentation
TITLE: Statistics versus Statistics
AUTHOR: Leo Breiman, Professor of Statistics, University of California, Berkeley
There are today two cultures in statistics. The first emanates from the
universities and is concerned with models, techniques, asymptotics, etc. The
second is the domain of the working statistician concerned with problem
formulation, data gathering designs, implementation of the design, assessment and
understanding of the data, and analysis of the data.
The first culture often has a pernicious influence on the second, tending to
substitute inapplicable models and techniques for the exercise of intelligence and
skepticism. Some of the aspects of these two cultures and their relationship will
be outlined and illustrated.
52
-------
Conference Facilities
[Floor plans of the conference hotel meeting areas, showing the parlors, the
Yorktown room, the Williamsburg Parlor, the Jamestown Parlor, and the Berkley
Room on Lower Levels A and B, with the elevators and rest rooms, and with
meeting rooms marked by session day (Thursday/Friday).]
-------
Williamsburg Dining
Royce Hotel
The Colony Room
Sir Francis Drake's
415 Richmond Road
1 Paul's Deli and Pizza
761 Scotland Street
229-8976
2 Greenleaf Cafe
Scotland Street
3 College Deli
Richmond Road
4 Sakura (Japanese)
Scotland Street
5 Trellis
Duke of Gloucester Street
229-8610
6 Berret's Seafood
199 S. Boundary Street
253-1847
7 Short Stop Cafe
Jamestown Road
8 Chowning's Tavern
Duke of Gloucester Street
9 King's Arms Tavern
Duke of Gloucester Street
10 Christiana Campbell's Tavern
Duke of Gloucester Street
11 Tusks Restaurant & Lounge
Ramada Inn
York Street
*12 Nick's Seafood Pavilion
Yorktown
887-5269
*13 Captain George's Seafood
Richmond Road
* Not within walking distance.
54
-------