United States Environmental Protection Agency
Health Effects Research Laboratory
Research Triangle Park, NC 27711
EPA-600/2-83-036
May 1983
Research and Development
Quality Assurance Guidelines for
Environmental Health Effects Research
-------
EPA-600/2-83-036
May 1983
QUALITY ASSURANCE GUIDELINES FOR
ENVIRONMENTAL HEALTH EFFECTS RESEARCH
by
P. A. Cunningham
K. W. Gold
L. E. Myers
N. H. Sexton
EPA Contract No. 68-02-3226
EPA Project Officer: Ferris B. Benson
Health Effects Research Laboratory
U.S. Environmental Protection Agency
Research Triangle Park, North Carolina 27711
-------
NOTICE
This document has been reviewed in accordance with
U.S. Environmental Protection Agency policy and
approved for publication. Mention of trade names
or commercial products does not constitute endorse-
ment or recommendation for use.
ii
-------
FOREWORD
The many benefits of our modern, developing, industrial society are accompanied
by certain hazards. Careful assessment of the relative risk of existing and new
man-made environmental hazards is necessary for the establishment of sound regulatory
policy. These regulations serve to enhance the quality of our environment in order
to promote the public health and welfare and the productive capacity of our Nation's
population.
The complexities of environmental problems originate in the deep interdependent
relationships between the various physical and biological segments of man's natural
and social world. Solutions to these environmental problems require an integrated
program of research and development using input from a number of disciplines. The
Health Effects Research Laboratory, Research Triangle Park, NC and Cincinnati, OH
conducts a coordinated environmental health research program in toxicology, epidemi-
ology and clinical studies using human volunteer subjects. Wide ranges of pollutants
known or suspected to cause health problems are studied. The research focuses on
air pollutants, water pollutants, toxic substances, hazardous wastes, pesticides,
and non-ionizing radiation. The laboratory participates in the development and
revision of air and water quality criteria and health assessment documents on
pollutants for which regulatory actions are being considered. Direct support to
the regulatory function of the Agency is provided in the form of expert testimony
and preparation of affidavits as well as expert advice to the Administrator to
assure the adequacy of environmental regulatory decisions involving the protection
of the health and welfare of all U.S. inhabitants.
The quality of the data resulting from this research is an overriding factor in
determining their usefulness to EPA. In recognition of the importance of data quality
assurance, our Laboratory instituted an active, comprehensive program to coordinate
the development and implementation of effective quality assurance planning into all
research within the Laboratory. More recently, the Administrator has required quality
assurance for all environmentally related measurement activities supported by the Agency.
This substantially enhances the quality assurance aspects of our own research measure-
ments.
This document represents the current statement of our effort. I am confident that
full implementation of our data quality assurance policy, with the help of the guideline
manuals and the increased application of quality assurance principles, will enhance the
scientific merit of our research program.
F. Gordon Hueter, Ph.D.
Director
Health Effects Research Laboratory
iii
-------
CONTENTS
Section Page
Foreword iii
Figures vi
Tables vi
1 Abstract 1
2 Introduction 2
2.1 Laboratory Objectives 2
2.2 Background 3
2.3 Purpose 4
2.4 Definitions 4
2.4.1 Quality 5
2.4.2 Quality Control 5
2.4.3 Quality Assurance 5
2.4.4 Environmental or Environmentally Related
Measurement 5
2.4.5 Project 5
2.4.6 Protocol 6
2.4.7 Quality Assurance Program Plan 6
2.4.8 Quality Assurance Project Plan 6
2.4.9 Quality Assurance Officer 6
2.4.10 Project Officer 6
2.4.11 Contracting Officer 6
2.4.12 Quality Assurance Performance Audit 6
2.4.13 Quality Assurance Systems Audit 7
2.5 References 7
3 Management Policy 9
3.1 Quality Assurance Program Goals 9
3.2 Quality Assurance Policies 10
3.3 Quality Assurance Program Organization 11
3.3.1 Organizational Structure for Quality
Assurance 12
3.3.2 Assignment of Responsibilities 12
3.4 References 17
4 Quality Assurance Guidelines for Project Management 19
4.1 QA Requirements for HERL Projects Involving
Environmental Measurements 19
4.2 Elements of a QA Project Plan 27
4.2.1 Identification 27
4.2.2 Project Description 36
4.2.3 Project Organization and Responsibility 38
iv
-------
CONTENTS (continued)
Section Page
4.2.4 Facilities, Services, Equipment, and
Supplies 40
4.2.5 QA Objectives for Measurement Data 46
4.2.6 Sample Collection 51
4.2.7 Sample Custody 55
4.2.8 Calibration 60
4.2.9 Sample Analysis 66
4.2.10 Recordkeeping 68
4.2.11 Data Management 70
4.2.12 Internal QC Checks 80
4.2.13 External Quality Assurance for Research
Projects 88
4.2.14 Preventive Maintenance 89
4.2.15 Specific Routine Procedures for Assessing
Data Quality 91
4.2.16 Feedback and Corrective Action 91
4.2.17 QA Reports to Management 93
4.3 References 93
Appendixes
A Selected National Bureau of Standards Standard
Reference Materials 99
B Example of Calculations That Should Be Provided
for Estimates of Imprecision and Bias 115
-------
FIGURES
Number Page
3-1 Functional management structure, HERL 13
3-2 Interactions of the QA organization with other
HERL management 14
4-1 Checklist for project-level QA/QC planning and evaluation ... 24
4-2 Summary of EPA's proposed GLPs for health
effects research 28
4-3 Sample title page for intramural QA Project Plan 32
4-4 Sample title page for extramural QA Project Plan 34
4-5 Sample table of contents for QA Project Plan 35
4-6 Sample project organization chart 39
4-7 Sample flow diagram for sample collection
and analysis 54
4-8 Sample chain-of-custody record 57
4-9 Minimum technical report content for EPA health
effects tests 79
4-10 Sample X control chart 86
TABLES
Number Page
4-1 Format for Summarizing Precision, Accuracy, and
Completeness Objectives 48
-------
SECTION 1
ABSTRACT
This document is a statement of the quality assurance (QA) policy of
the Health Effects Research Laboratory (HERL), U.S. Environmental Protection
Agency, Research Triangle Park, North Carolina. It describes the HERL QA
organization and the QA responsibilities of both management and technical
research personnel in relation to the mandatory Agency QA policy and project
data quality requirements. It provides guidelines for managers in the
implementation of Agency QA policy and evaluation of research documentation,
and presents guidelines for project officers for (1) specification of QA
requirements for extramural and intramural tasks, (2) preparation of QA
Project Plans and research protocols for intramural research and support
tasks, (3) review and evaluation of QA Program and Project Plans for ex-
tramural projects, and (4) review and evaluation of data quality throughout
the project term. Aspects of research projects that must be considered by
project officers in the development or review of QA plans are treated in
detail. These guidelines are reviewed and revised annually by the HERL QA
officer, the QA Committee, and HERL division directors.
-------
SECTION 2
INTRODUCTION
2.1 LABORATORY OBJECTIVES
The U.S. Environmental Protection Agency's (EPA) Health Effects
Research Laboratory (HERL) at Research Triangle Park, North Carolina,
conducts an extensive research program to evaluate the health effects of
chronic and acute exposures to environmental pollutants, including air and
water pollutants, pesticides, toxic substances, and nonionizing radiation.
This research program presently involves two distinct areas: bioassays and
human exposure studies.
Bioassays employ living organisms or living parts of organisms to
measure or assay the effects of a test substance or mixture of test sub-
stances. In vivo bioassays use whole animals (e.g., rats, fish, monkeys)
and in vitro bioassays use cultured tissues or individual cells (e.g.,
rodent lungs, white blood cells, bacteria). Specific areas of bioassay
research conducted by HERL include:
Investigations of acute and chronic effects of low level exposure of
whole animals to environmental pollutants, including those to quantify
the absorption, distribution, storage, mobilization, biotransformation,
and excretion of environmental pollutants. The effects of pollutants
upon metabolic processes in selected species are included. Effects
selected relate to risk factors for acute illness, chronic disease, and
reproduction. Emphasis is placed upon discerning which portions of
the population may be especially susceptible to pollutant-induced or
pollutant-aggravated disorders.
Investigations of the effects of environmental pollutants upon cellular
and organ model systems of human disease. Models of specific acute or
chronic diseases include in vivo and in vitro models for infectious,
neoplastic, and other noninfectious conditions. The morphologic and
functional integrity of cellular and organ systems is used to clarify
effects of environmental pollutants.
Investigations in the fields of mutagenesis, carcinogenesis, cellular
toxicology, and metabolism. The mutagenic and oncogenic potential of
agents of environmental concern, including pure chemicals and complex
-------
environmental mixtures, is evaluated through the stepwise application
of bioassay methodologies including short-term prescreening tests,
confirmatory short-term in vitro bioassays, and established in vivo
whole animal mutagenesis and carcinogenesis bioassays.
Investigations of the functional effects of environmental pollutants,
toxic substances, and nonionizing radiation on the central and peripheral
nervous systems, including normal behavior patterns, electrophysiologic
responses, neuroendocrine status, and appropriate performance testing.
Neurophysiologic parameters chosen are those that most nearly model the
relevant human experiences.
Human studies involve controlled exposure of human volunteers to sub-
acute concentrations of test substances to evaluate changes in physiological,
biochemical, metabolic, neurological, and behavioral responses.
2.2 BACKGROUND
Because of an increased awareness of the serious health effects of
environmental pollutants and of the need for adequate data quality to sup-
port risk assessment and control strategies, HERL management initiated a
Laboratory-wide data quality program in May 1976 with the issuance of a for-
mal Quality Assurance (QA) Plan.1 Subsequently, a QA officer was appointed
and a Laboratory QA Committee was established for the purpose of designing
and implementing a Laboratory-wide QA Program. QA guidelines have since
been developed for management policy,2 research task planning,3 and environ-
mental pollutant measurements.4,5,6 Guidelines for QA in selected areas of
biological research are currently being developed to continue the QA Program
at HERL.
The HERL QA Program is further supported by EPA's recent commitment to
a mandatory Agency-wide QA Program. Agency policy initiated by the Administra-
tor in memoranda of May 30, 1979,7 and June 14, 1979,8 requires participation
in a centrally managed QA Program by all EPA Laboratories, Program Offices,
Regional Offices, and those monitoring and measurement efforts supported or
mandated through contracts, regulations, or other formalized agreements.
The Office of Research and Development (ORD) is responsible for developing,
coordinating, and directing implementation of the Agency QA Program. Within
the ORD, this responsibility has been delegated to the Quality Assurance
Management Staff (QAMS) of the Office of Monitoring Systems and Technical
Support.
-------
To implement Agency policy, EPA Laboratories, Program Offices, Regional
Offices, and all extramural contractors, grantees, or cooperating institu-
tions are required to prepare QA Program Plans covering all monitoring and
measurement activities that generate and process environmentally related
data for Agency use.9,10 A QA Program Plan is a written document that
presents in general terms the overall policies, organization, objectives,
and functional responsibilities within the organization designed to achieve
specified data quality goals of a particular organization (e.g., EPA Labora-
tory, Program Office, Regional Office, contracting organization). A QA
Program Plan has been developed for HERL and approved by QAMS. This plan is
reviewed and revised annually by the QA officer, the Laboratory director,
and the QA Committee.
Agency policy also requires that a QA Project Plan or narrative
statement be prepared for each environmentally related measurement project
(or group of similar projects) conducted or supported by the Agency
detailing the policies, organization, objectives, functional activities, and
specific quality assurance and quality control (QC) activities designed to
achieve data quality goals or requirements of the program.9-18 The QA
Project Plan must address procedures used to routinely assess precision,
accuracy, completeness, representativeness, and comparability of the data
produced.
To facilitate compliance with Agency QA requirements, the QAMS has
issued a series of QA guidelines documents for all intramural and extramural
tasks.9-18 These documents are summarized in Section 4.1.
2.3 PURPOSE
The purpose of this document is to provide guidance to HERL managers
and project officers in the development, implementation, and evaluation of
project-level QA programs. Particular emphasis is placed on development and
evaluation of QA Project Plans in accordance with current Agency requirements.11
2.4 DEFINITIONS
Several terms related specifically to health effects research data
quality and to the HERL QA Program are defined below as they are used in
this document. More complete glossaries of data quality terminology can be
found in References 4 and 19.
-------
2.4.1 Quality
Quality is the totality of characteristics of research data that bear
on their ability to satisfy previously specified criteria. For labora-
tory measurement systems, accuracy, precision, representativeness, and
comparability are of major importance. Completeness is appropriately
applied to larger systems, such as air monitoring networks. A detailed
discussion of these data quality parameters is given in Section 4.2.5.
2.4.2 Quality Control
Quality control is a system of activities designed to achieve and
maintain a previously specified level of quality in data collection,
processing, and reporting. Quality control is performed by the task or
project personnel. QC activities include control or correction for all
variables affecting data quality (see Section 4.2.12).
2.4.3 Quality Assurance
Quality assurance is an external program of planned, systematic activi-
ties that are necessary to provide assurance that specified data quality
criteria are achieved for any given project. The QA program involves a
continuing evaluation of the adequacy and effectiveness of the overall
QC program with provision for initiation of corrective action measures
where necessary. QA activities consist of: (1) quantitative measure-
ments, such as interlaboratory tests or performance audits, and (2) qual-
itative measures, such as site visits or systems audits, to evaluate
the capability of a total measurement system to provide specified
quality data.
2.4.4 Environmental or Environmentally Related Measurement
The term "environmental (or environmentally related) measurement"
applies to all field and laboratory investigations that generate data
involving the measurement of chemical, physical, or biological parame-
ters in the environment, such as determining the presence or absence of
priority pollutants in waste streams; health and ecological effects
studies; clinical and epidemiological investigations; engineering and
process evaluations; studies involving laboratory simulation of environ-
mental events; and studies or measurements on pollutant transport and
fate, including diffusion models.
2.4.5 Project
A project is any task or group of tasks encompassed in a HERL work
plan, intramural or extramural, that produces or uses environmentally
related data.
-------
2.4.6 Protocol
The term protocol includes all task or project planning documents.
Specifically included are research plans, support activity procedure
statements, extramural work plans, scopes-of-work, and QA Project
Plans.
2.4.7 Quality Assurance Program Plan
A QA Program Plan is a written document that presents in general terms
the overall policies, organization, objectives, and functional respon-
sibilities (within the organization) designed to achieve specified data
quality goals of a particular organization (e.g., EPA Laboratory,
Program Office, Regional Office, contracting organization).
2.4.8 Quality Assurance Project Plan
A QA Project Plan is a written document that details the policies,
organization, objectives, functional activities, and specific QA and QC
activities designed to achieve data quality goals or requirements of a
specific measurement program. The QA Project Plan must address proce-
dures used to routinely assess precision, accuracy, completeness,
representativeness, and comparability of the data produced.
2.4.9 Quality Assurance Officer
The QA officer is that individual who is assigned the responsibility
for overview and guidance of the QA Program for an organization or for
a specific project. Organizationally, the QA officer should be in a
position to provide independent and objective evaluation and assessment
of the effectiveness of the QA Program and to provide timely feedback
and recommendations.
2.4.10 Project Officer
The project officer is that individual who is assigned overall respon-
sibility for a project from inception through completion. This respon-
sibility covers both technical and QA aspects of the project.
2.4.11 Contracting Officer
The contracting officer is that individual who is assigned the respon-
sibility for ensuring that contracting is done as authorized by law and
regulation.
2.4.12 Quality Assurance Performance Audit
A QA performance audit is a quantitative analysis or check made with a
material or device of known properties or characteristics to determine
the accuracy of a measurement system. Performance audits may require
the identification or the quantitation of specific elements or compounds
or both.
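As an informal illustration only (not part of the Agency definition above),
the accuracy check in a performance audit often reduces to comparing the
measurement system's result for a reference material against its certified
value. The short Python sketch below uses a hypothetical helper name and
invented numbers to show one common way such a comparison can be expressed
as percent bias.

    # Illustrative sketch only: "percent_bias" is a hypothetical helper and the
    # values are invented for the example; they are not taken from this document.
    def percent_bias(measured, reference):
        """Percent deviation of a measured result from a certified reference
        value, one common way to express performance-audit accuracy."""
        return 100.0 * (measured - reference) / reference

    # A reference material certified at 50.0 ug/m3 is reported as 47.8 ug/m3.
    print(f"Percent bias: {percent_bias(47.8, 50.0):+.1f}%")   # Percent bias: -4.4%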
-------
2.4.13 Quality Assurance Systems Audit
A QA systems audit consists of a systematic, onsite, qualitative review
of facilities, equipment, training, procedures, recordkeeping, data
validation, data management, and reporting aspects of the total measure-
ment system. A QA systems audit may be required (1) to assess, prior
to project initiation, the capability of a measurement system to gen-
erate data of the required quality, or (2) to determine compliance of
an ongoing project with specified QA requirements.
2.5 REFERENCES
1. U.S. Environmental Protection Agency, Health Effects Research Labora-
tory, Quality Assurance Plan, Research Triangle Park, NC, May 1976.
2. U.S. Environmental Protection Agency, Health Effects Research Labora-
tory, Management Policy for the Assurance of Research Quality, EPA-600/
1-77-036, Research Triangle Park, NC, 1977.
3. U.S. Environmental Protection Agency, Health Effects Research Laboratory,
Development of Quality Assurance Plans for Research Tasks, EPA-600/1-78-
012, Research Triangle Park, NC, 1978.
4. U.S. Environmental Protection Agency, Quality Assurance Handbook for
Air Pollution Measurement Systems, Vol. I - Principles, EPA-600/9-76-005,
Research Triangle Park, NC, March 1976.
5. U.S. Environmental Protection Agency, Quality Assurance Handbook for
Air Pollution Measurement Systems, Vol. II - Ambient Air Specific Methods,
EPA-600/4-77-027a, Research Triangle Park, NC, May 1977.
6. U.S. Environmental Protection Agency, Quality Assurance Handbook for
Air Pollution Measurement Systems, Vol. Ill - Stationary Source Specific
Methods, EPA-600/4-77-027b, Research Triangle Park, NC, August 1977.
7. U.S. Environmental Protection Agency, Environmental Protection Agency
(EPA) Quality Assurance Policy Statement, Administrator's Memorandum,
May 30, 1979.
8. U.S. Environmental Protection Agency, Quality Assurance Requirements
for All EPA Extramural Projects Involving Environmental Measurements,
Administrator's Memorandum, June 14, 1979.
9. U.S. Environmental Protection Agency, Strategy for the Implementation
of the EPA's Mandatory Quality Assurance Program: FY1980 and FY1981,
QAMS-001/80, Office of Research and Development, Washington, DC, March
1980.
10. U.S. Environmental Protection Agency, Guidelines and Specifications
for Preparing Quality Assurance Program Plans, QAMS-004/80, Office of
Research and Development, Washington, DC, April 1980.
-------
11. U.S. Environmental Protection Agency, Interim Guidelines and Specifica-
tions for Preparing Quality Assurance Project Plans, QAMS-005/80,
Office of Research and Development, Washington, DC, December 1980.
12. U.S. Environmental Protection Agency, Guidelines and Specifications for
Implementing Quality Assurance Requirements for EPA Contracts and Inter-
agency Agreements Involving Environmental Measurements, QAMS-002/80,
Office of Research and Development, Washington, DC, May 1980.
13. U.S. Environmental Protection Agency, Quality Assurance Requirements
for Contracts over $10,000, Acting Director's Memorandum, Quality
Assurance Management Staff, March 29, 1982.
14. U.S. Environmental Protection Agency, Guidelines and Specifications for
Implementing Quality Assurance Requirements for Research Grants Involv-
ing Environmental Measurements, QAMS-003/80/01, Office of Research and
Development, Washington, DC, April 1981.
15. U.S. Environmental Protection Agency, Guidelines and Specifications for
Implementing Quality Assurance Requirements for Demonstration Grants and
Cooperative Agreements Involving Environmental Measurements, QAMS-003/
80/02, Office of Research and Development, Washington, DC, April 1981.
16. U.S. Environmental Protection Agency, Guidelines and Specifications for
Implementing Quality Assurance Requirements for State and Local Assist-
ance Grants Involving Environmental Measurements, QAMS-003/80/03,
Office of Research and Development, Washington, DC, April 1981.
17. U.S. Environmental Protection Agency, Quality Assurance Requirements
for Financial Assistance, Acting Director's Memorandum, Quality
Assurance Management Staff, March 30, 1982.
18. U.S. Environmental Protection Agency, Quality Assurance Requirements
for Cooperative Agreements, Acting Director's Memorandum, Quality
Assurance Management Staff, July 16, 1982.
19. The American Society for Quality Control, Glossary and Tables for Statis-
tical Quality Control, J. E. Jackson and R. A. Freund, eds., Milwaukee,
WI, 1973.
-------
SECTION 3
MANAGEMENT POLICY
It is the policy of the Health Effects Research Laboratory (HERL),
Research Triangle Park, that all quality assurance (QA) activities will be
carried out in accordance with Agency mandates and guidance specified by the
Quality Assurance Management Staff (QAMS) of the Office of Research and
Development (ORD). Planning for the incorporation of suitable QA measures
into measurement activities is the responsibility of project officers. It
is the responsibility of management to ensure that all project-related
documents or plans incorporate adequate QA measures. It is also management's
responsibility to ensure that QA Project Plans are implemented and that
project data are of adequate and documented quality. The HERL QA organ-
ization, consisting of a QA Committee chaired by the QA officer, is available
to all Laboratory technical and management personnel for consultation or
active participation in development and review of QA Project and Program
Plans.
3.1 QUALITY ASSURANCE PROGRAM GOALS
The goal of the HERL QA Program is to ensure, assess, and document the
quality of laboratory and field data used by EPA in evaluating the health
effects of pollutants and in developing adequate control strategies.
Specific objectives of the HERL QA Program are to:
Establish a mechanism to ensure that QA Project Plans are developed
and implemented for each intramural project (or group of similar
projects). This will be done at the division, branch, section, or
group level, as appropriate.
Establish a mechanism to ensure that research protocols are pre-
pared for all new intramural projects. These research protocols
shall address project-specific QC procedures for each measurement
method.
Establish a mechanism to ensure that QA Project Plans or narrative
statements and QA Program Plans are prepared for all new extramural
projects, as required by QAMS.
-------
Review and approve all QA Project Plans, QA Program Plans or
narrative statements, and intramural research protocols for new
tasks. Recommend revisions and ensure that such revisions are
made prior to approval and initiation of project activities.
Establish a mechanism to ensure that written standard operating
procedures (SOPs) are developed for all routine measurement methods
and research activities.
Establish a mechanism to ensure that SOPs, QA Project Plans, QA
Program Plans, and other QA documentation are maintained under a
system of document control.
Establish and implement peer review procedures for planning,
ongoing measurement activities, and technical publications.
Maintain a central file of information on available reference and
audit materials and devices for HERL measurement methods.
Maintain written QA guidelines to assist HERL in the development
of general and specific, current and future HERL research.
Continue a program of educating all personnel within HERL in the
basic concepts of quality assurance through QA workshops.
Implement requirements that all reported data include quality
estimates of precision, accuracy or bias, completeness, and, when
appropriate, representativeness and comparability.
Develop feedback and corrective action mechanisms for routinely
identifying data quality problem areas, alerting management to
them, and evaluating the solutions to such problems.
Monitor the operational performance of HERL through appropriate
intralaboratory and interlaboratory quality evaluation programs.
Conduct systems and performance audits of ongoing HERL projects,
as required.
Identify additional routine measurement methods for inclusion in
the HERL QA audit program.
Establish a procedure for regular QA reports to HERL management
and QAMS.
3.2 QUALITY ASSURANCE POLICIES
It is the policy of HERL that the Laboratory QA Program will be appro-
priate to ensure that all data collected or used are of documented quality.
HERL QA Program requirements cover all activities supported or
10
-------
required by HERL that generate or use environmentally related measurement
data. This includes all funded projects—intramural and extramural, con-
tract, grant, and cooperative and interagency agreement. Laboratory QA
policy requires that QA considerations be included in all requests for pro-
posals (RFPs), research proposals and evaluations, work plans, project
plans, and project reports.
In accordance with Agency policy, every environmental measurement
project (or group of similar projects) planned, conducted, or sponsored by
HERL must have a QA Project Plan or narrative statement approved by the QA
officer.1-10 Specifically, the QA Project Plan ensures that: (1) the level
of data quality needed will be determined and stated before the data collec-
tion effort begins; and (2) all data generated and reported will be of the
quality and integrity established by the QA Project Plan. For intramural
tasks, QA Project Plans are prepared for each functional organizational unit
(i.e., division, branch, section, or group) within which all tasks may
appropriately be regarded as a single measurement project; these plans are
then included by reference in the research or support task protocols required
for each intramural task (see Section 4.1.1). For extramural projects, QA
Project Plans or narrative statements are prepared on an individual task or
project basis. In addition, all extramural projects involving environmental
measurements must include a QA Program Plan or a statement of overall QA
policy.2,4-11
Systems and performance audits and interlaboratory/interfield compari-
son studies will be conducted on measurement projects within HERL at the
direction of the QA officer, the project officer, and/or the appropriate
division director to assess the adherence to, and adequacy of, approved QA
Project Plans and to assess the need for corrective action. External audits
may also be conducted at the request of QAMS.
3.3 QUALITY ASSURANCE PROGRAM ORGANIZATION
To support project officers and management in the development and
implementation of appropriate QA programs to ensure adequate data quality,
the HERL QA organization is interwoven with the existing HERL management.
The structure of the QA organization, the functional responsibility of QA
personnel, and the lines of communication for achieving a cost-effective
Laboratory QA Program are discussed below.
11
-------
3.3.1 Organizational Structure for Quality Assurance
The HERL management structure is shown in Figure 3-1. All QA manage-
ment responsibilities are assigned to the QA officer. The independence and
objectivity of the QA Program are supported by the QA officer's organiza-
tional independence from all divisions involved in the data generation process.
All divisions within HERL are covered by QA Program requirements. Interfacing
of the QA organization with various levels of management is shown in Figure
3-2.
3.3.2 Assignment of Responsibilities
Although each individual involved in the generation of data is implic-
itly a part of the HERL QA organization, certain individuals have specific,
assigned QA responsibilities. Refer to Figure 3-2 in the following discus-
sion.
3.3.2.1 HERL Laboratory Director--
The HERL Laboratory director has overall responsibility for all Labora-
tory activities, including quality assurance. Because the success of the QA
Program ultimately depends on the director's full support, it is his/her
responsibility to enlist and encourage the cooperation of all HERL personnel
in the program.
3.3.2.2 Quality Assurance Officer--
The QA officer has primary responsibility for all Laboratory QA activ-
ities and reports directly to the HERL Laboratory director. His/her
responsibilities include the development, implementation, evaluation, and
documentation of QA policy and procedures appropriate to the Laboratory
objectives. This includes development of audit programs for Laboratory
measurement programs, evaluation of the cost effectiveness of QA programs
and plans, and recommendations for their improvement.
As advisor to the Laboratory director, the QA officer regularly reports
on the status of the Laboratory QA Program, identifies specific needs (e.g.,
methods development and problem areas), and recommends specific courses of
action for strengthening the program.
As chairman of the QA Committee, the QA officer initiates development
of Laboratory-wide QA guidelines and procedures. He/she coordinates efforts
12
-------
[Figure 3-1 is an organization chart of the Health Effects Research
Laboratory. Under the Office of the Director (with the Research Coordination
Office, which carries the Quality Assurance function, and the Program
Operations Office) are the Inhalation Toxicology, Genetic Toxicology,
Neurotoxicology, Developmental Biology, Biometry, Toxicology and
Microbiology, and Experimental Biology Divisions, each subdivided into
branches and sections.]
Figure 3-1. Functional management structure, HERL.
-------
[Figure 3-2 is a diagram relating the HERL QA organization (QA officer, QA
Committee, and QA coordinators) to the functional management chain (HERL
director, division director, branch chief, section chief, group leader,
decision unit coordinator, and project officer); the legend distinguishes
functional management authority from QA authority/consulting.]
Figure 3-2. Interactions of the QA organization with other HERL management.
14
-------
to develop QA procedures and materials for specific HERL research techniques
and assesses data provided by the QA Committee regarding evaluation of the
QA Program.
As QA consultant, the QA officer is available to consult with and
recommend to the HERL professional staff (project officers, investigators,
etc.) appropriate and necessary QA methods and plans for ensuring the quality
of the research data produced.
Finally, since motivation of personnel is a critical factor in the
success of the Laboratory QA Program, a major responsibility of the QA
officer is to ensure that all personnel are fully aware of the scope and
objectives of the Laboratory QA Program and understand the importance of
their roles to the overall success of the program.
3.3.2.3 Quality Assurance Coordinator/Committee--
Each division director designates one or two divisional representatives
to serve on the HERL QA Committee. These QA coordinators recommend and
review proposals for improvements in QA policies and procedures and report
and evaluate potential data quality problem areas. Within their respective
divisions, the QA coordinators consult on matters of quality assurance,
serve as a primary source of information on research QA considerations,
review contract proposals for QA aspects, and help to implement the Labora-
tory QA Program.
The QA Committee serves as an advisory committee to the Laboratory
director. Specifically, the Committee's functions include assisting in the
evaluation and refinement of data quality objectives of the QA Program to
ensure that they meet the Laboratory needs with minimum disruption of exist-
ing workloads and procedures, reviewing recommendations presented to the
Committee, and assessing the effectiveness of the QA Program.
3.3.2.4 Quality Assurance Staff--
The QA staff is composed of HERL investigators who hold primary
appointments in other research areas at HERL and who serve as technical
consultants to the QA officer and other HERL investigators. They are
responsible for providing the necessary technical expertise to ensure adequate
implementation and review of the Laboratory QA Program. In particular, they
are available to advise project officers and management on the technical
15
-------
aspects of specific task activities that affect overall data quality (e.g.,
methods development) and to assist the QA officer in the development, imple-
mentation, and evaluation of audit programs for both Laboratory-wide and
task-specific measurement techniques.
3.3.2.5 Quality Assurance Contractors/Consultants--
External QA contractors/consultants assist in implementing major parts
of the HERL QA Program. Their responsibilities include development and/or
evaluation of Laboratory QA guidelines and plans, development and implementa-
tion of Laboratory-wide audit programs, methods development, and provision
of analytical services.
3.3.2.6 Decision Unit Coordinator--
Evaluation of the need for specific types of research is an essential
part of the Laboratory QA Program. The decision unit coordinator is the
functional manager for each general program area and, as such, is respon-
sible for distributing resources (i.e., funding and manpower) and for evalu-
ating the relevance of each proposal (and project) in his/her program area
to the overall Laboratory and Agency goals. In addition, he/she is respon-
sible for identifying impending needs in specific areas of research and
encouraging proposals for new projects in these areas, thus aligning the
production of research data with the projected Laboratory and Agency needs.
3.3.2.7 Functional Managers--
Functional managers (i.e., division director, branch chief, section
chief, group leader) are responsible for ensuring the quality of all research
data produced under their direction. This responsibility includes development
of QA Project Plans for their organizational unit, as appropriate (see
Section 3.2); review and approval of intramural research protocols; review
and approval of extramural proposals, work plans, QA Project Plans or narrative
statements, and QA reports; and periodic evaluation of all ongoing QA pro-
grams. Peer review of QA programs by the QA Committee may also be requested
by management.
3.3.2.8 Project Officer--
As task manager, the project officer is responsible for ensuring that
the technical and administrative requirements of each task are met. It is
his/her responsibility to adequately ensure and document the quality of the
16
-------
task results, including both the research data and conclusions. The project
officer draws upon his/her professional training and expertise, in collabora-
tion with the HERL QA organization, to determine which QA/QC techniques most
appropriately apply to a particular task and to develop task-specific QA
programs. For intramural projects, it is the responsibility of the project
officer to prepare a detailed research protocol including all project-level
QC activities and appropriate reference to the relevant division, branch, or
section QA Project Plan (see Section 3.2). For extramural projects, it is
the responsibility of the project officer to ensure that Agency QA require-
ments are met2-11 and that the awardee develops acceptable QA Program and
Project Plans as required.
To ensure adequate data quality, the project officer must obtain regular
objective evaluation of data quality throughout the project term through
assessment of regular internal QC checks and through external QA activities
(e.g., systems and performance audits). Specific items that must be consid-
ered in data quality assessment are discussed in detail in Section 4.2.
3.4 REFERENCES
1. U.S. Environmental Protection Agency, Health Effects Research Labora-
tory, Research Protocols, Research Triangle Park, NC, Director's Memo-
randum, September 25, 1979.
2. U.S. Environmental Protection Agency, Strategy for Implementation of
the Environmental Protection Agency's Quality Assurance Program: FY 1980
and FY 1981, QAMS-001/80, Office of Research and Development, Washington,
DC, March 1980.
3. U.S. Environmental Protection Agency, Interim Guidelines and Specifica-
tions for Preparing Quality Assurance Project Plans, QAMS-005/80,
Office of Research and Development, Washington, DC, December 1980.
4. U.S. Environmental Protection Agency, Guidelines and Specifications for
Implementing Quality Assurance Requirements for EPA Contracts and Inter-
agency Agreements Involving Environmental Measurements, QAMS-002/80,
Office of Research and Development, Washington, DC, May 1980.
5. U.S. Environmental Protection Agency, Quality Assurance Requirements
for Contracts over $10,000, Acting Director's Memorandum, Quality
Assurance Management Staff, March 29, 1982.
6. U.S. Environmental Protection Agency, Guidelines and Specifications for
Implementing Quality Assurance Requirements for Research Grants Involv-
ing Environmental Measurements, QAMS-003/80/01, Office of Research and
Development, Washington, DC, April 1981.
17
-------
7. U.S. Environmental Protection Agency, Guidelines and Specifications for
Implementing Quality Assurance Requirements for Demonstration Grants and
Cooperative Agreements Involving Environmental Measurements, QAMS-003/
80/02, Office of Research and Development, Washington, DC, April 1981.
8. U.S. Environmental Protection Agency, Guidelines and Specifications for
Implementing Quality Assurance Requirements for State and Local Assist-
ance Grants Involving Environmental Measurements, QAMS-003/80/03,
Office of Research and Development, Washington, DC, April 1981.
9. U.S. Environmental Protection Agency, Quality Assurance Requirements
for Financial Assistance, Acting Director's Memorandum, Quality Assurance
Management Staff, March 30, 1982.
10. U.S. Environmental Protection Agency, Quality Assurance Requirements
for Cooperative Agreements, Acting Director's Memorandum, Quality
Assurance Management Staff, July 16, 1982.
11. U.S. Environmental Protection Agency, Guidelines and Specifications
for Preparing Quality Assurance Program Plans, QAMS-004/80, Office of
Research and Development, Washington, DC, April 1980.
18
-------
SECTION 4
QUALITY ASSURANCE GUIDELINES FOR PROJECT MANAGEMENT
This section presents quality assurance (QA) guidelines for overall
project management, including: (1) specification of QA requirements for
extramural and intramural tasks, (2) preparation of QA Project Plans and
research protocols for intramural research and support tasks, (3) review and
evaluation of QA Program and Project Plans for extramural tasks, and
(4) review and evaluation of data quality throughout the project term.
4.1 QA REQUIREMENTS FOR HERL PROJECTS INVOLVING ENVIRONMENTAL
MEASUREMENTS
By memoranda of May 30, 1979,1 and June 14, 1979,2 the U.S. Environ-
mental Protection Agency (EPA) Administrator established an Agency-wide QA
Program to ensure that all environmentally related measurements funded or
mandated by EPA, including those performed under interagency agreements,
are scientifically valid, defensible, and of estimated precision and accuracy.
The Agency has issued a series of guideline documents and memoranda1-13
detailing specific QA require-
ments for different types of environmental research and measurement projects
conducted by or for the EPA. Key elements of the Agency QA Program include:
1. QA Program Plans—A QA Program Plan is a written document that
describes in general terms the overall policies, organization,
objectives, responsibilities, and procedures designed to achieve
specified data quality goals. All intramural HERL tasks are
covered by the Laboratory QA Program Plan.14 QA Program Plans for
extramural tasks must be prepared by the offeror/awardee in accord-
ance with Agency guidelines.4 Minimum contents of a QA Program
Plan include:
A statement of policy concerning the organization's commitment
to implement a QA/QC program to ensure generation of measure-
ment data of adequate quality to meet the requirements of the
proposed work.
19
-------
An organizational chart showing the position of the QA office
or officer within the organization. It is highly desirable
that the QA organization be independent of the functional
groups that generate measurement data.
A delineation of the authority and responsibilities of the QA
personnel and the related data quality responsibilities of
other functional groups of the organization.
The background and experience of the proposed QA and project
personnel relevant to meeting the QA requirements specified
in the proposed work.
A general description of the proposed approach for accomplish-
ing the QA specifications in the project.
2. QA Project Plans—A QA Project Plan is a written document prepared
by the project management describing the project organization and
responsibilities and specific QA/QC activities that will be imple-
mented to achieve specified data quality goals or requirements.
At a minimum, the following items must be addressed in all required
QA Project Plans:5
Project description
Project organization and responsibility
Facilities, services, equipment, and supplies
QA objectives for measurement data
Sample collection
Sample custody
Calibration
Sample analysis
Recordkeeping/documentation
Data management
Internal QC checks
External QA audits
Preventive maintenance
Specific routine procedures for assessing data quality
20
-------
Feedback and corrective action
QA reports to management
Each of these items is described in detail in Section 4.2. Addi-
tional assistance in addressing specific items required in the QA
Project Plan may be obtained by consulting References 15 through
20, as appropriate.
Given the large number of intramural tasks conducted by HERL
and the similarity of many of these tasks at the division, branch,
section, or group level, present HERL policy requires the prepara-
tion of formal QA Project Plans at these levels only, as appropri-
ate, and not for individual intramural tasks. The intent is to
reduce the increasing burden of paperwork on individual project
officers and to eliminate unnecessary duplication of effort in
meeting QA requirements. This approach also provides a mechanism
for establishing more uniform data quality assessment, feedback
and corrective action, and standard operating procedures (SOPs)
within HERL. It should be emphasized that research protocols,
including description of specific task-level QC activities, are
still required for each intramural task.21,22 QA activities may
be referenced to the appropriate overall QA Project Plan for the
operating unit (division, branch, section, or group), with specific
note of any deviation from this plan.
3. External performance and systems audits—It is the responsibility
of the project officer and/or the QA officer to ensure that appro-
priate QA audit requirements are established for each project
prior to initiation of measurement activities. Intramural and
extramural investigators may be required to participate in per-
formance and systems audits as part of the preaward evaluation
and/or during the period of project performance. Requirements for
performance audits will depend on the availability of reference
materials or devices for the measurements to be made. In general,
performance audits are required where reference samples are avail-
able, unless the offeror has previously met this requirement to
the satisfaction of the project officer and the QA officer. In
the event that reliable reference materials or devices are not
available for the measurements involved, consideration should be
given to the use of common or split samples for cross-comparisons
of results from offerors with those from EPA (a sketch of one
such comparison follows this list).
4. QA Reporting—Provision for adequate QA reporting for each project
must also be made by the project officer and the QA officer.
Projects of short duration (i.e., 1 year or less) may require only
a final QA report. Projects of longer duration may require periodic
(e.g., quarterly) QA reports. These QA reports should be separate
from other required reports and should contain such information
as:
21
-------
Changes in the QA Project Plan
Significant data quality problems, accomplishments, and
status of corrective actions
Results of QA performance audits
Results of QA systems audits
Assessment of data quality in terms of precision, accuracy,
completeness, representativeness, and comparability, when
appropriate
Quality-related training efforts.
Any or all of the above QA items may be required for a given
project.
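Where common or split samples are used for the cross-comparisons described
in item 3 above, the agreement between the two results is often summarized
as a relative percent difference. The sketch below is illustrative only; the
helper name and the values are hypothetical, not drawn from this document.

    # Illustrative sketch only: hypothetical helper name and invented values.
    def relative_percent_difference(x1, x2):
        """Relative percent difference between two split-sample results, a
        common cross-comparison statistic when no reference material exists."""
        return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

    # Offeror laboratory reports 12.4; EPA laboratory reports 11.6 (same units).
    print(f"RPD: {relative_percent_difference(12.4, 11.6):.1f}%")   # RPD: 6.7%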
Many of the Agency QA guideline documents are in draft form and specific
QA requirements are not yet firmly established. Therefore, to assist in QA
planning, implementation, and review for both intramural and extramural
tasks, the QA officer has prepared information packets for HERL project
officers summarizing:
1. Current Agency and Laboratory QA requirements for
Intramural research tasks
Intramural support tasks
Contracts
Interagency agreements
Research grants
Demonstration grants
Cooperative agreements
State and local assistance grants.
2. Specific instructions for ensuring that appropriate QA require-
ments are met.
3. Required QA enclosures for extramural projects.
These information packets are distributed to all HERL project officers and
are maintained under a system of document control so that they can be
22
-------
updated as necessary to reflect changes in the Agency QA Program require-
ments. Project officers should refer to this information and consult with
their QA Committee representative and/or the HERL QA officer to ensure that
appropriate QA requirements are met for specific intramural and extramural
projects.
All QA Project Plans must be reviewed and approved by appropriate HERL
management prior to project funding. For intramural tasks, approval by the
group leader, branch chief, division director, and QA officer of the research
protocol, with reference to the appropriate division, branch, or group QA
Project Plan, is required. Additionally, the project officer must review
and approve all QA Program and Project Plans for extramural projects under
his/her supervision. Periodic assessment of QA Project Plan implementation
by the project officer and the QA officer throughout the project term provides
the means by which management may determine if adequate data quality is
being obtained in a cost-effective manner. Assistance from the QA organiza-
tion is available to project officers and other management personnel for
evaluation of QA Project Plans and the effectiveness of their implementation.
A detailed checklist of specific items that should be considered in all
project-level QA planning or review is given in Figure 4-1. This checklist
has been developed from current Agency guidelines for QA Project Plans5 and
is intended to be all inclusive. The relative impact of each of the check-
list items on overall data quality will vary between tasks; however, no
element should ever be deleted from consideration. For example, the precision
of a measurement process is always estimable by repeating the process under
appropriate conditions. However, for some of the biological or health-
related environmental research involving microorganisms, tissue cultures,
animals, or human subjects, it may be difficult to design accuracy checks
for all aspects of a basic research study. Since much of this work is of a
developmental nature, calibration techniques or standard reference materials
may not be readily available. In other cases, subjective evaluation of
changes in tissue or cellular morphology must be made based on human judg-
ment. In these instances, intensive peer review of research protocols or
work plans may be the best or only means of evaluating research objectives
prior to project initiation, ensuring collection of data of known and adequate
23
-------
QA/QC Items Comments
Identification
( ) Documentation control
( ) Title page with provision for approval signatures
( ) Table of contents
Project Description
( ) Objectives
( ) Hypotheses
( ) Experimental design
( ) Data analysis
( ) Duration
( ) End use of data
Project Organization and Responsibilities
( ) Project line authority
( ) QA authority
( ) Key personnel
( ) Training
( ) Resumes
Facilities and Services
( ) Appropriateness to task requirements
( ) Environmental aspects (e.g., temperature, lighting,
ventilation)
( ) Maintenance (preventive and corrective)
( ) Inspection procedures
( ) Configuration control
( ) Security
( ) Safety provisions
( ) Support services
Equipment
( ) Appropriateness to task requirements
( ) Maintenance (preventive and corrective)
( ) Configuration control
( ) Safety provisions
( ) Recordkeeping (i.e., documentation of calibration and
maintenance history)
Supplies
( ) Certification
( ) Acceptance screening
( ) Animal care and testing procedures
( ) Storage
( ) Stockroom tracking system
Figure 4-1. Checklist for project-level QA/QC planning and evaluation.
24
-------
QA Objectives for Measurement Data Comments
( ) Accuracy
( ) Precision
( ) Completeness
( ) Representativeness
( ) Comparability
Sample Collection
( ) Site selection
( ) Documentation of sampling procedure
( ) Flow charts or diagrams
( ) Sample identification
( ) Sample storage and handling
( ) Standardized data formats and recordkeeping
Sample Custody
( ) Field custody
( ) Laboratory custody
( ) Evidentiary considerations
Calibration
( ) Standards
( ) Procedures
( ) Frequency
( ) Corrective action
Sample Analysis
( ) Criteria for methods selection
( ) Documentation of analytical procedures
Recordkeeping
( ) Documentation control
( ) Standardized data forms
( ) Storage
( ) Security
Data Management
( ) Data collection
( ) Data storage and backup
( ) Data transfers
( ) Data validation
( ) Data reduction
( ) Software QA/QC
( ) Data analysis
( ) Reporting
Figure 4-1. (continued)
25
-------
Internal QC Checks Comments
( ) Systems audits
( ) Performance audits
    replicates
    spiked samples
    split samples
    blanks
    internal standards
    QC samples
    surrogate samples
    calibration standards
( ) Control charts
( ) Reagent checks
External QA Audits
( ) Systems audits
( ) Performance audits
Preventive Maintenance
( ) Procedures
( ) Frequency
( ) Documentation
Specific Routine Procedures for Assessing Data Quality
( ) Equations used to calculate precision and accuracy
Feedback and Corrective Action
( ) Management line authority
( ) Corrective action reporting
QA Reports to Management
( ) Specification of contents
( ) Frequency of reporting
( ) Review and approval authority
Special Requirements
( ) Radiation safety form
( ) Toxic chemical form
( ) Request for animal procurement and care
(V) Satisfactory
(U) Unsatisfactory
(NA) Not Applicable
Figure 4-1. (continued)
26
-------
quality during project performance, and assessing final project data and
conclusions after completion of the project.
Project-level QA programs should also include good laboratory practice
(GLP) concepts where appropriate. The Food and Drug Administration's (FDA)
GLP Regulations18 apply to nonclinical studies performed on animals, plants,
microorganisms, or subparts thereof. The EPA's proposed GLP Standards13
(summarized in Figure 4-2) are intended for use in the development of data
on the health effects of chemical substances and mixtures tested in accordance
with Section 4 of the Toxic Substances Control Act.19 The proposed EPA GLPs
generally apply to all bioassay laboratory health effects studies conducted
by or on behalf of manufacturers of chemical substances.
4.2 ELEMENTS OF A QA PROJECT PLAN
The QA elements described in the following sections must be addressed in
every QA Project Plan. The relative impact of each of these elements on
overall data quality will vary greatly between projects; however, no element
should ever be deleted from consideration.
4.2.1 Identification
4.2.1.1 Document Control--
Each QA Project Plan must be prepared under a system of document control.
One recommended system employs a standardized document control format and
provides for convenient replacement of pages that may need to be changed
within the technical procedures descriptions.20,23 This document control
format requires, in the top right corner of each page, the section number,
revision number, date (of revision), and page number as shown below:
Section No. 2.12
Revision No. 0
Date: September 27, 1977
Page 1 of 5
A digital numbering system identifies sections and subsections within the
text. New subsections should begin on a new page. This format groups
together the pages within a subsection to allow for its revision or expan-
sion without necessitating reissue of the complete document. Each time a
new page is added or expanded within a section, the number of the preceding
27
-------
Summary of Proposed EPA GLPs for Health Effects Research
1. The proposed GLPs apply to studies relating to health and safety evalu-
ations conducted under Section 4 of the Toxic Substances Control Act,
whether conducted by the sponsor or under contract or grant. Fourteen
terms are defined in this section.
2. Test and control substances must be characterized by strength, purity,
composition, and stability before initiation of a study. Their con-
tainers must be labeled by name, chemical abstract number or code num-
ber, batch number (expiration date), and storage conditions require-
ments. Handling procedures must ensure proper identification and mini-
mize contamination, deterioration, or damage. Mixtures must be suitably
analyzed to characterize their uniformity, concentration, and stability;
expiration date is that of the earliest expiring component.
3. An ample number of personnel having adequate and documented education,
training, and/or experience must be available to the study. Their
personal habits, health, and clothing must be appropriate for their
assigned duties. The designated study director ensures that all pro-
visions of the GLPs are fulfilled for the study. The QA unit independ-
ently assures management, at least every 3 months, that the facilities,
equipment, personnel, methods, practices, records, and controls conform
with GLP requirements, in each phase of the study.
4. Facilities must be of suitable size, construction, and location to
facilitate proper conduct of the study. For animal studies, animals
must be properly separated, isolated, and quarantined. Separate areas
are required for:
Biohazardous substances
Diagnosis, treatment, and control of known or suspected laboratory
animal diseases
Sanitary disposal
Feed, bedding, supplies, and equipment
Handling of test and control substances and their mixing
Routine procedures
Administrative and personnel use
Secure archival of raw data and specimens
5. Equipment must be suitably designed and located for operation, inspec-
tion, cleaning, maintenance, and calibration according to written
procedures; written records are kept to document these operations.
Figure 4-2. Summary of EPA's proposed GLPs for health effects research.13
6. Testing facility operations must be governed by written SOPs for (as a minimum):
Animal room preparation
Animal care
Test and control substance handling
Test system observations
Laboratory tests
Handling of moribund/dead animals
Necropsy
Specimen collection and identification
Histopathology
Data handling, storage, and retrieval
Equipment maintenance and calibration
Transfer, placement, and identification of animals
All deviations must be authorized by the study director and documented
in the raw data. Each laboratory must have immediately available,
suitable laboratory manuals and SOPs, both active and historical.
Reagents and solutions must be labeled to indicate identity, concentra-
tion, storage requirements, and expiration date. SOPs for animal care
include housing, feeding, handling, care, receiving, quarantine, health
parameters, and identification. In addition, periodic feed and water
analysis must be documented as part of the raw data; cages and racks
must be cleaned at appropriate intervals. Bedding, cleaning materials,
and pest controls must be documented as noninterfering in the study.
7. Minimum protocol specifications must be given. The conduct of the study
is detailed in terms of the protocol, specimen identity, records, and
data recording.
8. Reserved
9. Reserved.
10. Minimum contents of the final report must be outlined. Archival of all
raw data, protocols, specimens, and final reports is detailed. Indexed,
orderly, and secure storage is required for at least 10 years.
Appendix A
Additional guidelines are given as follows:
Figure 4-2 (continued)
Handling of test substances: DHEW's "Guidelines for the Laboratory Use of
Toxic Substances Posing a Potential Occupational Carcinogenic Risk" and
IARC's "A Manual on the Safety of Handling Carcinogens in the Laboratory."
Handling of radioactive materials: NRC's Title 10 of CFR.
Administrative and personnel facilities: OSHA's Title 29 of CFR.
Animal care and handling: HEW's "Guide for the Care and Use of Laboratory
Animals."
Animal care facilities: HEW's "Guide for the Care and Use of Laboratory
Animals," and 9 CFR Part 3.
Figure 4-2 (continued)
page should be included and a letter added to it. Revisions may also be
promulgated by issue of entirely new documents. In the case of minor revi-
sions, pen-and-ink posting on the original document with the action noted on
the revision notice may be sufficient. The QA officer or QC coordinator
should be responsible for distributing documents and/or revisions and for
obtaining the required signatures.20 A record of all revisions made to a
document should be maintained to derive full benefit of document control
techniques.
In general, a QA program should include a system for updating formal
documentation of all operating procedures. The most important elements of a
QA program to which documentation control is applied include procedures for:
sample collection, storage, and handling; calibration; sample analysis; data
collection and reporting; auditing; data management including data transfers,
data reduction, analysis, and validation (including programmed checks when
data processing is computerized); and preventive maintenance.
To ensure that all changes to project documentation are incorporated
efficiently, maintenance of full control over the distribution of such
documents is essential. A central file should be established within the
organization including such information as: document number, title, branch
originating the document, latest issue date, change number, distribution
list, and signatures of persons acknowledging receipt. Whenever a revision
is made, the group responsible for maintaining the file should issue the
revision, with a revision notice. Obsolete documents (or pages of documents)
should be removed from all files and points of use, returned to the central
file, logged in, and destroyed. The group responsible should have sole
authority to destroy obsolete documents except for one set of originals and
revisions.
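For illustration only, the following Python sketch (not part of the original
guidance; all names are hypothetical) shows one way a central document control
file entry of the kind described above might be represented, together with a
simple routine for recording a revision and producing a revision notice.

# Minimal, hypothetical sketch of a central document control file entry;
# field names follow the text above but are not a prescribed HERL format.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class ControlledDocument:
    document_number: str                 # e.g., a QA Project Plan control number
    title: str
    originating_branch: str
    latest_issue_date: date
    change_number: int = 0               # incremented with each revision notice
    distribution_list: List[str] = field(default_factory=list)
    receipts: List[str] = field(default_factory=list)   # signatures acknowledging receipt

    def issue_revision(self, issue_date: date) -> str:
        """Record a revision and return the text of a revision notice."""
        self.change_number += 1
        self.latest_issue_date = issue_date
        self.receipts.clear()            # recipients must acknowledge the new issue
        return (f"Revision notice: document {self.document_number}, "
                f"change {self.change_number}, issued {issue_date.isoformat()}; "
                f"return obsolete pages to the central file.")

# Example use
plan = ControlledDocument("HERL-QA-001", "QA Project Plan", "Example Branch",
                          date(1982, 6, 1),
                          distribution_list=["Branch Chief", "QA Officer"])
print(plan.issue_revision(date(1983, 1, 15)))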
4.2.1.2 Title Page with Provision for Approval Signatures—
4.2.1.2.1 Intramural. A sample title page for a QA Project Plan for
intramural tasks is shown in Figure 4-3. The title page should include:
Document control number
Specification of the HERL organizational unit(s) covered by the QA
Project Plan (i.e., division, branch, section)
DOCUMENT CONTROL
NUMBER
QUALITY ASSURANCE PROJECT PLAN
Project Organization
Division, Branch, or Section
HERL APPROVALS
BRANCH CHIEF: Date:
DIVISION DIRECTOR: Date:
LABORATORY DIRECTOR: Date:
QA OFFICER: Date:
Figure 4-3. Sample title page for intramural QA Project Plan.
Provision for approval signatures by the branch chief, division
director, Laboratory director, and the HERL QA officer.
Completion of reviews and approvals is shown by signatures on the title
page. Copies of the approved QA Project Plan should be filed with the QA
officer and distributed by the project officer to all task personnel who
have a major responsibility for ensuring the quality of the measurement
data.
4.2.1.2.2 Extramural. A sample title page for a QA Project Plan for
extramural research is shown in Figure 4-4. The title page should include:
Document control number
Name and address of contracting organization conducting project
Project number
Project title
Provision for approval signatures by the extramural organization's
project manager and QA coordinator and by the HERL project officer
and QA officer.
Completion of reviews and approvals is shown by signatures on the title
page. A copy of the approved QA Project Plan should be distributed by the
extramural organization's project manager to all task personnel who have a
major responsibility for the quality of the measurement data. Copies should
also be filed with the HERL project officer and QA officer.
4.2.1.3 Table of Contents--
The table of contents for all QA Project Plans must include all items
shown in Figure 4-5.
A distribution list for all branch, division, section, or group level
HERL QA Project Plans (i.e., those for intramural projects) should include:
division director, branch chief, section chief, group leader, all project
officers within the originating unit, and the QA officer.
The distribution list for all HERL extramural projects should include:
EPA project officer, EPA QA officer, project manager/principal investigator,
QA coordinator, and key project personnel.
DOCUMENT CONTROL
NUMBER
QUALITY ASSURANCE PROJECT PLAN
Name and address of extramural project
organization conducting research
Project Number
Project Title:
EXTRAMURAL ORGANIZATION APPROVALS
PRINCIPAL INVESTIGATOR/
PROJECT MANAGER: Date:
QA COORDINATOR: Date:
HERL APPROVALS
PROJECT OFFICER: Date:
QA OFFICER: Date:
Figure 4-4. Sample title page for extramural QA Project Plan.
DOCUMENT CONTROL
NUMBER
SECTION PAGE
Project Description
Project Organization and Responsibilities
Facilities, Services, Equipment and Supplies
QA Objectives for Measurement Data
Sample Collection
Sample Custody
Calibration
Sample Analysis
Recordkeeping
Data Management
Internal Quality Control Checks
External QA Audits
Preventive Maintenance
Specific Routine Procedures for Assessing Data Quality
Feedback and Corrective Action
Quality Assurance Reports to Management
Appendixes (e.g., Standard Operating Procedures, Resumes)
Distribution List:
Intramural Projects:
Division Director
Branch Chief
Section Chief
All project officers within the originating unit
QA Officer
Extramural Projects:
EPA Project Officer
EPA QA Officer
Project Supervisor (project manager/principal investigator)
Project QA coordinator
Key project personnel
Figure 4-5. Sample table of contents for QA Project Plan.
4.2.2 Project Description
A detailed project description is essential to provide QA reviewers
with sufficient information to evaluate the proposed project and to determine
if QA/QC efforts are properly placed and timed. In any HERL project that
involves data collection and analysis, it is important to consult a statis-
tician during the initial planning phases of the study and throughout data
collection and analysis. An analysis plan, no matter how ingenious, cannot
compensate for poor experimental design.
For HERL division, branch, section, or group QA Project Plans, the
project description should include a summary statement of that unit's over-
all research activities. Individual project or task descriptions for intra-
mural projects should be included in the written protocol required for each
study.21 22 For extramural projects, a detailed project description must be
included in the QA Project Plan, either in full or by reference to the work
plan. Items to be addressed in the project description include:
A statement of objectives and hypotheses to be tested
A description of the experimental design including the variables
to be measured, sample sizes, experimental materials, conditions,
and instruments
An outline of the method of data analysis to be used
Anticipated duration of the project
Intended use(s) of the acquired data
4.2.2.1 Objectives and Hypotheses to be Tested--
A clearly written, concise statement of the research objectives allows
precise formulation of the specific hypotheses to be tested. It is advisable
to rank these objectives. The reference population to which the results are
to apply should be clearly defined.
4.2.2.2 The Experimental Design—
The study design should include a clear definition of all relevant
variables to be considered, the study subjects (e.g., bacteria, animals,
cell culture lines, humans), the number of subjects (sample size), and the
data to be collected and the methods to be used in sampling and analysis.
It is essential that there be a clear understanding between the project
officer and an extramural organization as to the level of accuracy and
precision required for any measurements made. Further, from the outset of a
project it should be clear to all concerned whether the emphasis is on
(1) reproducibility and comparability of measurements, as when data will
come from many laboratories or will be compared to historical data obtained
by specific methodology; (2) meeting regulatory requirements, as when the
data must be able to stand up in court; or (3) methods development to design
a more specific or more accurate measurement. Misunderstandings in these
areas could result in the generation of otherwise valid data that may be
useless in meeting the goals of a particular project.
Measurement processes must be carefully characterized and validated24 25
before being routinely used in any HERL projects. Any hypothesized property
of the measurement system, such as linearity of the relation between measured
and true values, constancy of precision across varied levels or conditions,
or bounds on bias and imprecision, should be confirmed under the conditions
(e.g., locale, personnel, equipment) under which the measurement system will be applied. Although
biases are often difficult to ascertain in health effects research where
reference materials are not available, it is always possible to study
precision.
A well-designed testing program should consider the following:26
1. Are all the relevant intrinsic factors (e.g., age, size, weight,
sex, reproductive condition) and extrinsic environmental factors
(e.g., temperature, duration of exposure, light-dark cycle, chem-
ical form of the pollutant tested, synergistic interactions) being
considered?
2. Are the effects of the relevant variables adequately distinguish-
able from the possible effects of other variables?
3. Has the possibility of interaction between variables been antici-
pated and accounted for?
4. Is the study design as free of bias as possible?
5. Is the study design consistent with the stated objectives? Will
the project yield adequate data (degrees of freedom) to estimate
parameters of interest with a reasonable precision? Are sample
sizes justified on the basis of precision using historic or con-
jectured estimates of variances? (A brief worked sketch of this
calculation follows this list.)
6. Is the study design cost effective? Would a more limited design
provide sufficient data at a lower cost?
7. Does the design make adequate provision for controls (negative,
positive, and solvent comparison groups)?
8. Is the design logistically sound? (Are adequate time, space,
personnel, etc., available to properly perform the checks neces-
sary to ensure the specified data quality?)
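As a worked sketch of the sample-size reasoning raised in question 5 above
(an illustration only, not a prescribed method), the usual normal-approximation
formula n = (z * sigma / E)^2 gives the number of measurements needed so that a
confidence interval for a mean has half-width E, using a historic or conjectured
standard deviation sigma; the function name and example values are hypothetical.

# Illustrative sketch: choose n so a 95% confidence interval for a mean has
# half-width E, given a historic or conjectured standard deviation sigma.
import math

def sample_size_for_mean(sigma: float, half_width: float, z: float = 1.96) -> int:
    # Normal approximation: n = (z * sigma / E)^2, rounded up
    return math.ceil((z * sigma / half_width) ** 2)

# Example: historic standard deviation of 12 units, desired half-width of 5 units
print(sample_size_for_mean(sigma=12.0, half_width=5.0))   # 23 measurements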
4.2.2.3 Data Analysis--
The research protocol should cover in some detail the proposed method
of statistical analysis and the assumed underlying mathematical/probabilistic
model. Quality assurance for data analysis primarily involves assuring the
assumptions and accuracy of computations used to estimate parameters.
4.2.2.4 Anticipated Duration of Project--
The QA Project Plan should contain a statement of the proposed startup
and completion date for the proposed project.
4.2.2.5 Intended Use(s) of Acquired Data—
A brief statement describing the intended use of the data collected in
the project should be provided.
4.2.3 Project Organization and Responsibility
4.2.3.1 Project Line Authority--
For all QA Project Plans, a project organization chart should be pro-
vided clearly identifying key individuals, including the QA officer, who are
responsible for ensuring the collection of data of documented quality. In
all extramural projects, the line authority between HERL and the extramural
organization as well as line authority within the extramural organization,
including the extramural organization's QA coordinator or officer, should be
identified. A sample project organization chart is shown in Figure 4-6.
4.2.3.2 Personnel--
All personnel participating in research-related activities under the
auspices of HERL should possess experience and knowledge adequate to perform
the technical tasks assigned. Personnel qualifications should be reviewed
and evaluated by the project officer and may be reviewed by the QA officer.
[Figure 4-6 is an organization chart: the HERL project officer and EPA QA
officer appear above the contractor's project officer, QA coordinator, QA
staff, data analysis staff, laboratory manager, engineering support, and
exposure facility operators, with associate scientists, engineers, and
laboratory technicians assigned to the individual exposure studies (e.g.,
NO2/O3, furnace metals, volcanic ash, toxics/pesticides, chronic NO2).]
Figure 4-6. Sample project organization chart.27
Professional resumes of key task personnel should be included in the appendix
of QA Project Plans or in the research protocol.
4.2.3.3 Training—
A statement in the QA Project Plan should indicate that the key task
personnel have experience in or have received training in performance of the
specific technical task for which they have responsibility in the project.
Where appropriate, personnel should be expected to participate regularly in
certification programs, including external audit programs for performance
evaluation and/or accredited training courses in their areas of specializa-
tion. All project personnel should keep abreast of current developments in
their fields of expertise. Periodic meetings during project implementation
may be helpful in information exchange and lead to improved quality control.
Bench-level personnel should also be involved in the feedback and
corrective action loop (Section 4.2.16.2). This involvement should begin
early in the project with a briefing on the overall task goals, methods to
be employed, and personnel roles in quality assurance.
4.2.4 Facilities, Services, Equipment, and Supplies
Evaluation of project-specific facilities, support services, equipment,
and supplies is the responsibility of the project officer in cooperation
with the HERL QA organization. The QA officer may, at his/her option or at
the request of the project officer, division director, branch chief, or
section chief, inspect and evaluate or request an audit by qualified person-
nel of facilities, support services, equipment, and supplies used by labora-
tories performing HERL-supported work.
4.2.4.1 Facilities--
All HERL-supported facilities should be capable of producing acceptable
data quality in an efficient, cost-effective manner with minimum risk to
personnel.
The suitability of a facility for the execution of both the technical
and QA aspects of a project may be assessed prior to use through a systems
audit by qualified technical and QA personnel. These audits should deter-
mine if facilities are of adequate size, with satisfactory lighting, venti-
lation, temperature, noise levels, and humidity, and if they are operation-
ally consistent with their designed purpose. Satisfactory personnel safety
and health maintenance features should also be present. HERL requires that
all facilities meet acceptable safety and health standards.28
Facility security should be tailored to the project research needs and
to personnel safety requirements. Security may range from areas available
for common use by nonproject personnel to restricted areas accessible only
to authorized project personnel.
Authorization and documentation of all changes in facility configura-
tion should be limited to a single professional staff member (e.g., project
officer) who is qualified to ensure that necessary modifications will not
jeopardize data quality or personnel health and safety within the facility.
4.2.4.2 Support Services—
The reliability of required support services is of primary importance
in evaluating the project facilities and should be checked by periodic
(regular) performance audits. Numerous measurement processes depend on
routine services (i.e., gases, electricity, heat, steam, or water) and loss
of these may cause significant deterioration of data production or quality.
Therefore, adequate provision for backup support services should exist.
4.2.4.3 Equipment--
All equipment should be evaluated prior to use for its applicability to
the HERL project. The relationships of all measurement methods and the
variables to be monitored should be well characterized and documented before
being approved for use. Similarly, the design and performance of equipment
should be thoroughly evaluated with the aid of a professional who has both a
theoretical and a practical understanding of the specific instrument opera-
tion. In some cases, such as for atmospheric analyzers, comparative studies
of different manufacturers' equipment have been conducted by EPA or its
contractors. These data should be taken into consideration in establishing
precision and accuracy requirements for the project. Definitive statements
about the performance of different manufacturers' equipment cannot be based
reliably on examinations of single pieces from each manufacturer. Accept-
ance testing for new equipment should be performed on an item-by-item basis
and documented for comparison with future testing. All testing programs
should be designed to determine the optimum operating range of the equip-
ment. Equipment performance should be evaluated periodically by systems and
performance audits.
To ensure consistently high data quality in the HERL program, a plan
for routine inspection and preventive maintenance should
be developed and followed for all equipment. Scheduling of a particular
preventive maintenance program should be based on the identification of
critical components that are most likely to fail and the overall effect of
equipment failure on data quality.
All maintenance activities should be performed by suitably qualified
technical personnel using accepted, documented procedures according to the
preventive maintenance plan. The desirability of full- or part-time equipment
operator and/or maintenance support should be considered. Frequently,
sophisticated instrumentation performs poorly or not at all when many
occasional users have access to it. On the other hand, minor but frequent
maintenance often keeps an instrument operating at peak performance. In
such cases, the cost of a full-time, dedicated operator may be justified.
Documentation of all scheduled and unscheduled maintenance is essential
to monitoring and documenting data quality. Permanent records of the main-
tenance histories of all equipment, including detailed descriptions of all
adjustments made, parts replaced, etc., should be kept in individual bound
notebooks, dated, and signed by the proper authority.
4.2.4.4 Supplies--
A well-documented acceptance testing program for all incoming expend-
able supplies should be adhered to. This acceptance screening ensures that
supplies not meeting project specifications are not used. The results of a
successful acceptance test confirm: (1) that the substance fully corresponds
to the manufacturer's specifications and (2) that known or suspected inter-
ferents are absent. Acceptance screening under the HERL QA Program involves
two classes of consumables: chemicals and biological materials.
4.2.4.4.1 Chemicals—The screening of chemicals or reagent commodities
involves verification of assay and examination for impurities. Such screen-
ing should be performed on a batch basis using accepted, documented analyti-
cal methods. For example, it is necessary to characterize all incoming
cylinder gases containing pollutants as to the pollutant concentration and
the composition of the diluent gas(es). Following successful completion of
the acceptance test, an expiration date should be permanently marked on each
container; containers should be stored on a first-in first-out basis.
Permanent labels should be attached to the container with the following
information:
Date received:
Date tested:
Expiration date:
Storage conditions that will protect the integrity of the material and
protect personnel from harmful exposures should be observed. In particular,
parameters such as temperature, light, and humidity should be controlled.
Routine recertification should be performed to characterize changes in con-
centration, formation of new species, or loss of original species to prevent
them from degrading project data quality. When possible, the integrity of
the substance should be checked prior to each use.
A permanent record of all certification procedures, dated and signed by
the appropriate authority, should be kept in a bound laboratory notebook
that is filed with the project officer and is accessible to the QA officer.
A central stockroom tracking system should be designed to facilitate
rapid reference to the identity of other users of a substance. This is
useful for informal sharing of information of interest as well as for rapid
identification of users if specific problems (e.g., degradation or contami-
nation) are detected with a particular substance.
Since many chemicals tested at HERL are potentially hazardous to human
health, project personnel should be protected from exposure at all times.
When chemicals are known to be toxic, mutagenic, carcinogenic, or teratogenic,
the project officer should identify where personnel health and safety problems
may arise during the performance of the project. Task personnel should be
advised of the specific hazards and of proper handling procedures for all
potentially hazardous chemicals.28
4.2.4.4.2 Biological materials—To evaluate the possible health hazards
associated with environmental pollutants, HERL has taken the approach of
using both in vitro and in vivo bioassays in a battery of tests, each test
measuring a different endpoint.29 These tests include in vitro microbial
and tissue culture assays, in vivo whole organism testing, and human studies.
Endpoints measured include mutagenicity, toxicity, carcinogenicity, chromosomal
damage (in both somatic and germ cells), neurological damage, and
related health effects. In vitro tests are cost- and time-effective and can
identify substances that may then be screened further for possible health
effects in more time-consuming and costly in vivo whole animal testing and
human studies. All of these systems present special problems, since biolog-
ical systems inherently possess a high degree of variation. Because of this
inherent variation, quality assurance in this area of research is in the
developmental stage.
4.2.4.4.3 In vitro testing—In vitro bioassay testing is an important
area of health effects research at HERL; however, several problems have
emerged concerning the evaluation of data quality for such testing.30
Estimates of accuracy often cannot be obtained because of the serious lack
of standard reference materials for these test systems. Estimates of
precision, however, can be obtained by taking replicate measurements. A
specific need also exists for the use of standard protocols including guide-
lines for use of controls and number of replicates. Documentation of all
procedures must be an integral part of a bioassay QA program.
In vitro microbial strains and cell culture lines that have been quite
thoroughly characterized are available for research purposes. The American
Type Culture Collection, 12301 Parklawn Drive, Rockville, Maryland 20852,
provides:
Certified animal cell lines (and is a depository for new animal
cell lines)
Animal viruses and antisera
Chlamydiae
Rickettsiae
Certified pathogenic bacteria
In addition, the following cultures may be obtained from specific
research laboratories:
Ames/Salmonella reverse mutation assay
Dr. Bruce Ames
Department of Biochemistry
University of California at Berkeley
Berkeley, CA 94720
E. coli/pol A DNA damage assay
Dr. Herbert Rosenkranz
Department of Microbiology
NY Medical College
Valhalla, NY 10595
Mouse lymphoma mammalian cell culture forward mutation assay
Dr. Donald Clive
Burroughs Wellcome
Research Triangle Park, NC 27709
Chinese Hamster Ovary (CHO) Cells forward mutation assay CHO/HGPRT
Dr. Abe Hsie
Oak Ridge National Laboratory
P.O. Box Y
Oak Ridge, TN 37830
CHO/V79-cell transformation and forward mutation assay
Dr. Elie Huberman
Oak Ridge National Laboratory
P.O. Box Y
Oak Ridge, TN 37830
4.2.4.4.4 In vivo whole animal testing—A large portion of HERL testing
involves experimentation with animal subjects. Inbred animal strains have
been quite well characterized for generations and correlate closely with
certain aspects of human health. For example, C3H/HeJ mice have been selected
for their ability to convert polycyclic aromatic hydrocarbons to their
active carcinogenic form.31
Specific screening procedures for intramural HERL animal studies should
be developed with the assistance of the HERL Laboratory Animal Staff (LAS).
Adherence to accepted animal handling procedures and animal facility accredi-
tation by the American Association for Accreditation of Laboratory Animal
Care (AAALAC) are considered minimum requirements for all HERL animal studies.
Animal selection should be based on awareness of the animal strain's genetical-
ly determined immunities, as well as the specific dose-response relationship
to be investigated. The research protocol should clearly state the basis
for selection of a particular species and strain. Acceptance testing, or
prescreening and surveillance, should be sufficiently comprehensive to
ensure that only suitable animals are used as experimental subjects and
controls. Although the added expense of such testing may limit the quantity
of animals used, the improved data quality will generally more than compensate
for this loss.
Comprehensive HERL guidelines for research involving animals are avail-
able from the HERL Animal Care Coordinator. In addition, the animal care
support facility has a QA Project Plan32 on file with the QA officer. Where
appropriate, this plan should be included by reference in the division,
branch, and section QA Project Plans and in individual research protocols.
4.2.4.4.5 Human studies—Human subject studies may be the most costly,
time consuming, and difficult studies to conduct because of the variability
in the human population and because of health and safety requirements restric-
ting experimental exposure levels. However, they may be the most important
types of experiments to perform when acquired data will be used in regulatory
actions establishing maximum permissible exposure levels for specific pollut-
ants potentially harmful to human health. Variability among human subjects
can be minimized, but not eliminated, by careful pretest screening to deter-
mine medical history, work history, personal habits, and present health
status. Project officers supervising research involving human subjects must
be careful to comply with all existing regulations and guidelines for the
protection of human subjects.33
4.2.5 QA Objectives for Measurement Data
Agency policy clearly mandates that all environmentally related data
produced by or for the Agency be of estimated quality and emphasizes the im-
portance of employing uniform data quality terminology throughout the Agency.
A primary objective of this policy is the establishment of appropriate data
quality acceptance criteria for Agency environmental measurement programs.
Acceptance criteria are most reasonably derived from considerations of a
measurement system's capabilities and/or from the end uses of the data, or
from study objectives; they are usually defined in terms of bias, precision,
and completeness and may also involve representativeness and comparability.
In addition, resource limitations must often be factored into decisions
regarding data quality acceptance criteria.
For each major measurement method employed in a project, the QA objec-
tives for precision, bias, and completeness, as well as the experimental
conditions and reference for the method, should be summarized in the QA
Project Plan as shown in Table 4-1. Obtaining accuracy estimates for
research or state-of-the-art measurement methods may prove difficult or
impossible depending on the availability of appropriate reference materials
or other accepted methods of making the same measurements. Research
projects employing such measurement methods must rely more on peer review
and on thorough documentation of procedures as a means of ensuring acceptable
data quality. In these cases, it is sufficient to note on the summary
table (Table 4-1) that the item is not applicable and explain briefly.
Some of the more important issues that should be considered in assessing
data quality parameters and acceptance criteria are summarized below.
Accuracy, bias and precision in the one-sample case
There are typically many sources of variation or error in a meas-
urement process. In the fashion of analysis of variance, the
total sum of squares of errors can usually be decomposed into
constituent sums of squares representing individual sources of
variation. Most numeric measures of data quality relate to some
variance component.
The most frequently used measures of data quality are bias, preci-
sion, and accuracy. Bias is systematic error; precision measures
the closeness of data values to each other; accuracy measures
closeness of the experimental values to the target value. Accuracy
and precision may be restated in terms of bias and imprecision.
More specifically, let y1, y2, . . . , yn be a sample of measure-
ments aimed at estimating the target value t, and let ȳ denote the
sample average (y1 + . . . + yn)/n. Then
(1) Average bias = ȳ - t,
(2) Imprecision = sample standard deviation s
                = sqrt[ (1/(n-1)) Σ (yi - ȳ)² ],  summing over i = 1, . . . , n,
and
TABLE 4-1. FORMAT FOR SUMMARIZING PRECISION, ACCURACY, AND COMPLETENESS OBJECTIVES

Measurement parameter   Reference          Experimental           Imprecision(a)   Bias(a)         Completeness
(method)                                   conditions             (Std. Dev.)

Examples:

NO2                     EPA 650/4-75-011   Atmospheric samples    <±10% relative   ±5% relative    90%
(Chemiluminescent)      February 1975      spiked with NO2 as
                                           needed

SO2 (24 h)              EPA 650/4-74-027   Synthetic atmosphere   <±20% relative   ±15% relative   90%
(Pararosaniline)        December 1973

(a) Imprecision and bias should be clearly defined, including formulas.
(3) Mean squared error = (1/n) Σ (yi - t)²,  summing over i = 1, . . . , n.
Note that
(4) Mean squared error = ((n-1)/n) imprecision² + bias² .
For large samples,
(5) Mean squared error ≈ imprecision² + bias² .
The definitions of bias and imprecision are extended to regression
situations in the discussion of calibration (Section 4.2.8.3).
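For illustration, the following short Python sketch (a hypothetical helper,
not part of these guidelines) computes the one-sample quantities defined in
equations (1) through (5) from a set of measurements and a target value.

# Illustrative computation of average bias, imprecision, and mean squared error
# for a single sample of measurements aimed at a known target value t.
import statistics

def one_sample_quality(measurements, target):
    n = len(measurements)
    mean = statistics.fmean(measurements)
    bias = mean - target                                    # equation (1)
    imprecision = statistics.stdev(measurements)            # equation (2), n-1 denominator
    mse = sum((y - target) ** 2 for y in measurements) / n  # equation (3)
    # Check on equation (4): mse == ((n - 1) / n) * imprecision**2 + bias**2
    return bias, imprecision, mse

# Example: five readings aimed at a target value of 100
bias, s, mse = one_sample_quality([98.2, 101.5, 99.8, 100.9, 97.6], target=100.0)
print(round(bias, 3), round(s, 3), round(mse, 3))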
Interlaboratory variation (reproducibility) and intralaboratory
variation (repeatability, replicability)
As noted above, there are typically numerous sources of variation
in a measurement (sampling/analysis) process. Interlaboratory
variation (imprecision) is often called reproducibility. This is
the variation (standard deviation) among results of measurements
on the same sample by different laboratories and, presumably,
different analysts.
Two types of intralaboratory variation are often distinguished.
Repeatability reflects the variation (standard deviation) among
results of measurements of the same sample at different times in
the same laboratory. Replicability expresses the variation (stand-
ard deviation) among results of independent analyses of the same
sample by the same analyst at essentially the same time under the
same conditions. Note that σ(reproducibility) > σ(repeatability)
> σ(replicability).
It is possible that a precision measure depends on the true level
or some proxy for it, such as average apparent level.
In setting criteria for judging performance, such as warning
limits on control charts (see Section 4.2.12.1.9), the proper
measure of precision must be used. Otherwise, the criteria may be
too restrictive, resulting in excessive data invalidation, or not
restrictive enough, allowing data of questionable quality to pass
through undetected.
Criteria for results from an intralaboratory study involving
several analysts and sets of equipment would appropriately be
based on the repeatability measure of precision. Likewise, an
interlaboratory study should be evaluated against criteria based
on the reproducibility measure of imprecision.
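The distinction above can be illustrated with a small sketch (assuming a
balanced collaborative study and the usual one-way random-effects
decomposition; the function name and data are hypothetical) that separates
within-laboratory (repeatability) and between-laboratory (reproducibility)
imprecision.

# Illustrative separation of repeatability and reproducibility standard
# deviations from a balanced study: k laboratories, m replicate results each.
import statistics, math

def repeatability_reproducibility(results_by_lab):
    """results_by_lab: list of equal-length lists, one list of replicates per lab."""
    m = len(results_by_lab[0])
    lab_means = [statistics.fmean(lab) for lab in results_by_lab]
    ms_within = statistics.fmean([statistics.variance(lab) for lab in results_by_lab])
    ms_between = m * statistics.variance(lab_means)
    s_r = math.sqrt(ms_within)                                  # repeatability std. dev.
    between_component = max((ms_between - ms_within) / m, 0.0)  # truncated at zero
    s_R = math.sqrt(between_component + ms_within)              # reproducibility std. dev.
    return s_r, s_R

# Example: three laboratories, three replicates each
labs = [[10.1, 10.3, 10.2], [10.8, 10.9, 10.7], [9.9, 10.0, 10.1]]
print(repeatability_reproducibility(labs))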
Completeness
Completeness refers to the percentage of planned measurements that
are actually taken and appear in the data base. This is related
to the statistical problem of missing data. Many data bases
contain a vector of variables for each observation, creating the
possibility of partially missing data for some observations. For
such data sets, a thorough treatment of completeness issues leads
to investigations of patterns related to missing data, e.g.,
identification of population subsets inclined to have missing
data, and search for patterns or associations between missing data
on two or more variables.
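As an illustration only (the data and variable names are hypothetical),
completeness can be tallied overall and by variable for records that are only
partially complete, which also exposes patterns of missing data.

# Illustrative completeness tally: None marks a planned measurement that is missing.
planned = [
    {"subject": 1, "NO2": 0.04, "SO2": 0.02},
    {"subject": 2, "NO2": None, "SO2": 0.03},
    {"subject": 3, "NO2": 0.05, "SO2": None},
]
variables = ["NO2", "SO2"]

total = len(planned) * len(variables)
present = sum(rec[v] is not None for rec in planned for v in variables)
print(f"overall completeness: {100.0 * present / total:.1f}%")

for v in variables:
    got = sum(rec[v] is not None for rec in planned)
    print(f"{v}: {100.0 * got / len(planned):.1f}% complete")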
Representativeness
Measurement processes often involve distinct sampling and analysis
phases. Each phase makes its own contribution to bias and impre-
cision. The term representative is reserved for sampling proce-
dures. The representativeness of a sampling procedure may be
gauged in terms of the bias and imprecision attributable to the
sampling procedure relative to the total population. As with any
attempt to measure data quality, it is necessary to first define
the reference or target population unambiguously.
The question of representativeness is as important for basic
health effects research activities as for routine monitoring
programs; however, because of the inherent variability of biolog-
ical systems it often presents special problems. Establishing
acceptance criteria for representativeness in health effects
research areas will, therefore, require close cooperation between
QA and technical personnel on a project-by-project basis.
Comparability
To a great extent, the issue of comparability is one of correlation
between the results of applying two (or more) different methods to
the same materials. If the paired results cluster tightly about
some monotonic smooth curve, then the information content of the
two methods is similar; either measurement can essentially be
obtained from the other.
If known reference materials are available, then both methods can
be applied to them and compared with respect to biases, as well as
precisions, in predicting true values from measured values. It is
actually possible to estimate the ratio of these precisions (each
precision being in units of the "correct value") without recourse
to reference standards and the individual calibration curves for
the two methods. This can be done by applying both measurement
methods to the same set of unknown materials, replicating each
measurement, and using the proper statistical analysis.
An issue of particular importance to health effects research that
is raised by comparability is that of the use of concurrent con-
trols. Controls do not necessarily legitimize interstudy compari-
sons. However, the comparison of two studies is much more credible
if both studies include control subjects. The comparison of the
studies can then be attempted relative to their controls; or, if
controls respond sufficiently similarly, then some justification
of direct comparison of experimental subjects is obtained. Even
with controls, rather strong assumptions are required for compar-
ison of studies conducted at different institutions.
4.2.6 Sample Collection
Collection of a representative sample is of paramount importance to any
measurement process. No amount of analytical expertise or data manipulation
will result in meaningful project conclusions if the samples being analyzed
are not representative of the parent material or population under study.
Collecting a suitably representative sample may be the most technically
difficult, hazardous, and time-consuming part of the measurement task, and
the one that is most often overlooked. The assumption that a sample retains
the desired property of the system from which it was extracted deserves
close examination. Sampling may range from collecting a representative
volume of aerosol from an exposure chamber to obtaining tissue samples from
rats or biological samples from human subjects. In some cases, additional
environmental data may be required (e.g., temperature, pressure, time of
day) to document the collection procedure. In many health effects research
systems, separate control samples must also be obtained.
The overall objectives of a research project affect many aspects of
sampling. Factors that must be considered in meeting project objectives
include:
Resource limitations
Complexity of measuring the parameters of interest
Duration of the study
End use of the data
Number of samples to be collected
Sampling frequency
Type of samples
For each major measurement parameter being examined in a research
project, a detailed description of sampling procedures should be provided
and, where applicable, all of the topics in the following subsections should
be addressed. In addition, responsibility for all sampling should be clearly
delineated and approved by the project officer.
4.2.6.1 Sample Site Selection--
The selection of the sampling site is critical to obtaining representa-
tive data, whether the project involves environmental monitoring of air or
water pollution parameters at specific sites or the monitoring of physiolog-
ical or morphological changes in a specific target organ or tissue sample
from an animal exposed to an environmental contaminant. A description of
the criteria used to select the sampling sites should be given for each
project.
4.2.6.2 Sampling Procedures--
Various EPA-approved or other standard sampling methods are available
for a wide variety of sampling needs; these should be reviewed at the outset
of any project to ascertain if they can be used directly or with modifica-
tions. Use of EPA-approved methods or other standard sampling methods is
required where applicable. Where existing EPA-approved or standard method-
ology is not applicable to a particular project, sampling procedures may be
obtained from the scientific literature, or new methods may have to be
developed. In all cases, the sampling procedures used should be written as
standard operating procedures (SOPs) and included in full or by reference in
the QA Project Plan and research protocol. All SOPs should be written in
enough detail so that the sampling procedures can be duplicated by another
technically qualified individual. In cases where modifications of SOPs are
employed, the modifications may be documented either by reference to the SOP
with appropriate modifications noted or by issuance of a new SOP with mod-
ifications incorporated. The choice should be dictated by the extent to
which the method will be used.
After a sampling procedure has been selected and suitably modified for
a particular research project, additional details related to the sampling
program for the project need to be identified. These include sampling
frequency, number of samples, size of samples, and specific timing of sample
collection with respect to natural circadian, annual, or seasonal cycles.
4.2.6.3 Sampling Flow Diagrams, Charts, or Tables--
Since sampling often involves a specific sequence of activities, flow
diagrams illustrating sampling sites, key personnel, sampling methods, dates
of collection, maximum sample holding times, and other pertinent information
may be useful (Figure 4-7).34 Charts identifying site locations may also
provide easily accessible information and tables may provide a complete
listing of reagents, collection containers, or accessory supplies that the
sampling effort will utilize.
4.2.6.4 Sample Storage and Handling--
If analysis cannot be performed during or immediately after collection,
the sample must be handled and stored in an appropriate manner to maintain
its integrity. Handling and storage procedures assume varying importance
depending on the nature of the sample; however, in many cases there is a
finite residence time that can reasonably ensure that the sample is not
modified (or at least modified in a known way, for which correction can be
made). Proper storage conditions for all samples must be provided before,
during, and after analysis.
All procedures employed for ensuring the integrity of the sample should
be completely documented, and criteria for rejecting inadequate, inappropriate,
or degraded samples should be clearly defined. Key project personnel respon-
sible for ensuring, evaluating, and documenting sample integrity should be
clearly identified in relation to overall project management and to sample
custody authority.
The following factors must be considered in preventing degradation,
contamination, or loss of a sample.
4.2.6.4.1 Container material—The suitability of the selected container
material for maintaining sample integrity may be ascertained by using blanks
or known concentration samples under worst-case storage conditions. Once a
suitable container material has been identified, the manufacturer, item, and
lot number should be documented and the container material should be used
for all samples for the duration of the study. Where samples must be shipped
[Figure 4-7 is a flow diagram tracing a sample collected for biological
analysis (gases and suspended particulate matter, extracted with methylene
chloride) through the bioassays applied to each fraction: microbial
mutagenesis, cytotoxicity, rodent acute toxicity, and algal, fish, and
invertebrate bioassays.]
Figure 4-7. Sample flow diagram for sample collection and analysis.34
or transported, container material should also be of a quality to prevent
breakage during shipment.
4.2.6.4.2 Sample identification--All samples should be uniquely identi-
fied with individual labels for each sample container. The label information
should be adequate for tracing an archived sample back to its original sam-
pling site without ambiguity. Improper identification of samples will
result in irretrievable loss of data from the project.
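For illustration only, one hypothetical labeling scheme (not a prescribed
HERL format) builds a unique identifier from the project, sampling site,
collection date, and sequence number so that an archived sample can be traced
back to its origin without ambiguity.

# Hypothetical sample-label sketch; the fields and format are illustrative.
from datetime import date

def sample_label(project: str, site: str, collected: date, sequence: int) -> str:
    return f"{project}-{site}-{collected:%Y%m%d}-{sequence:04d}"

# Example: third sample collected at exposure chamber "CH2" on 15 June 1982
print(sample_label("HERL81-123", "CH2", date(1982, 6, 15), 3))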
4.2.6.4.3 Environmental conditions--The environment in which a sample
is stored, transported, or archived is of primary concern in maintaining
sample integrity. Proper environmental conditions (e.g., temperature, pres-
sure, light level, relative humidity, and agitation) should be determined
and specified for each sample type in every measurement project.
4.2.6.4.4 Reagents—When chemical reagents are added to a sample as
part of the collection, preservation, or storage procedures, checks on
reagent quality should be performed and documented to identify possible
contamination (see Sections 4.2.4.4 and 4.2.12).
4.2.6.5 Recordkeeping--
Where appropriate, standard format data sheets for sample collection
should be designed at the outset of a project and used for the duration of
the project. These data sheets should provide space for a unique sample
identification number, date of collection, collector's signature, project
number, and comments about sampling conditions related to sample quality.
When bound, these sheets may serve as the sampling logbook. When use of
standard format data sheets is not appropriate, all sampling data should be
recorded chronologically in a clear and complete manner in a single logbook.
In all cases, accurate recordkeeping should be emphasized for all project
personnel. For a complete discussion of recordkeeping, refer to Section
4.2.10.
4.2.7 Sample Custody
All projects involving health effects research should document and
implement a chain of possession and custody of any sample collected, whether
or not the resulting data are to be used in enforcement cases. Such proce-
dures ensure that the samples are collected, transferred, stored, analyzed,
and disposed of only by authorized personnel.
4.2.7.1 Field Custody Procedures--
The following sample custody procedures are specifically applicable to
large-scale monitoring programs involving shipment of samples from the sam-
pling site to the analysis laboratory.35 They are intended to be comprehen-
sive and may be used as guidelines for all measurement or research programs.
Depending on the specific scope and nature of a project, the project officer
should tailor the sample custody procedure to individual project requirements.
The most important concern is that the sample custody procedure be properly
documented and adhered to for the duration of the task.
1. Samples must be accompanied by a chain-of-custody record that
includes the project title, collectors' signatures, collection
site, date, time, type of sample, sequence number, number of
containers, and analyses required. (An example of a chain-of-
custody record is shown in Figure 4-8; a brief sketch of such a
record also follows this list. Note: Standardized chain-of-custody
formats should be tailored specifically to each project and used
consistently for the duration of the project.)
When transferring possession of samples, the transferor and trans-
feree sign, date, and time the record sheet. This record sheet
allows transfer of custody of a group of samples from a collection
area to the central analysis laboratory. When a custodian trans-
fers a portion of the samples identified on the sheet to the
laboratory, the individual samples must be noted in the column
with the signature of the person relinquishing the samples. The
laboratory person receiving the samples acknowledges receipt by
signing in the appropriate column.
2. The collector custodian has the responsibility of packaging and
dispatching samples to the laboratory for analysis. The dispatch
portion of the chain-of-custody record must be filled out, dated,
and signed.
3. To avoid breakage, samples should be carefully packed in shipping
containers such as ice chests. The shipping containers are pad-
locked for shipment to the receiving laboratory. Special shipping
precautions are necessary for toxic or hazardous materials and
must conform to Federal regulations.36"39
4. Packages must be accompanied by the chain-of-custody record showing
identification of the contents. The original record must accompany
the shipment. A completed copy is retained by the project officer
after completion of the analysis.
[Figure 4-8 is a sample chain-of-custody record form with entries for
station number and location, date, time, samplers' signatures, sample type
(water composite or grab, air), sequence number, number of containers,
analyses required, and method of shipment, plus signature, date, and time
blocks for each person relinquishing and receiving the samples. The original
accompanies the shipment; a copy goes to the survey coordinator's field
files.]
Figure 4-8. Sample chain-of-custody record.35
5. If sent by mail, register the package with return receipt re-
quested. If sent by common carrier, a Government bill of lading
should be obtained. Receipts from post offices and bills of
lading should be retained as part of the permanent chain-of-
custody documentation.
6. If delivered to the laboratory when appropriate personnel are not
there to receive them, the samples must be locked in a designated
area within the laboratory so that no one can tamper with them, or
they must be placed in a secure area. The recipient must return
to the laboratory, unlock the samples, and deliver them to the
appropriate custodian.
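For illustration, the chain-of-custody record described in item 1 above might
be represented as follows (a hypothetical sketch; the field names follow
Figure 4-8 but are not a prescribed format).

# Hypothetical chain-of-custody record with a signed entry for each hand-off.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class CustodyTransfer:
    relinquished_by: str
    received_by: str
    when: datetime

@dataclass
class ChainOfCustodyRecord:
    project_title: str
    collectors: List[str]
    collection_site: str
    collected_at: datetime
    sample_type: str                 # e.g., water composite, water grab, air
    sequence_number: int
    containers: int
    analyses_required: List[str]
    transfers: List[CustodyTransfer] = field(default_factory=list)

    def transfer(self, relinquished_by: str, received_by: str) -> None:
        """Both parties sign, date, and time the record at each hand-off."""
        self.transfers.append(
            CustodyTransfer(relinquished_by, received_by, datetime.now()))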
4.2.7.2 Laboratory Custody Procedures--
Suitable laboratory sample custody procedures include the following:35
1. The laboratory should designate a sample custodian and an alter-
nate custodian to act in his/her absence. In addition, the lab-
oratory should set aside a sample storage security area. This
should be a clean, dry, isolated room with sufficient refrigerator
space that can be securely locked from the outside.
2. Samples should be handled by the minimum possible number of persons.
3. Incoming samples should be received only by the custodian, who
will indicate receipt by signing the chain-of-custody record sheet
accompanying the samples and retaining the sheet as a permanent
record. Couriers picking up samples at the airport or post office
shall sign jointly with the laboratory custodian.
4. Immediately upon receipt, the custodian places the samples in the
sample room, which will be locked at all times except when samples
are removed or replaced by the custodian. Only the custodian
should have access to the sample storage room.
5. The custodian should ensure that all samples are properly stored
and maintained under appropriate environmental conditions (i.e.,
temperature, humidity, light intensity).
6. Only the custodian should distribute samples to task personnel who
are to perform analyses.
7. In the laboratory notebook or analytical worksheet, the analyst
records information describing the sample, the procedures per-
formed, and the results of the analysis. The notes should be
dated, should indicate who performed the tests, and should include
any abnormalities that occurred during the testing procedure. The
notes should be retained as a permanent record in the laboratory.
In the event that the person who performed the tests is not avail-
able as a witness at the time of a trial, the Government may be
able to introduce the notes in evidence under the Federal Business
Records Act.
8. Approved methods of laboratory analyses should be used and docu-
mented on all samples.
9. Laboratory personnel are responsible for the care and custody of a
sample once it is handed to them and should be prepared to testify
that the sample was in their possession and view or secured in the
laboratory at all times from the moment it was received from the
custodian until the analyses were completed.
10. The laboratory area should be maintained as a secured area and
should be restricted to use by authorized personnel only.
11. Once the sample analyses are completed, the unused portion of the
sample, together with identifying labels and other documentation,
should be returned to the custodian. The returned, tagged sample
should be retained in the custody room until permission to destroy
the sample is received by the custodian.
12. Samples should be destroyed only upon the order of the project
officer in consultation with the QA officer and only if it is
certain that the sample is no longer required. The same destruc-
tion procedure is true for tags and laboratory records.
4.2.7.3 Evidentiary Considerations—
As accurate and reliable environmentally related measurements become
increasingly important in documentation of environmental conditions of
public health concern, organizations collecting these data must address
evidentiary considerations.
Recording all sample custody procedures and promulgated analytical
procedures in writing will facilitate the admission of evidence under Rule
803(6) of the Federal Rules of Evidence (Public Law 93-575). Under this
statute, written records of regularly conducted business activities may be
introduced into evidence as an exception to the hearsay rule without the
testimony of the person(s) who made the record. Although it would be pref-
erable, it is not always possible for the individuals who collected, kept,
and analyzed samples to testify in court. In addition, if the opposing
party does not intend to contest the integrity of the sample or testing
evidence, admission under Rule 803(6) can save a great deal of trial time.
For these reasons, it is important that the procedures followed in the
collection and analyses of evidentiary samples be standardized, formally
documented as written SOPs, and described in an instruction manual, which,
if need be, can be offered as evidence of the regularly conducted business
activity followed by the laboratory or office in generating any given record.35
4.2.8 Calibration
Calibration is the process of establishing the relationship between the
output of a measurement system and that of a known input; it allows different
instruments to be correlated with each other and with a specified reference
standard.20 Calibration is an integral part of any measurement process and
is a major factor in controlling the accuracy of results. Since the reported
accuracy of the measurement method can be no better than the accuracy of the
calibration system, calibration is also a limiting factor.
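For illustration only (assuming a linear instrument response; the data and
names are hypothetical), the following sketch constructs a calibration curve
from reference standards by least squares and inverts it to convert readings
on unknowns back to concentration units.

# Illustrative linear calibration sketch (requires Python 3.10+ for
# statistics.linear_regression); values are made up for the example.
import statistics

standards = [0.0, 0.1, 0.2, 0.4, 0.8]          # known concentrations (e.g., ppm)
responses = [0.02, 0.13, 0.22, 0.41, 0.83]     # measured instrument output

slope, intercept = statistics.linear_regression(standards, responses)

def corrected_concentration(reading: float) -> float:
    """Invert the fitted calibration line: response = intercept + slope * concentration."""
    return (reading - intercept) / slope

# Estimated concentration for an instrument reading of 0.50
print(round(corrected_concentration(0.50), 3))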
A sound calibration system should include provisions for:
Selection of the highest quality calibration standard.
Detailed documentation of calibration procedures as written SOPs
including specifications for reagents, materials, support equipment,
and pertinent environmental conditions.
Construction of a calibration curve or a corrective table to
determine appropriate correction factors.
Maintenance of a record of calibration histories for instruments,
support equipment, and standards including identification of
instruments and standards, dates of calibration, and calibration
results.
Determination of calibration frequency needed to ensure quality
data collection.
Identification of acceptance limits in terms of bias and impreci-
sion, and specification of corrective action to be taken when
limits are exceeded.
4.2.8.1 Calibration Standards--
Calibration standards should be of the highest quality available and
fully characterized. In the United States, the National Bureau of Standards
(NBS) holds the position of final authority in the preparation of many
reference materials and the NBS Standard Reference Materials (NBS-SRMs) are
generally regarded as the best standards of each type available.
Investigators should check on the availability of NBS-SRMs applicable
to their measurement needs. The NBS has been rapidly developing suitable
standard reference materials for environmentally related measurements. The
NBS provides information on available standards in its regularly revised
Special Publication 260,40 regular publicity releases, and in a special
mailing list for newly issued NBS-SRMs. In addition, a new monthly column
in American Laboratory entitled "Reference Materials," edited by the Deputy
Chief of the Office of Standard Reference Materials, is an excellent source
of current information on NBS-SRMs. Considering the rapidity with which
NBS-SRMs are being developed and the pressing need to compare data to stand-
ards of known high quality, this column should be reviewed regularly by each
investigator. Examples of currently available NBS-SRMs that may be applic-
able to environmental measurement systems include:
Hydrocarbon Blends
Primary Working and Secondary Standard Chemicals
Microchemical Standards
Metallo-Organic Compounds
Isotopic Reference Standards
Radioactivity Standards
Industrial Hygiene Standards
Trace Element Standards
Clinical Laboratory Standards
Environmental Standards
Biological Standards
Certified Physical Properties Standards.
Selected examples of some of these NBS-SRMs are shown in Appendix A.
Use of NBS-SRMs completely fulfills the requirement of high quality and
full characterization. However, because NBS-SRMs are not mass produced and
are individually characterized by lot, they are expensive and often in short
supply. Therefore, it is generally desirable to employ secondary standards
as the actual calibration standards, maintaining an NBS-SRM as a final high
quality calibration standard for the secondary working calibration standard.
Whenever a secondary standard is employed in a calibration, it is necessary
that traceability to a quality primary standard be established and maintained.
EPA regulations now require traceability of calibration standards to NBS-SRMs
where possible,41 42 and it is likely that this requirement will appear in
all future regulations.
Unfortunately, for many common measurement processes routinely used at
HERL, there are no NBS-SRMs available. In these cases, the investigator
must use the best available calibration standard or devise a standard. Such
standards must also meet the requirements of high quality and complete
characterization applicable to NBS-SRMs. Careful characterization of such
standards involves rigorous testing to establish the true value of the
reference material within stated limits of imprecision. Such rigorous
testing may include repeated analysis of the standard material by more than
one analyst or technique or round-robin interlaboratory analyses.
4.2.8.2 Calibration Procedures--
Written procedures describing each step in the calibration process are
required and should be prepared as SOPs under documentation control. Cali-
bration procedures may be prepared in-house by qualified personnel, derived
from instrument or process manufacturer's instructions, or obtained from
sources such as the American Society for Testing and Materials43 or the
NBS.44 Only the most current and acceptable procedures available for the
specific calibration should be used. In addition, only personnel familiar
with the measurement process and the calibration procedure should perform
the calibration.
An aspect of the calibration operation that is often overlooked is the
calibration or certification of reagents, materials, and support equipment.
Most calibration procedures use equipment and/or reagents in addition to the
standard(s). All such reagents, materials, and support equipment should be
subjected to recent calibration or certification prior to use in the stand-
ard calibration procedure. Even for authoritative standards, such as NBS-
SRMs, sample integrity may be questionable if proper storage and handling
procedures are not observed.
Maintenance of environmental conditions should be appropriate to the
specific calibration measurement being conducted. Proper environmental con-
ditions must be maintained and documented during the entire calibration pro-
cedure. The use and handling of the calibration standard should be of par-
ticular concern as potential problems in calibration may arise with mistreat-
ment of otherwise valid calibration standards. Problems associated with the
use of some common standards include:
1. Permeation devices should be used and stored under carefully
specified environmental conditions of humidity45 and temperature46
and should be protected from possible environmental con-
taminants.45
2. Certain gases in pressurized cylinders require special procedures
for routine installation to prevent cylinder and regulator contam-
ination with atmospheric oxygen or moisture.
3. Electronic standards frequently require periods of several hours
for stabilization of output.
4. Most solid standards require conditioning at a specified humidity
prior to weighing.
These examples illustrate that users of standards should be familiar
with specified environmental conditions pertinent to handling of each stand-
ard. It is imperative that this be recognized if high quality data are to
be obtained from the measurement process.
4.2.8.3 Analysis of Calibration Data--
For analysis of calibration data, as for any type of data analysis, it
is important to completely specify the assumed underlying model and to
perform QC (goodness-of-fit) checks of model validity. A detailed discus-
sion of issues related to analysis of calibration data may be found in
Reference 47.
The equation
y = f(x) + e(x)
is often employed for calibration data, where x denotes reference value, y
denotes measured value, f(x) is the systematic or bias component of the
measurement, and e(x) is a random error term with mean (expectation) zero.
The systematic (bias) term f(x) is often assumed to be linear:
y = (ax + b) + e(x),
where a is the slope and b is the intercept of the line.
A standard regression assumption is that the error variance σ²(x) of
e(x) at reference value x is constant, σ²(x) = C. This assumption, made
primarily for reasons of mathematical convenience, is often violated in
practice. Chemical analyses, for instance, often contain an error component
proportional to the level of analyte being measured, σ(x) = Ax. In some
cases, the error variance has two components, one affected by the analyte
level, the other constant and present at all levels, operating as additive
noise, σ²(x) = Ax² + C.
Coefficient estimates a and b obtained under the equal variance assump-
tion are unbiased ("correct in the long run"; correct for an infinite sample)
as long as the systematic component is correctly modeled (ax + b). This is
the case even if the equal variance assumption fails. On the other hand,
coefficients based on the equal variance assumption may be imprecise in the
presence of unequal variances. Appreciable gains can result from appropri-
ately modeling variances as functions of targets and using the inverse
estimated variances as weights in the regression scheme.48 However, Hunter47
warns that "as a practical statistical problem, rejecting the hypothesis
that σ² is constant is not an easy matter; many replicate observations are
required and assumptions concerning the distribution of the errors of obser-
vation can be crucial."
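As an illustration of the weighting idea discussed above, the following minimal
sketch (written in Python; the data values and the proportional-error constant A
are hypothetical, not drawn from any HERL method) fits a straight-line calibration
by weighted least squares, using the inverse estimated variances as weights, and
shows the ordinary equal-variance fit for comparison.

    import numpy as np

    # Hypothetical calibration data: reference values x and measured responses y.
    x = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
    y = np.array([0.52, 1.03, 1.98, 4.10, 7.85])

    # Assume, for illustration, that the error standard deviation is roughly
    # proportional to level, sigma(x) = A*x, so the weights are the inverse
    # estimated variances, w = 1/sigma^2(x).
    A = 0.02
    w = 1.0 / (A * x) ** 2

    # Weighted least-squares estimates of slope a and intercept b in y = ax + b.
    W = np.diag(w)
    X = np.column_stack([x, np.ones_like(x)])
    a_wls, b_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

    # Ordinary (equal-variance) fit for comparison.
    a_ols, b_ols = np.polyfit(x, y, 1)

    print(f"weighted:   slope = {a_wls:.4f}, intercept = {b_wls:+.4f}")
    print(f"unweighted: slope = {a_ols:.4f}, intercept = {b_ols:+.4f}")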
Ideally, before the process is put into routine use, an in-depth char-
acterization study of the measurement process is conducted to yield enough
data for reasonably powerful goodness-of-fit checks of the model. These
checks can focus separately on the deterministic (mean) and random components
of the model; i.e., on (1) adequacy of the linear model, which can be checked
by overfitting within appropriate classes of models, (2) tests for constancy
of variance, and (3) investigation of components of variance.
The most important parameters in a (linear) calibration are the bias
parameters (slope a and intercept b) and the squared imprecision (σ²). In
addition to reporting estimates of these parameters, it is also desirable to
report confidence intervals for the bias and imprecision parameters (a, b, σ²)
and to monitor the parameters with control charts20 (Section 4.2.12.1.9).
The formulas for these calculations can be found in most standard statistics
texts.47 48 49
It is also desirable to associate a confidence interval for the true
value x corresponding to a given measured value y. This problem is incorrect-
ly treated in a number of otherwise reputable references. A proper solution
should take into account the uncertainty in y as an estimate of its expecta-
tion and the considerations of simultaneous statistical inference.47
4.2.8.4 History of Calibration--
Documentation of each calibration must be maintained to establish a
complete history of all calibrations performed on a measurement system.
Control charts for slope, intercept, and standard error constitute partial
graphical histories of results. The history of calibration for a measurement
system should include complete documentation of:
Dates of calibration
Identification of the standards used
Support equipment, reagents, and devices used
Personnel performing calibration
Pertinent environmental conditions
Results of calibration (raw data and summary statistics)
Corrective actions taken.
4.2.8.5 Corrective Action--
To characterize the dynamics of the system, each measurement system
should undergo an initial intensive calibration phase in which it is frequent-
ly calibrated. The object is to investigate trends (e.g., time trends) in
parameters such as slope (a), intercept (b), and the imprecision (σ). Two
important uses of this information are: (1) to determine if corrective
action (adjustment or replacement of equipment) is necessary; and (2) to
schedule future maintenance and calibration.
Exceeding preestablished control limits (see Section 4.2.12.1.9) calls
for corrective action, which should be specified by QA personnel. In some
cases, these limits follow from an a priori accuracy requirement. In other
cases, they may result from observation of the measurement process over a
period of time (e.g., the intensive audit phase). If the measurement process
is judged to be satisfactory, then the average parameter estimates (e.g.,
slope, intercept, imprecision), plus or minus three of their standard errors,
may serve as future control limits.
4.2.8.6 Calibration Frequency--
It is possible to study the behavior of a measurement system over time
and to define the limits of acceptable performance. It is thus possible to
estimate the probability of unacceptable performance during a time interval
of arbitrary length between calibrations. The time between successive cali-
brations can then be chosen as the maximum time for which the probability of
exceedance is acceptably low.
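One way to carry out such a calculation is sketched below (Python; the random-walk
drift model, step size, acceptance limit, and 5 percent criterion are illustrative
assumptions only). The simulation estimates the probability that accumulated drift
exceeds the acceptable limit as a function of time since the last calibration and
reports the longest interval for which that probability remains acceptably low.

    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative drift model: daily change in the calibration slope behaves as
    # a random walk with step standard deviation sigma_step; performance is
    # unacceptable once the accumulated drift exceeds +/- limit.
    sigma_step = 0.002
    limit = 0.01
    n_days, n_sim = 60, 10000

    steps = rng.normal(0.0, sigma_step, size=(n_sim, n_days))
    drift = np.cumsum(steps, axis=1)
    exceeded = np.maximum.accumulate(np.abs(drift) > limit, axis=1)
    p_exceed = exceeded.mean(axis=0)          # probability of exceedance by day t

    # Longest interval for which the exceedance probability stays below 5 percent.
    acceptable = np.nonzero(p_exceed < 0.05)[0]
    max_interval = int(acceptable[-1]) + 1 if acceptable.size else 0
    print(f"recalibrate at least every {max_interval} days")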
In many cases the structure of a QA/QC program is largely determined by
practical considerations. For instance, many measurement processes exhibit
so much day-to-day variation that some quality control is necessary on a
daily basis. On the other hand, the need to accomplish routine project work
may largely dictate the QC effort within a day. In some cases mathematical
optimization techniques are useful in scheduling QC measurements.
These procedures will not anticipate nonroutine interferences or abrupt
changes to the measurement system, as opposed to gradual trends. The
perceptions of operating personnel are the best source of this information,
and any suspicion that such a change has occurred should be followed by
recalibration.
4.2.9 Sample Analysis
4.2.9.1 Selection of the Method of Analysis--
Selection of the method of analysis to be employed on any sample is
dictated by three major considerations:
Project data quality requirements
Cost and resource limitations
Use of the resulting data.
Attributes of a measurement method that address data quality require-
ments and that should be considered in the selection process include:
Accuracy (correctness)
Precision (reproducibility)
Applicability (usable range)
Specificity (uniqueness of response)
Reliability (overall dependability)
Detectability (minimum amount measurable above background)
Sensitivity (response per unit concentration).20
From a management perspective, the project officer must also consider
the type and number of samples to be analyzed in relation to cost considera-
tions including the elapsed time for completion of the analysis, manpower
resources, availability of required equipment, cost of materials, and the
level of training required of project personnel for completion of the
proposed project.20 In all extramural projects, there should be a clear
understanding and agreement between the project officer and the extramural
organization in these areas.
The ultimate use of the analyzed data is the third consideration in
selection of the analytical method. In interlaboratory studies where several
research groups are providing data to a common data bank or are cooperating
in a joint research project, uniformity of the analytical methodology is
particularly important to remove methodology as an experimental variable.
For these cases, use of EPA-approved methods or other standardized analytical
methods, where applicable, is recommended. Widespread use of an analytical
method usually indicates that the method is reliable and therefore tends to
support the validity of the reported test results. Conversely, the use of
little-known analytical techniques forces the data user to rely on the
judgment of the analyst, who must defend his choice of the analytical technique.
For much of the health effects research conducted by HERL, however, standard
EPA-approved methods are not available and specific analytical methods must
be developed for each project.
4.2.9.2 Documentation of Analytical Procedures--
After the measurement method has been selected, a clear, concise de-
scription of the analytical procedures should be documented containing
sufficient detail so that the method can be repeated by other technically
qualified personnel. These procedures should include a detailed description
of appropriate QC activities (see Section 4.2.12). The project officer
should choose the QC activities appropriate to a given task that will provide
the highest quality data given the existing limitations of the selected
analytical methods.
Various EPA-approved or other standard methods are available for a wide
variety of analytical needs; these should be reviewed at the outset of any
project to ascertain if they can be used directly or with modifications.
Use of EPA-approved methods or other standard sampling methods is required
where applicable. Where existing EPA-approved or standard methodology is
not applicable to a particular project, analytical procedures may be obtained
from the scientific literature, or new methods may have to be developed. In
all cases, the analytical procedures used should be written as standard
operating procedures (SOPs) and included in full or by reference in the QA
Project Plan and research protocol. All SOPs should be written in enough
detail so that the analytical procedure can be duplicated by another tech-
nically qualified individual. In cases where modifications of SOPs are
employed, the modifications may be documented either by reference to the SOP
with appropriate modifications noted or by issuance of a new SOP with modifi-
cations incorporated. The choice should be dictated by the extent to which
the method will be used.
4.2.10 Recordkeeping
High quality recordkeeping serves at least two useful functions:
(1) it makes possible the detailed reanalysis of a set of data at a future
time when the model has changed significantly, thus increasing the cost-
effectiveness of the data; and (2) it may be used in support of the experi-
mental conclusions if various aspects of the study are called into question.
This latter point is basic to scientific research. It is often possible to
interpret data in more than one way; therefore, the raw data must be available
for evaluation by qualified professionals. When recordkeeping is careless,
suspicion is quickly aroused that all other aspects of the research are of
similarly poor quality.
The cardinal principle of recordkeeping for scientific research is that
all raw data must be retained in a manner that is secure and that expedites
validation and access. Complete, permanent, and chronological records of
all project activities should be maintained. All information that might be
useful in data analysis and interpretation should be recorded. This includes,
in addition to raw data, explicit identification of equipment, reagents and
other supplies, experimental subjects (e.g., animals), protocol modifications,
and QC activities. The exact organization of the project records should be
specified in the work plan, research protocol, or QA Project Plan and is
subject to approval by the project officer.
A cross-referencing system should be used if the data are to be easily
accessible following their initial use. Such a system may be of various
levels of complexity, depending on the amount of data collected and their
potential applications. Requirements for nonclinical laboratory reports and
records and their generation, storage, retrieval, and retention on a long-
term basis have been specified in GLP regulations.18 When data are logged
by computers, it is important that adequate provision be made for redundant
and physically separate long-term storage of such records (Section 4.2.11.2).
All technical personnel should be provided with a personal notebook in
which they chronologically record all data and pertinent observations in
dark permanent ink. Where possible, formats for data should be standardized
for the project and not left to individual discretion. Where a large number
of measurements are made repeatedly, the use of preprinted standard format
data sheets is strongly recommended. When bound, these sheets often serve
most usefully as the laboratory notebook. The project officer should check
that the standard format data sheets have been designed to ensure complete
data, high productivity of technical personnel, and ease of reading the raw
data. Data coding forms should be designed in consultation with personnel
who must record and evaluate the data. In some cases, a data transfer can
be avoided by designing forms in consultation with keypunchers or other data
entry personnel. Information should be preprinted on forms when possible.
This applies to column headings, variable names, or questions on question-
naires. In some cases it is possible to code multiple observations with the
same format on a coding sheet. Variables that identify the coding form need
not be repeated on each observation, although it may be desirable to include
them on each observation when the data are computerized.
Efforts should be made to encourage the entry, not only of specific
data (e.g., weights, absorbances, volumes, atmospheric or meteorological
conditions, and status of instruments), but also of anecdotal data and
comments. Erroneous or invalidated data should be indicated in such a way
that the entry is flagged, but remains legible. Drawing a single line
through the entry, so that the value is still readable, is an acceptable
indication; this flag should be initialed and the reason for suspicion of
the datum should be recorded in the comments column. Such information may
become extremely valuable in subsequent evaluation of a completed experiment
or in initial planning of a related one.
It may be advisable to provide station, laboratory, or task data note-
books, in addition to individual notebooks or project data notebooks, to
follow the relationship among various project activities. Such records will
generally take the same form and adhere to the same recommendations as per-
sonal notebooks. Related notebooks should be cross-referenced.
Instrument logbooks should be provided in which to record all data
relating to a particular piece of equipment. This log maintains, in one
location, a chronological record of instrument operation, calibration,
maintenance, failures, and idiosyncrasies. Such a record is often useful in
determining operational trends, spare parts inventories, etc. A specific
format should be used for recording such data to minimize the possibility of
omission of important procedures or data.
Computerized data acquisition systems have many advantages, but require
close monitoring and frequent auditing for erroneous or stray electrical
signals. Many systems are able to concurrently produce printed output as
well as computer-readable output (usually magnetic tape); where possible, it
is advisable to employ both.
4.2.11 Data Management
Data management operations include collection, storage, backup, valida-
tion, transfer, reduction, analysis, interpretation, and reporting. Each of
these aspects of the data analysis/processing regime must be addressed in
the research protocol or the QA Project Plan, together with associated QA
measures and their documentation.
4.2.11.1 Data Collection—
A clear description of the manner in which raw data are collected is
essential to QA planning. These data, representing the actual measured
values in chronological sequence, subsequently may be flagged as invalid but
must never be destroyed or deleted. Manually collected data are frequently
monitored by the person recording the data. Computerized data acquisition
systems, however, lack this human check and are known to pick up spurious
voltage signals, which may introduce error.
Analysis of data trends and of the relationships between various
parameters may be used to establish windows or intervals within which valid
data are expected to occur, leading to the application of control charts or
cumulative sums charts for real-time and on-line data validation.50 51 It
must be recognized, however, that evaluating data to determine if they lie
in an expected range does not alone constitute adequate validation. Clearly,
data can remain in such an interval but still involve considerable error.
In all cases, methods for assessing the validity of the recorded raw
data must be established prior to project initiation and documented in the
QA Project Plan (see Section 4.2.11.4). It is a QA function to evaluate the
adequacy of these methods with respect to time, place, and documentation.
4.2.11.2 Data Storage and Backup--
Raw data must be stored in such a way that they are not degraded or
compromised and that any datum (value) desired may be retrieved (uniquely
identified). For computerized raw data, there must always be at least one
copy that is off-line and not machine-mounted. It is a common practice of
large computation centers to provide this service with regularly scheduled
backups for users renting on-line disk space. The users should know when
such backups are performed and how many prior versions are retained. A
project may provide heightened security by supplementing these backup measures
with further backup stored in separate locations. Duplication of user-owned
tapes is usually the responsibility of the user.
Raw task data must be securely archived. Such aspects as storage media
(e.g., paper, punched cards, paper tape, magnetic tape, or disk), conditions,
and location must be addressed. Access by authorized personnel and retention
time must also be addressed in the research protocol or the QA Project Plan.
For certain types of studies, EPA's proposed GLPs13 are quite explicit.
The storage media, conditions, and locations should be selected based
on project-specific criteria. Computer files (tape or disk) should be
"exercised" due to their rather high instability. Physically separate stor-
age of duplicate raw data sets should be considered. The retention time
will vary, according to project objectives and legal considerations, but it
should be stated clearly at the outset of any study.
Access to the archived data should be described. The fewer persons
allowed access, the less chance there is of losing the data. This may
conflict with the need to disseminate data to a wide audience. In such a
case, copies of data may be provided rather than permitting free access to
the unrecoverable raw data.
Another aspect of data storage that should be addressed is data inviol-
ability. Raw data must never be altered. If an error in data can be demon-
strated and the correct value(s) determined, then of course the data should
be corrected. However, such changes must be documented fully, including
date and reason for correction. If records are kept in laboratory notebooks,
then drawing a single line through the incorrect value and writing the
correct value next to it, in such a way that both entries are legible, is
permissible. The date, reason for change, and identification of the indi-
vidual making the change must be recorded. For computerized data files, it
is advisable to keep a separate file containing notes fully documenting data
changes. This level of security is unnecessary for the correction of key-
punch, encoding, and transcription errors.
These documentation standards also apply to any outlier detection and
disposition procedures that may be employed. It is advisable to keep a
variable on the master file to indicate whether an observation is normal, an
outlier, or aberrant for some other reason.
It is not necessary that all data sets created from the raw data set be
saved or backed up. In fact, if a second or later generation data set is
retained without sufficient documentation to explain how it was created,
then that data set can be of little or no value for QA purposes. Thus, more
important than retention of intermediate data sets or analyses is the ade-
quate documentation of the procedures used. All computer code (including
job control language) that accomplishes objectives planned in the protocol
or results to be cited in the final report should be saved and dated. Docu-
mentation must permit reconstruction of analyzed data sets and analysis
computer programs.
4.2.11.3 Data Transfers--
Data transfer changes the form or location of a data set, but not its
content. Thus, a transferred data set may be used to fully reconstruct its
originating data set. If data transfer is error free, then no information
is lost in the transfer and the input is completely recoverable from the
output. Examples of data transfer are copying the raw data from the notebook
onto a data form for keypunching, converting a written data set to punched
cards, or copying from a computer tape to disk.
An audit trail should be kept as a check on the correctness of data
transfer operations. An audit trail is an account of the data (values,
pages, keypunch forms, keypunch cards, etc.) and a verification after each
operation on the data that the number of data items fed into the process is
reliably reflected by (usually equal to) the number of data items resulting
from the process. An audit trail provides a necessary, but not sufficient,
check on the accuracy of data transfers.
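A count-based audit trail check can be as simple as the following sketch (Python;
the processing steps and record counts are hypothetical). It verifies only that
the number of data items is preserved across each transfer, which, as noted above,
is a necessary but not sufficient check.

    # Record counts after each data transfer step (illustrative values).
    counts = {
        "laboratory notebook": 480,
        "keypunch forms": 480,
        "punched cards": 480,
        "computer file": 479,     # one record lost in this hypothetical transfer
    }

    previous_step, previous_n = None, None
    for step, n in counts.items():
        if previous_n is not None and n != previous_n:
            print(f"count mismatch: {previous_step} ({previous_n}) -> {step} ({n})")
        previous_step, previous_n = step, n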
A good general rule is to minimize the number of data transfer steps in
the data processing, since the overall probability for errors increases with
the number of such transfers. Often this can be influenced by judicious
design or choice of data forms. For instance, since the reliability of key-
punchers and other data entry personnel is highly dependent upon the form
and legibility of the data they receive, it is highly desirable to initially
record the raw data on the same form used for data entry (keypunching) or in
computer-readable form. Data entry personnel should be consulted in advance
and, insofar as possible, forms should be designed to accommodate them;
standardized 80-column Fortran coding forms (6X28-7327-6 IBM) are often
desirable.
As part of the study design, an overall admissible transfer error rate
should be specified. One of the primary purposes of data validation is to
test whether this error rate has been exceeded by comparing the analysis
data set with raw data. If the transfer process has several components,
their individual error rates are not of particular concern, as long as the
composite error rate is below the desired level.
4.2.11.4 Data Validation--
Data validation has been defined by EPA as "... the process whereby
data are filtered and accepted or rejected based on a set of criteria."46
This process may include manual or computerized checks and clearly involves
specified criteria. The research protocol or QA Project Plan should state
clearly that raw data are not to be altered, and study documentation should
indicate how subsequent data sets are generated and validated.
Validation checks may include evaluation of the data with respect to
physically determined criteria (e.g., a record indicating a negative weight
is not reasonable). Similarly, as the sophistication of the model increases,
relational checks between measured parameters may also be used.
The data sets to be statistically analyzed should be compared with the
first recorded form of the data to estimate the error rate. The QA officer
may request samples from both the raw data and the analysis data set. QA
functions involve definition of error and specification of allowable error
rates. In the case of a small data set, QC checks may consist of item-by-
item verification that the error rate does not exceed the allowable limit.
For large data sets, data validation should be considered as a hypothesis-
testing problem, with type I and II error probabilities49-52 chosen by
appropriate QA personnel. A random sample of a size sufficient to achieve
the desired significance level and power should be drawn. Dodge-Romig
tables may be employed to determine sample size, if 90 percent power is
acceptable.53 This subsample is then compared item by item with the corre-
sponding raw data to determine if the error rate is acceptable. QA require-
ments may include documentation in the form of a printed copy of the subsample
and access to the raw data.
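For large data sets, the subsample comparison can be sketched as follows (Python;
the data, the seeded error count, the subsample size, and the 1 percent allowable
error rate are all hypothetical, and a one-sided exact binomial bound is used here
in place of the Dodge-Romig tables cited above).

    import numpy as np
    from scipy.stats import beta

    rng = np.random.default_rng(1)

    # Hypothetical data sets: 'raw' is the first recorded form of the data;
    # 'entered' is the keypunched analysis data set with a few seeded errors.
    raw = rng.normal(size=5000).round(3)
    entered = raw.copy()
    entered[rng.choice(raw.size, size=8, replace=False)] += 0.1

    # Draw a random validation subsample and compare it item by item with raw data.
    n = 300
    idx = rng.choice(raw.size, size=n, replace=False)
    errors = int(np.sum(entered[idx] != raw[idx]))

    # One-sided 95 percent (Clopper-Pearson) upper bound on the transfer error rate.
    upper = beta.ppf(0.95, errors + 1, n - errors)
    allowable = 0.01
    print(f"{errors} errors in {n} checked items; upper bound = {upper:.4f}")
    print("accept" if upper < allowable else "error rate may exceed the allowable limit")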
4.2.11.5 Data Reduction--
Data reduction includes all processes that transform one data set to
another in such a way that the original data set cannot be recovered from
the reduced data set. It is distinct from data transfer in that it entails
a reduction in the size (or dimensionality) of the data set and an associated
loss of information. Assumptions about the distribution of the observations
are implicit in data reduction, making it a data analysis activity. For
instance, if repeated measurements of a quantity are made in the laboratory
and summarized as a mean and standard deviation, then statistical theory can
be invoked to justify the sufficiency of these two measurements, if the data
follow a normal distribution.
If the data are reduced before analysis, the study documentation or
data management analysis scheme must clearly define the mathematical or
other processes used to obtain the reduced data set from the raw data set.
Quality assurance should address the accuracy of the mathematical operations
used in the reduction process. Permanent data reduction, resulting in
irretrievable loss of raw data, should be avoided if at all possible.
However, in some instances, the sheer volume of data that would result makes
it impractical to save each datum. In such cases data subgroups are some-
times summarized by statistics such as averages, standard deviations, and
sample sizes. In these cases the notion of raw data is broadened so that
the summary statistics are regarded as the raw data. A preliminary study
should address the adequacy of the candidate summary statistics, in terms of
the end uses of the data. For instance, a variable that can assume only
positive values may have a distribution that is skewed to the right. In some
cases a logarithmic or square root transformation might bring the distribu-
tion closer to normality. In this case the average of the transformed data
would come closer to satisfying standard statistical assumptions than would
the average of the raw data.
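The following minimal sketch (Python; the lognormal data and the choice of summary
statistics are purely illustrative) shows the kind of preliminary comparison
intended here: subgroup summaries are retained on both the raw and the log scale so
that later analyses can use the scale that better satisfies normal-theory
assumptions.

    import numpy as np

    rng = np.random.default_rng(2)

    # Hypothetical skewed, positive-valued measurements (e.g., concentrations).
    data = rng.lognormal(mean=1.0, sigma=0.8, size=200)

    # Candidate summary statistics for one data subgroup; these become the
    # "raw data" if the individual values cannot all be retained.
    summary = {
        "n": data.size,
        "mean": data.mean(),
        "sd": data.std(ddof=1),
        "log mean": np.log(data).mean(),
        "log sd": np.log(data).std(ddof=1),
    }
    for name, value in summary.items():
        print(f"{name:>8}: {value:.3f}")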
4.2.11.6 Software--
The objective of software QA is to ensure that calculator and computer
programs perform accurately. Such operations should introduce no more than
negligible error (e.g., 1 percent or less) relative to the intrinsic varia-
tion in the measured processes. For manual calculations, an example should
be given in which actual raw data are transformed and can be checked by
reviewers. If a programmable calculator is used in this process, a copy of
the programs used should be provided.
Computer programs should be designed to expedite validation. Programs
should be modular, structured, well documented, logical, and should liberally
employ comment statements. The use of widely available statistical analysis
packages such as SAS, BMD, SPSS, and MINITAB is recommended, as opposed to
writing analysis programs in FORTRAN, BASIC, or PL/I code. Such packages
are heavily used; therefore errors have been largely eliminated, and standard
documentation is widely available. Software GLPs may be found in the EPA
ADP System Documentation Standards.54
The following minimal documentation is sufficient for computerized data
manipulation or analysis:
1. Reference to system documentation (some software packages supply
this automatically);
2. A copy of the calling program and resulting output;
3. A concise, clearly written description of the operand data set and
how it derives from the raw data and the operation or analysis to
be performed; these may be embedded in the beginning of the program
as a comment statement; and
4. A data dictionary defining the variables as they pertain to the
operation or analysis as described in item 3 above; the data
dictionary may be embedded in 3.
Compliance with items 1 through 4 has no implications for validity of
the analyzed data (see Section 4.2.11.4) or appropriateness of the statisti-
cal methodology employed; they must each be addressed separately.
4.2.11.7 Data Analysis--
At every phase of a research study, from initial design to final analy-
sis and reporting, there should be at least one mathematical model that is
tentatively entertained for the data. Statistical models are a prerequisite
to proper experimental design and are necessary for statistical analysis.
The models should be written with the underlying assumptions clearly stated.
When evidence points to modification of the models or adoption of different
models, the changes and rationale should be documented.
Statistical analysis usually involves the use of study data for estima-
tion of model parameters. These estimates may be put to a variety of uses,
depending on the study objectives and the preferences of the data analysts.
If the objectives have been expressed as formal hypotheses, then statistical
tests may be based on parameter estimates. If estimation rather than infer-
ence is the goal of the study, then a confidence interval approach to sum-
marization may be adopted. In either case, the estimates are acknowledged
to contain error, due to intrinsic variation and/or factors neglected by the
model.
The approach of hypothesis testing is to ask whether the deviation of
an estimate from a given hypothesized value or range of values is plausibly
due to chance alone. If study data are sufficiently improbable in the light
of a hypothesis, then the hypothesis may not be further entertained. Proba-
bilities must also be dealt with, in an inverse sense, to construct a confi-
dence interval; i.e., a range which contains the true value of a parameter
with a certain probability.
For the statistician/data analyst, whose ultimate measurements are
probabilities or functions of them, questions of accuracy of measurements
become questions of accuracy of these probabilities. If the model is correct,
then so are the probabilities. This is an oversimplification, of course,
since there is no such thing as a totally correct model.55 A model is a
theoretical construction intended to (approximately) represent reality. In
most cases we have only our observations and lack the detailed schematic
that would allow complete evaluation of the model. To put it another way,
the infinite sample size necessary for total model validation is lacking.
This illustrates a common problem for the quality assurance of scientific
research: True reference values are not available, and accuracy, in the
sense of closeness to the correct value, cannot be determined.
However, a well-defined model does have numeric consequences that can
be checked against the data. Such checks fall under the general rubric of
goodness-of-fit tests and are the statistician's main tool for the QC of his
own measurement process. As a general rule, if it can be seen from the data
that a model assumption is false, then corrective action is required.
Some specific techniques for checking agreement between the model and
data include inspection of plots of observed and expected values or resid-
uals, goodness-of-fit tests for probability distributions, overfitting, and
analysis of replicates. Goodness-of-fit tests include chi-squared tests for
probability densities or discrete distributions of specified form and the
Kolmogorov-Smirnov test for the cumulative distribution function.49 Over-
fitting involves viewing the tentative model as embedded in a larger family
of models and testing within the larger family whether or not there is sig-
nificant evidence against the restricted (tentative) model. Ideally the
larger family is defined on the basis of suspicion of how the tentative
model might fail. For instance, in calibration problems where a straight
line through the origin is sometimes assumed (y = ax + error), the model
y = ax + b + error may be used to represent certain types of departures.
For regression problems, repeated (replicate) observations at fixed condi-
tions allow variance estimation, independent of any model. Also, it is
generally possible to estimate variance as a consequence of the model. The
two variance estimates should be consistent if the model is approximately
correct.
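The overfitting check described above can be sketched as follows (Python; the
simulated calibration data, the size of the hidden intercept, and the error
standard deviation are assumptions made only for illustration). The restricted
model y = ax is tested against the larger family y = ax + b with an F-test on the
extra parameter; replicate observations at each level would likewise support a
model-free variance estimate.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # Hypothetical calibration data generated with a small, unsuspected intercept.
    x = np.repeat(np.array([1.0, 2.0, 4.0, 8.0]), 3)     # 3 replicates per level
    y = 2.0 * x + 0.15 + rng.normal(0.0, 0.1, size=x.size)

    # Restricted (tentative) model: y = ax.
    a0 = np.sum(x * y) / np.sum(x * x)
    rss0 = np.sum((y - a0 * x) ** 2)

    # Larger family: y = ax + b (coef holds the fitted a and b).
    X = np.column_stack([x, np.ones_like(x)])
    coef, rss1, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss1 = float(rss1[0])

    # F-test for the extra parameter (the intercept).
    df1, df2 = 1, x.size - 2
    F = ((rss0 - rss1) / df1) / (rss1 / df2)
    p = stats.f.sf(F, df1, df2)
    print(f"F = {F:.2f}, p = {p:.4f}  (small p is evidence against y = ax)")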
4.2.11.8 Reporting--
The most visible product of a research task is the report of significant
findings. Publication guidelines applicable to the HERL research reports
are available56 57 and minimum technical contents for nonclinical laboratory
reports and health effects research have been promulgated13 and are shown in
Figure 4-9.
The report should be concise, complete, and consistent with standard
EPA formats.56 57 Discussion of the important technical aspects of the
research should be adequate to permit qualified professionals to repeat the
work. Adequate data should be included to permit at least partial calcu-
lation of important results. The conclusions drawn from the data and the
rationale behind those conclusions should be clearly stated. Graphical and
illustrative data correlation with supporting tables should be used whenever
possible. Well-defined error estimates should be included with all quanti-
tative values reported.
1. Name and address of the facility performing the study and the dates on
which the study was initiated and completed.
2. Objectives and procedures stated in the sponsor-approved protocol,
including any changes in the original protocol and the justification(s) for them.
3. Statistical methods employed for analyzing the data.
4. The test and control substances identified by name, chemical abstract
(CAS) number or code number, strength, purity, and composition or other
appropriate characteristics.
5. Stability of the test and control substances under the conditions of
administration and storage.
6. A description of the methods used.
7. A description of the test system used. Where applicable, the final
report must include the number of animals used, sex, body weight range,
source of supply, species, strain and substrain, age, and procedures
used for identification.
8. A description of the dosage, dosage regimen, route of administration,
and duration.
9. A description of all circumstances that may have affected the quality
or integrity of the data.
10. The name of the study director, the names of the other scientists or
professionals, and the names of all supervisory personnel involved in
the study.
11. A description of the transformations, calculations, or operations
performed on the data; a summary and analysis of the data, and a state-
ment of the conclusions drawn from the analysis.
12. The signed and dated reports of each of the individual scientists or
other professionals involved in the study.
13. The locations where all specimens, raw data, and the final report are
to be stored.
14. The final QA report should be prepared and signed by the quality assur-
ance unit.
Figure 4-9. Minimum technical report content for EPA health effects tests.13
The presentation of results should delineate the functional relation-
ship between the raw data and the tables or graphs and be understandable to
nonstatisticians. Since most scientific studies fall short of complete
representativeness, useful conclusions usually require generalizations that
tend to lie outside the realm of strict statistical justification. Thus,
the reader of the technical report should be informed of the statistical and
physical justification supporting each conclusion. The purpose(s) and
conclusion(s) of the research should be stated clearly. The estimated
errors, as well as the limits of applicability of results, should be stated
in such a way as to minimize misinterpretation. Application of the results
to alternative theories (models) should be provided, with indication of the
rationale used in reaching the stated conclusions rather than the alterna-
tive conclusions.
QC and QA activities should be detailed to permit the specialist and
nonspecialist to assess correctly the level of the QA effort invested in the
research. Subjective evaluation of the validity of the reported results and
conclusions should be possible from the data presented.
4.2.12 Internal QC Checks
The ability of the total data system to produce data of a specified
quality should be regularly evaluated to determine if corrective action (see
Section 4.2.16) is needed. Internal audits, conducted by the operating
group or organization, are used to obtain data for this evaluation.
EPA defines two types of audits.23 52 A quantitative measure of the
quality of the data produced is usually obtained through a performance
audit; a qualitative assessment of the ability of a system to produce data
of the specified quality is evaluated through a systems audit.
In either situation, the program and rationale for internal audits
should be designed based on individual components of the specific measure-
ment process and clearly planned for and budgeted into the task plans. By
using internal audits, the project officer will be able to objectively
evaluate data quality as the task progresses. A detailed description of all
internal QC checks should be included in the research protocol or the QA
Project Plan.
4.2.12.1 Performance Audits—
Performance audits should be performed by qualified technical personnel
not routinely involved in the specific task measurement process being audited.
Frequently, the performance audit can only be designed to evaluate a part of
the total data system, such as sampling, analysis, and/or data reduction.
In such a case, the audit should be designed to evaluate each subsystem to
the fullest extent possible. For example, for the Ames/Salmonella assay, an
audit could be performed by introducing several mutagens of known response
into the assay system. The identity of the test sample or samples would be
known only to the auditor. The same principle may also be applied to analyt-
ical instrumentation. In each case, the audit values are compared with
those generated by the data system(s), and conclusions are drawn to infer
the quality of the data being generated by the total system.
Performance audits are generally conducted using:
1. Reference materials, for accuracy determinations, are available
from several sources,40 43 58 most notably the National Bureau of
Standards. These may be included for analysis in various types of
measurement systems at relatively low cost with little interference
to the normal laboratory routine and with the highest possible
degree of confidence.
2. Reference devices may be obtained for which the critical parameters
are known to the auditor but not the analyst. These may be more
disruptive of laboratory operations and there is no possibility of
anonymity of the sample; however, the final result is still a
measure of the performance of the total analytical system, includ-
ing the operator.
3. Cooperative analyses, such as round-robin analyses, are useful for
estimating the precision of a measurement among several different
operators and/or laboratories. Accuracy of the measurement can
only be assessed if the analyte is a reference material.
4. Side-by-side analyses, or collaborative analyses, may be used if
important variables are not controllable in the sample.
Specific internal QC checks may be employed within the sampling or
analytical process to determine the bias and imprecision of specific project
methodology. Several examples of specific QC checks are described below.
Where applicable to HERL research projects, they should be discussed in the
QA Project Plan or the research protocol.
4.2.12.1.1 Replicates—Repeated but independent measurements of the
same sample by the same analyst at essentially the same time and under the
same conditions are called replicates. Care should be exercised in consid-
ering replicates of a portion of an analysis and replicates of a complete
analysis. For example, duplicate titrations of the same digestion are not
valid replicate analyses, although they may be valid replicate titrations.23
4.2.12.1.2 Spiked samples—Spiked samples are environmental samples to
which a known quantity of a given analyte has been added in order to evaluate
matrix effects. When spiked samples are analyzed concurrently with corre-
spondingly unaltered samples, it is possible to determine if there are
components in a sample that bias measurement values. Results of such analy-
ses are often expressed as percent recovery and may be used to correct
unaltered sample results to obtain "true" or correct values.23
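A percent recovery calculation for a spiked sample is sketched below (Python; the
concentrations are hypothetical).

    # Percent recovery for a spiked sample (illustrative values, ug/L).
    unspiked_result = 4.2     # analyte found in the unaltered sample
    spike_added = 5.0         # known quantity of analyte added
    spiked_result = 8.9       # analyte found in the spiked sample

    recovery = 100.0 * (spiked_result - unspiked_result) / spike_added
    print(f"percent recovery = {recovery:.1f}%")       # 94.0% in this example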
4.2.12.1.3 Split samples—To determine the variance between observa-
tions (replicability), between analysts or instruments in a laboratory
(intralaboratory variance), or more commonly between laboratories (interlab-
oratory variance), a quantity of homogeneous material is split into two or
more portions that are analyzed independently. The split-sample technique
may be used to determine analytical method variance in these three instances,
or it may be used to determine comparability of different analytical methods.
The split-sample technique is often employed when there are no reference
materials available; in such cases, the mean of reported results, recalcu-
lated after exclusion of outliers and technically flawed data, is often
represented as the "target" or reference value, which is then used to deter-
mine the bias of individual reported values.
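The following minimal sketch (Python; the reported values and the simple
two-standard-deviation outlier screen are illustrative assumptions) shows how a
target value and individual biases might be computed from split-sample results
when no reference material is available.

    import numpy as np

    # Hypothetical split-sample results reported by eight laboratories (ug/L).
    results = np.array([10.2, 9.8, 10.5, 10.1, 9.9, 14.7, 10.3, 10.0])

    # Exclude outliers (here a simple 2-standard-deviation screen) and treat the
    # recalculated mean as the "target" or reference value.
    keep = np.abs(results - results.mean()) <= 2.0 * results.std(ddof=1)
    target = results[keep].mean()

    bias = results - target          # bias of each reported value
    print(f"target value = {target:.2f}")
    print("biases:", np.round(bias, 2))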
4.2.12.1.4 Blanks—A basic principle of the scientific method is that
the baseline or nonperturbed state must be well characterized if a change
from baseline state is to be accurately observed. In chemical analysis, the
blank is often a pure sample component (e.g., distilled water) that does not
give a positive measured response. Chemical blanks can be classified as
reagent blanks or total method blanks.
Reagent blanks—The first step the analyst must take is to deter-
mine the background of each of the reagents used in a given method
of analysis. The conditions for determining the background must
be identical to those used throughout the analysis, including the
detection system. If the reagents are found to contain substances
that interfere with a particular analysis, they should be treated
to remove interferences or other satisfactory reagents must be
found.23 35
Method or analytical blanks—After determining the individual
reagent blanks, the analyst must determine if the cumulative blank
interferes with the analyses. Determination of a method blank is
accomplished by following the normal analytical procedure, step by
step, including all of the reagents in the quantity required by
the method. If the cumulative blank interferes with the deter-
mination, steps must be taken to eliminate or reduce the interfer-
ence to a level that will permit this combination of reagents to
be used. If the interferent cannot be eliminated, the magnitude
of the interference must be corrected for when calculating the
concentration of specific constituents in the samples being
analyzed.35
A method or analytical blank should be determined for each analytical
procedure. The actual number of blanks analyzed is determined by the method
of analysis and the number of samples being analyzed at a given time. In
some methods, such as automated analytical procedures, the method blank is
automatically and continuously compensated for by a continuous flow of
reagents passing through the detector. In other procedures, such as gas
chromatographic determination of pesticides, a method blank is run with each
series of samples analyzed. Analysis of one blank for every nine samples is
often recommended.
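A method blank correction of the kind described above can be sketched as follows
(Python; the instrument readings and the calibration slope are hypothetical values
used only for illustration).

    import numpy as np

    # Hypothetical readings from one analytical batch.
    method_blanks = np.array([0.012, 0.015, 0.011])    # full-procedure blanks
    samples = np.array([0.284, 0.351, 0.198])          # sample readings

    # Correct each sample for the cumulative (method) blank before converting
    # to concentration with the calibration slope.
    blank = method_blanks.mean()
    calibration_slope = 0.042                          # response units per ug/L
    concentrations = (samples - blank) / calibration_slope
    print("corrected concentrations (ug/L):", np.round(concentrations, 1))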
In biological experiments, the term control is analogous to the method
or analytical blank described for chemical analysis. The control is an
organism or part of an organism that is handled in a manner identical to the
experimental group, except that it does not receive the treatment hypothesized
to cause the response of interest. Comparison of responses of experimental
subjects with those of controls is essential to establish levels of signif-
icance for observed results. A statistician should be consulted to determine
the number of blanks or controls necessary to meet project goals.
4.2.12.1.5 Internal standards—The method of internal standards is
often employed in the analysis of environmental or biological samples to
compensate for possible matrix effects. In this method, a species different
from, but very similar in analytical response to, the analyte of interest is
added in a fixed and known amount to all samples and standards analyzed
(including calibration standards). The analytical response of both the
analyte of interest (RA) and of the internal standard (RIS) is recorded for
each sample or standard, and the ratio of these responses (i.e., RA/RIS) is
used to obtain quantitative estimates of the amount of analyte present.
This method also minimizes errors due to minor fluctuations or variations in
experimental conditions between individual analyses (e.g., instrumental or
electronic instability, changes in environmental conditions, minor variations
in instrumental settings or configuration). The basic assumption required
is that matrix effects and minor experimental variations will influence the
analytical responses of the analyte and the internal standard in a very
similar, if not identical, manner.
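The arithmetic of the internal standard method is sketched below (Python; the
amounts and instrument responses are hypothetical, and RA and RIS follow the
notation used above).

    import numpy as np

    # Hypothetical calibration standards: analyte amount, analyte response (RA),
    # and internal-standard response (RIS); the same amount of internal standard
    # is added to every standard and sample.
    amount = np.array([1.0, 2.0, 5.0, 10.0])
    RA = np.array([120.0, 245.0, 610.0, 1190.0])
    RIS = np.array([500.0, 510.0, 495.0, 505.0])

    # Calibrate on the response ratio rather than the raw analyte response.
    ratio = RA / RIS
    slope, intercept = np.polyfit(amount, ratio, 1)

    # Quantitate an unknown sample from its measured responses.
    sample_ratio = 365.0 / 502.0
    sample_amount = (sample_ratio - intercept) / slope
    print(f"estimated analyte amount = {sample_amount:.2f}")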
Internal standards can be prepared from any stable, well-characterized
materials that have the desired properties described above. It is important
that they be easy to add uniformly and precisely and that no appreciable
amount (free or combined) be present in the samples under investigation
prior to addition of the internal standard. Examples of appropriate internal
standard materials include:
Chemical or biological samples that have been thoroughly charac-
terized by repeated, independent analyses.
Chemical or biological materials validated against known reference
materials.
Validated reference materials, including NBS-SRMs (see Appendix A).
4.2.12.1.6 Quality control samples—QC samples are samples containing
known and verified concentrations of the analyte of interest that are pre-
pared independent of calibration standards and are analyzed at frequent
intervals throughout routine analysis. Such analyses provide continuous
internal evaluation of the measurement process. If practical considerations
do not prohibit the practice, QC samples should be inserted by laboratory
management into the usual routine analytical stream of samples without the
knowledge of the analyst(s). In this way, more objective evaluation of
everyday performance can be assured. QC standards may be prepared from
materials suitable for calibration standards (Section 4.2.8). If no reli-
able sources of QC samples can be identified for a given project, samples
routinely analyzed can be split and the duplicates analyzed independently
(see Section 4.2.12.1.3). However, such analyses provide information on
precision only and not bias.
4.2.12.1.7 Surrogate samples—In the course of testing, it may be
impossible or prohibitively expensive to measure a variable of interest. In
cases where a relationship (ratio) can be established between the variable
of interest and a second variable that is easier to measure, the second
variable can be measured as a surrogate sample. An example is the measure-
ment of coliform bacteria as a surrogate for measurement of fecal pathogens
in polluted waters.
4.2.12.1.8 Calibration standards—Calibration standards should be of
the highest quality available, fully characterized, and, where possible,
should be traceable to NBS standards. A detailed discussion of calibration
standards and procedures is given in Section 4.2.8.
4.2.12.1.9 Control charts—When there is a repeated measurement of the
same material, such as a reference material, or replicate measurements of
similar samples, this information can be conveniently presented and analyzed
for QC purposes using control charts (see Figure 4-10).20 Samples of fixed
size are taken periodically from the process, and some measures of central
tendency (e.g., mean, X̄) and dispersion (e.g., range, R) are plotted against
time. If these measures are far from a target value, it is suspected that
the process is out of control. At this point, the process may be interrupted
and a search for assignable cause begun.
The X̄ chart and R chart procedures may be implemented with relative
ease.55 The X̄ chart approach will declare a process out of control with
respect to a target value if
|X̄ - μ| > KS,
where K is a constant (frequently, K = 3), S is the standard error of the
mean X̄ of the sample (as estimated from historical data), and μ is a target
value. The R chart approach will declare the process out of control with
respect to variability if the range (R) of the sample deviates from a target
value derived from past data by more than 3σ(R), where σ(R) is the standard
deviation of R estimated from data or computed from a standard value for the
process standard deviation.
Figure 4-10. Sample X̄ control chart: subgroup means plotted against time in
days, with the overall mean, upper and lower warning limits, and upper and
lower control limits indicated.20
Once established, these control charts can be maintained by the labora-
tory personnel performing the measurements. In addition to showing graph-
ically extreme variations in the measurement process, such charts give
useful information on trends in a measurement process over time.20 35
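Construction of the X̄ chart limits can be sketched as follows (Python; the daily
subgroups, subgroup size, and target values are simulated for illustration, and the
limits use the K = 3 rule described above).

    import numpy as np

    rng = np.random.default_rng(4)

    # Hypothetical daily QC measurements: one subgroup of 4 replicates per day.
    days = 26
    subgroups = rng.normal(loc=10.0, scale=0.2, size=(days, 4))

    xbar = subgroups.mean(axis=1)                        # subgroup means
    R = subgroups.max(axis=1) - subgroups.min(axis=1)    # subgroup ranges

    # Control limits from historical data using the K = 3 rule.
    target = xbar.mean()
    S = xbar.std(ddof=1)       # standard error of a subgroup mean, from history
    UCL, LCL = target + 3.0 * S, target - 3.0 * S

    out = np.nonzero((xbar > UCL) | (xbar < LCL))[0] + 1   # 1-based day numbers
    print(f"X-bar control limits: {LCL:.3f} to {UCL:.3f}")
    print("out-of-control days:", out.tolist())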
4.2.12.1.10 Reagent checks—All chemicals, biological organisms, and
equipment used in an experiment should be of a quality necessary to meet
project goals. The desired level of quality must be specified in the re-
search protocol and QA Project Plan. There should be a well-established and
well-documented program for screening chemicals, animals, and equipment when
they are received from the supplier, and procedures for recertification of
unstable reagents should also be developed where appropriate (see Sec-
tion 4.2.4.4).
4.2.12.2 Systems Audits--
Internal systems audits are initiated by personnel within the organiza-
tion performing the research, but are conducted by individuals not directly
involved in project activities. A professional qualitative evaluation, re-
sulting from observations of and discussions with project personnel, is made
of the capability of a data system (including instruments, personnel, organi-
zation) to produce the specified data quality. Use of checklists or written
questionnaires covering QA/QC items listed in Figure 4-1, as appropriate, is
recommended to allow for a more complete, objective assessment during the
systems audit.
The purpose of a systems audit is to provide constructive evaluation of
measurement process data quality and to identify areas where improvements
can be made. If this intent is followed by auditors and made clear from the
beginning, project personnel will be more likely to cooperate in audit and
corrective action procedures.
Systems audits are particularly useful for evaluating overall quality
in measurement programs where no reliable quantitative probes exist for
conducting performance audits. In general, it is recommended that if
resources permit, a systems audit be conducted prior to the initiation of
any measurement program to evaluate the capability of the system for pro-
ducing data of the quality required by the program objectives.
4.2.13 External Quality Assurance for Research Projects
For measurement activities supported by HERL, objective evaluation of
task performance and of adherence to research protocols, work plans, and QA
Project Plans is accomplished through periodic audits by external nontask
personnel. Such audits may be initiated at the request of the project
officer, the QA officer, or the QAMS. They may be conducted on new projects
requisite to funding, or on ongoing projects, either in response to identi-
fied problems in performance or as part of a routine evaluative program.
4.2.13.1 External Performance Audits—
Quantitative measurements and comparisons provide the best objective
estimates of data quality. The performance audit directly evaluates the
measurement aspects of the laboratory research operation being audited. The
laboratory is usually given samples to analyze, and the results are
compared with expected values and judged for accuracy and precision. A
pivotal issue in the proper interpretation of audit results is whether or
not high quality reference standards are available (see Section 4.2.8.1).
The National Bureau of Standards has developed a series of environmen-
tally related Standard Reference Materials (NBS-SRMs).40 44 A current
catalog of NBS-SRMs may be obtained from:
Office of Standard Reference Data
National Bureau of Standards
Washington, DC 20234.
In addition, the World Health Organization maintains information on world-
wide sources of biological standards.58
Appropriate use of available reference materials by the auditor can
provide an objective measure of specific parameter data quality. Unfortunate-
ly, for many environmentally related measurements in HERL research projects,
reliable reference materials do not exist for assessment of accuracy (i.e.,
deviation from a true value). However, in these cases, estimates of analytical
variability (precision) can still be obtained.
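As a simple illustration of both situations, the sketch below (hypothetical
audit values) summarizes accuracy as the mean percent difference from an
accepted reference value and precision as the relative standard deviation of
replicate results.

    # Sketch: summarizing external performance-audit results.
    # The reference value and reported results are hypothetical.

    import statistics

    reference_value = 50.0                  # accepted value of the audit sample
    reported = [48.9, 51.2, 50.4, 49.7]     # laboratory's replicate results

    # Accuracy: mean percent difference from the reference value
    mean_pct_diff = statistics.mean(
        100.0 * (x - reference_value) / reference_value for x in reported)

    # Precision: relative standard deviation of the replicates
    rsd = 100.0 * statistics.stdev(reported) / statistics.mean(reported)

    print(f"Mean percent difference (accuracy): {mean_pct_diff:+.1f}%")
    print(f"Relative standard deviation (precision): {rsd:.1f}%")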
After completion of the performance audit, it is useful to present the
results to the audited laboratory for review. This review can be a starting
point for cooperative identification of sources of measurement weakness and
subsequent corrective action.
4.2.13.2 External Systems Audits--
Systems audits consist of an evaluation of the various components of a
research operation, principally through inspection, and may be conducted on
any research or monitoring project. The first step of a systems audit is an
investigation of the laboratory's activities via inspection of protocols,
standard operating procedures, proposals, reports, and scientific publica-
tions. After inspection of these materials and contact with key laboratory
personnel to clarify questions, an onsite inspection may be performed. The
onsite systems audit consists of inspection of facilities and operations,
interviews with laboratory personnel, and reviews of key operations and
documentation. The systems audit may be scored using a checklist comparing
actual laboratory practices with some standard such as FDA or proposed EPA
GLPs.13 18
The objective of the onsite qualitative systems audit is to assess and
document: facilities; equipment; personnel; recordkeeping; data validation
and management; operation, maintenance, and calibration procedures; and
reporting aspects of the total QC program for a project. The review should:
Identify existing system documentation; i.e., maintenance manuals,
organizational structure, operating procedures.
Evaluate the adequacy of the procedures as documented.
Evaluate the degree of use of and adherence to the documented
procedures in day-to-day operations, based on observed conditions
and a review of applicable records on file.
From qualitative measures of data quality, an auditor independent of
the task organization can assess the suitability of the facilities and
operations to meet project goals and identify specific areas where correc-
tive actions may be implemented.
4.2.14 Preventive Maintenance
To ensure long-term data quality in a cost-effective manner, a rational
preventive maintenance program must be followed.23 This assumes importance
roughly in proportion to the amount of instrumental data recorded. For both
routine measurement programs and basic research efforts, an effective pre-
ventive maintenance program minimizes and controls equipment downtime and
therefore extends the completeness of the data. The preventive maintenance
program should include scheduling, performance, and recordkeeping.
A preventive maintenance schedule should be developed based on the
effect of equipment failure on overall data quality, any relevant site-
specific effects, and equipment reliability. With scheduled maintenance,
extended laboratory use of specific items can be planned with greater
reliability than when maintenance occurs only after equipment failure.
This schedule should be available
to the personnel performing the maintenance as well as the personnel using
the equipment.
Preventive maintenance should be performed by qualified technicians,
using accepted, documented procedures. These procedures should be written
as SOPs and included in the QA Project Plan, research protocol, or work plan
either in full or by reference. Copies should be on file with the project
officer and with key project personnel. The specific service should be
based on the considerations noted in the preceding paragraph and should be
known to both the user and maintenance groups. A predefined set of data
should be obtained before and after the maintenance activities to permit
equipment performance evaluation. Calibration (see Section 4.2.8) should be
performed following all major maintenance activities.
Documentation of all maintenance activities—scheduled or not—is
essential to monitoring and documenting data quality. A bound notebook (see
Section 4.2.4.3) should be kept with each instrument as a record of its
maintenance history. A detailed description of all adjustments made and
parts replaced should be recorded. If the notebook is the multicopy type,
one of the copies should be kept by the maintenance group for analysis.
This analysis may include such considerations as mean time between failures
(MTBF) for specific components, MTBF analysis for total systems (individual
and laboratory-wide), and development of an onsite spare parts inventory to
cost-effectively reduce equipment downtime. Where possible, checklists
should be used to ensure and document thoroughness of maintenance activities.
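For example, the maintenance group's analysis of these records might begin
with a tabulation like the sketch below (hypothetical failure dates), which
estimates the mean time between failures for each component from its logged
failure dates.

    # Sketch: estimating mean time between failures (MTBF) for instrument
    # components from a maintenance notebook.  Dates are hypothetical.

    from datetime import date

    failure_log = {
        "detector lamp": [date(1982, 1, 5), date(1982, 4, 2), date(1982, 7, 9)],
        "pump seal":     [date(1982, 2, 1), date(1982, 8, 15)],
        "recorder pen":  [date(1982, 6, 30)],
    }

    for component, failures in failure_log.items():
        intervals = [(later - earlier).days
                     for earlier, later in zip(failures, failures[1:])]
        if intervals:
            mtbf = sum(intervals) / len(intervals)
            print(f"{component}: MTBF = {mtbf:.0f} days "
                  f"(from {len(intervals)} interval(s))")
        else:
            print(f"{component}: only one failure logged; MTBF not yet estimable")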
4.2.15 Specific Routine Procedures for Assessing Data Quality
For each major measurement variable, including all pollutant measure-
ment systems, the QA Project Plan or research protocol must describe the
routine procedures used to assess the imprecision, bias, and completeness of
the measurement data. These procedures should include the equations used to
calculate imprecision, bias, and completeness, and the methods used to
gather data for the imprecision and bias calculations.
At the heart of the Agency QA program is the realization that measure-
ments necessarily contain error, and that the magnitude of such errors
should be estimated. This same logic applies to estimates of imprecision
and bias, which should not only be reported, but should be accompanied by
confidence intervals at a stated confidence level. An approximate 100(1 - α)%
confidence interval for bias in the one-sample case is given by (ȳ - T) ± ts/√n,
where n is the sample size, ȳ and s are the sample mean and standard deviation,
T is the true value, and t is the upper 100(1 - α/2) percentage point for a
t distribution with n-1 degrees of freedom. An approximate 100(1 - α)%
confidence interval for squared imprecision (σ²) is [(n-1)s²/χ²(n-1, α/2),
(n-1)s²/χ²(n-1, 1-α/2)], where χ²(m,c) is the upper 100c percentage point of
a chi-square variable with m degrees of freedom.49
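The sketch below applies these two formulas to hypothetical replicate
measurements of a sample with known true value T, and also computes percent
completeness using the commonly applied definition of valid results obtained
divided by results planned. The scipy library is assumed to be available for
the t and chi-square percentage points.

    # Sketch: approximate 95% confidence intervals for bias and imprecision
    # from n replicate measurements of a sample with known true value T.
    # The data are hypothetical; scipy is assumed available for quantiles.

    from statistics import mean, stdev
    from scipy import stats

    y = [10.4, 9.8, 10.1, 10.6, 9.9]   # replicate measurements
    T = 10.0                           # true (reference) value
    alpha = 0.05
    n = len(y)
    ybar, s = mean(y), stdev(y)

    # Bias: (ybar - T) +/- t * s / sqrt(n), with n-1 degrees of freedom
    t = stats.t.ppf(1 - alpha / 2, n - 1)
    half = t * s / n ** 0.5
    bias_low, bias_high = ybar - T - half, ybar - T + half

    # Squared imprecision (sigma^2):
    # [(n-1)s^2 / chi2(n-1, alpha/2), (n-1)s^2 / chi2(n-1, 1-alpha/2)],
    # where chi2(m, c) is the upper 100c percentage point.
    var_low = (n - 1) * s**2 / stats.chi2.ppf(1 - alpha / 2, n - 1)
    var_high = (n - 1) * s**2 / stats.chi2.ppf(alpha / 2, n - 1)

    # Completeness: valid results obtained as a percentage of results planned
    planned, valid = 6, n
    completeness = 100.0 * valid / planned

    print(f"Bias estimate {ybar - T:+.3f}, 95% CI ({bias_low:.3f}, {bias_high:.3f})")
    print(f"95% CI for sigma^2: ({var_low:.4f}, {var_high:.4f})")
    print(f"Completeness: {completeness:.0f}%")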
Formulas for confidence intervals for bias and precision parameters in
regression situations can be found in standard statistics references.49 The
special case of simple linear regression is also covered by Hunter47 (see
Section 4.2.8.3). An example of this case is provided in Appendix B. Form-
ulas for other situations should be determined in consultation with a
statistician.
There should be an explicit statement of how the QC and QA data
are stored and how they are used in evaluating overall project data quality.
The project officer should include appropriate results from: (1) internal
performance audits (Section 4.2.12.1), (2) external performance audits
(Section 4.2.13.1), and (3) calibration checks (Section 4.2.8).
4.2.16 Feedback and Corrective Action
For each project, a system for detecting, reporting, and correcting
problems that may be detrimental to data quality must be established. The
feedback and corrective action system chosen should accommodate the need for
quick response, thorough communication, and documentation of the problem and
its solution.
The first priority in establishing a corrective action system for a
project is to clearly define the management line authority for dealing with
all corrective action issues. This line authority should be shown in the
project organization chart (Section 4.2.3.1) or in a separate chart depend-
ing on the complexity of the research project. The assignment of authority
for authorization and approval of the corrective action (once the action has
been taken) should be clearly defined for each major measurement method.
The project officer should also hold regular summary briefings with
technical personnel to discuss problems encountered in the research project
and to make the project personnel aware of how their individual contributions
to the project affect overall data quality. Such briefings should take
place during the initial project planning phases and should be continued
periodically throughout the duration of the project.
These activities provide an excellent opportunity to establish and
maintain active employee-management feedback. Since bench-level personnel
should be the best observers of routine task operations, they are the most
likely to detect disturbances that may affect data quality. With effective
feedback, management can quickly become aware of fluctuations that might
otherwise go undetected.
Additional feedback systems should be established when facilities,
equipment, or supplies are in common use by several researchers. For exam-
ple, the discovery of an impure substance by one investigator should be
communicated at once to all other users of the particular substance. This
can be facilitated by the use of central stockroom records. Similarly, if
the Animal Care Coordinator finds that a shipment of animals is diseased and
that contamination of other test animals may result, the Coordinator should
notify all researchers using the animal care facility of the possible
problems.
A description of all problems detected, the solutions devised, correc-
tive actions taken and estimates of the effect of the problems on data
quality should be made available to appropriate management in regular written
QA reports (Section 4.2.17). The timing of these reports should be estab-
lished based on the duration and nature of the research project.
4.2.17 QA Reports to Management
To allow routine evaluation of data quality, provision for adequate QA
reporting must be made for all intramural and extramural environmentally
related measurement programs. It is the responsibility of the project
officer and the QA officer to establish QA reporting requirements for indi-
vidual extramural projects prior to project initiation. For intramural
tasks, QA reporting requirements must be included in each QA Project Plan at
the division, branch, section, or group level. These requirements should be
established by the appropriate functional manager (i.e., division director,
branch or section chief, or group leader) in consultation with the QA officer.
Projects of short duration (1 year or less) may require only a final QA
report. Projects of longer duration may require periodic (e.g., quarterly)
QA reports. It is recommended that QA reports be separate from other required
project reports. They should contain such information as:
Changes in the QA Project Plan
Significant data quality problems and status of corrective actions
Results of performance audits
Results of systems audits
Results of interlaboratory or other cooperative studies
Assessment of data quality in terms of precision, accuracy, com-
pleteness, representativeness, and comparability, where appropriate
Quality-related training efforts
Work is currently underway within the HERL QA organization to establish
Laboratory-wide QA reporting guidelines and to incorporate routine QA reporting
into the existing computerized Management Information System.
4.3 REFERENCES
1. U.S. Environmental Protection Agency, Environmental Protection Agency
(EPA) Quality Assurance Policy Statement, Administrator's Memorandum,
May 30, 1979.
2. U.S. Environmental Protection Agency, Quality Assurance Requirements
for All EPA Extramural Projects Involving Environmental Measurements,
Administrator's Memorandum, June 14, 1979.
3. U.S. Environmental Protection Agency, Strategy for the Implementation
of the EPA's Mandatory Quality Assurance Program: FY1980 and FY1981,
QAMS-001/80, Office of Research and Development, Washington, DC,
March 1980.
4. U.S. Environmental Protection Agency, Guidelines and Specifications for
Preparing Quality Assurance Program Plans, QAMS-004/80, Office of
Research and Development, Washington, DC, April 1980.
5. U.S. Environmental Protection Agency, Interim Guidelines and Specifica-
tions for Preparing Quality Assurance Project Plans, QAMS-005/80,
Office of Research and Development, Washington, DC, December 1980.
6. U.S. Environmental Protection Agency, Guidelines and Specifications for
Implementing Quality Assurance Requirements for EPA Contracts and Inter-
agency Agreements Involving Environmental Measurements, QAMS-002/80,
Office of Research and Development, Washington, DC, May 1980.
7. U.S. Environmental Protection Agency, Quality Assurance Requirements for
Contracts over $10,000, Acting Director's Memorandum, Quality Assurance
Management Staff, March 29, 1982.
8. U.S. Environmental Protection Agency, Guidelines and Specifications for
Implementing Quality Assurance Requirements for Research Grants Involv-
ing Environmental Measurements, QAMS-003/80/01, Office of Research and
Development, Washington, DC, April 1981.
9. U.S. Environmental Protection Agency, Guidelines and Specifications for
Implementing Quality Assurance Requirements for Demonstration Grants
and Cooperative Agreements Involving Environmental Measurements,
QAMS-003/80/02, Office of Research and Development, Washington, DC,
April 1981.
10. U.S. Environmental Protection Agency, Guidelines and Specifications for
Implementing Quality Assurance Requirements for State and Local Assist-
ance Grants Involving Environmental Measurements, QAMS-003/80/03,
Office of Research and Development, Washington, DC, April 1981.
11. U.S. Environmental Protection Agency, Quality Assurance Requirements
for Financial Assistance, Acting Director's Memorandum, Quality
Assurance Management Staff, March 30, 1982.
12. U.S. Environmental Protection Agency, Quality Assurance Requirements
for Cooperative Agreements, Acting Director's Memorandum, Quality
Assurance Management Staff, July 16, 1982.
13. U.S. Environmental Protection Agency, Proposed Good Laboratory Practice
Standards for Health Effects, Title 40, Code of Federal Regulations,
Part 772, Federal Register, 44(91), May 9, 1979, p. 27362.
14. Quality Assurance Program Plan for the Health Effects Research Labora-
tory, U.S. Environmental Protection Agency, Research Triangle Park, NC,
QA-1-82/03, September 1982.
15. U.S. Environmental Protection Agency, Health Effects Research Labora-
tory, Development of Quality Assurance Plans for Research Tasks,
EPA-600/1-78-012, Research Triangle Park, NC, 1978.
16. U.S. Environmental Protection Agency, Quality Assurance Research Plan,
FY 1978-81, EPA-600/8-77-008, Washington, DC, July 1977.
17. U.S. Environmental Protection Agency, Quality Assurance Guidelines for
Biological Testing, EPA-600/4-78-043, Las Vegas, NV, August 1978.
18. Department of Health, Education and Welfare, Food and Drug Administra-
tion, Nonclinical Laboratory Studies Good Laboratory Practice Regula-
tions, Federal Register, 43(247), December 22, 1978, pp. 59985-60025.
19. Proposed Health Effects Test Standards for Toxic Substances Control Act
Test Rules, Federal Register, May 9, 1979, p. 27334.
20. Inhorn, S. L., ed., Quality Assurance Practices for Health Laboratories,
American Public Health Association, 1978.
21. U.S. Environmental Protection Agency, HERL Guidelines for the Prep-
aration of Protocols for Intramural Research Tasks, QA-2-81/00, Health
Effects Research Laboratory, Research Triangle Park, NC, September 1981.
22. U.S. Environmental Protection Agency, HERL Guidelines for the Prep-
aration of Protocols for Intramural Support Tasks, QA-3-81/00, Health
Effects Research Laboratory, Research Triangle Park, NC, September 1981.
23. U.S. Environmental Protection Agency, Quality Assurance Handbook for
Air Pollution Measurement Systems, Vol I - Principles, EPA 600/9-76-005,
Research Triangle Park, NC, March 1976.
24. Rhodes, R. C. , Components of variation in chemical analysis, in Valida-
tion of the Measurement Process, J. R. DeVoe (ed.), American Chemical
Society Symposium Series 63, Washington, DC, 1977.
25. Whitehead, T. P. , Quality Control in Clinical Chemistry, New York:
John Wiley and Sons, 1976.
26. Box, G. E. P., W. Hunter, and J. S. Hunter, Statistics for Experiments,
New York: John Wiley and Sons, 1978.
27. U.S. Environmental Protection Agency, Quality Assurance Project Plan for
Inhalation Exposure Program, Health Effects Research Laboratory, Research
Triangle Park, NC, July 1981.
28. (a) Public Law 91-596, Occupational Safety and Health Act of 1970
(December 29, 1970).
(b) Occupational Safety and Health Manual, U.S. Environmental Protec-
tion Agency (January 8, 1976).
(c) Safety Management Manual, U.S. Environmental Protection Agency,
TN 1440.1 (December 4, 1972).
29. U.S. Environmental Protection Agency, Health Effects Research Labora-
tory, Environmental Assessment—Short-Term Tests for Carcinogens, Muta-
gens and other Genotoxic Agents, EPA-625/9-79-003, Research Triangle
Park, NC, July 1979.
30. Sexton, N., L. E. Myers, and L. 0. Claxton, Quality assurance in bio-
logical testing, Proc. 35th Annual Amer. Society Quality Control,
pp. 716-722, 1981.
31. Green, E. L. (ed.), Biology of the Laboratory Mouse, 2nd Edition, New
York: McGraw-Hill, 1966.
32. U.S. Environmental Protection Agency, Animal Management Quality
Assurance Plan, Health Effects Research Laboratory, Research Triangle
Park, NC, March 1981.
33. (a) Federal Register, Vol. 40, No. 50, March 13, 1975.
(b) Declaration of Helsinki, Recommendations guiding doctors in clin-
ical research, Jour, of American Med. Assoc., 197(11):32, Septem-
ber 12, 1966.
(c) The Institutional Guide to PHEW Policy on Protection of Human
Subjects, U.S. Government Printing Office, 1972, #0-445-427.
34. Lentzen, D. E., D. E. Wagoner, E. D. Estes, and W. F. Gutknecht,
IERL-RTP Procedures Manual Level 1 Environmental Assessment (Second
Edition), EPA-600/7-78/201. October 1978.
35. U.S. Environmental Protection Agency, Environmental Monitoring and Sup-
port Laboratory, Handbook for Analytical Quality Control in Water and
Wastewater Laboratories, EPA-600/4-79-019, Cincinnati, OH, March 1979.
36. Title 49, Code of Federal Regulations.
37. Federal Express Service Guide, 2nd Quarter, 1980.
38. Guide for Handling Hazardous Materials, United Parcel Service, March
1980.
39. Domestic Mail Manual, Section 124.28.
40. National Bureau of Standards, NBS Standard Reference Materials Catalog
1979-80 Edition, NBS Special Publication 260, U.S. Department of Commerce,
Washington, DC, April 1979.
41. Measurement principle and procedure for the measurement of nitrogen
dioxide in the atmosphere (gas phase chemiluminescence), Title 40, Code of
Federal Regulations, Part 50, Federal Register, December 2, 1976,
p. 52688.
42. National Archives and Records Service, Traceability requirements for
calibration gases, Title 40, Code of Federal Regulations, Part 60.13
43. American Society for Testing and Materials, Annual Book of ASTM
Standards, Philadelphia, PA, annual publication.
44. National Bureau of Standards, NBS Standard Reference Materials for
Environmental Research Analysis and Control, U.S. Department of Com-
merce, Washington, DC.
45. Scaringelli, E. P., A. E. O'Keefe, E. Rosenberg, and J. P. Bell, Prep-
aration of known concentrations of gases and vapors with permeation
devices calibrated gravimetrically, Analytical Chemistry, 42(8): July
1970.
46. Federal Register, December 14, 1977, p. 62971.
47. Hunter, J. S. , Calibration and the straight line: Current statistical
practices, J. Assoc. Off. Anal. Chem. , 64(3): 574-583, 1981.
48. Garden, J. S. , D. G. Mitchell, W. N. Mills, Nonconstant variance
regression techniques for calibration-curve-based analysis, Analytical
Chemistry, 52:2310-2315, 1980.
49. Steel, R. G. D. , and J. H. Torrie, Principles and Procedures of Statis-
tics, New York: McGraw-Hill, 1960.
50. Hunt, W. F. , G. Akland, W. Cox, T. Curran, N. Frank, S. Coranson,
P. Ross, H. Sauls, V. Suggs, USEPA Intra-Agency Task Force Report on
Air Quality Indicators, EPA-450/4-81-015, February 1981.
51. Clayton, C. A. C. , L. E. Myers, S. B. White, A. V. Rao, B. V. Alexander,
Investigation of Statistical Techniques for Environmental Quality Moni-
toring Data, prepared for Office of Planning and Management, Environ-
mental Protection Agency, Contract #68-01-6323, July 1981.
52. U.S. Environmental Protection Agency, Quality Assurance Handbook for
Air Pollution Measurement Systems, Vol II - Ambient Air Specific Methods,
EPA-600/4-77-027a, Research Triangle Park, NC, 1977.
53. Dodge, H. F., and H. G. Romig, Sampling Inspection Tables, New York:
John Wiley & Sons, 1959.
54. U.S. Environmental Protection Agency, ADP System Documentation Standards,
Management Information and Data Systems Division, Washington, DC,
February 1980.
55. Box, G. E. P., Science and statistics, J. Amer. Stat. Assoc. , 71(356):
791-799, December 1976.
56. U.S. Environmental Protection Agency, Handbook for Preparing Office of
Research and Development Reports, EPA-600/9-76-001, 1976.
57. U.S. Environmental Protection Agency, Health Effects Research Labora-
tory, Health Effects Research Laboratory Procedures for Publishing
Office of Research and Development Technical and Scientific Materials,
Research Triangle Park, NC, July 1977.
58. World Health Organization, Biological Substances: International Stand-
ards, Reference Preparations, and Reference Reagents, Geneva: World
Health Organization, 1977.
APPENDIX A
SELECTED NATIONAL BUREAU OF STANDARDS STANDARD
REFERENCE MATERIALS
Source: National Bureau of Standards, NBS Standard
Reference Materials for Environmental
Research Analysis and Control, U.S. Depart-
ment of Commerce, Washington, D.C.
Clinical Laboratory Standards

These SRM's are intended for use in calibrating apparatus and validating analytical methods used in
clinical and pathological laboratories, and to assist manufacturers of clinical products in meeting the chemical
and physical specifications required for clinical chemicals. (For details on SRM's 930D and 931b, see
Spectrophotometric Filters, page 65.)

(Table: clinical SRM's 900 through 936, listing for each the type (antiepilepsy drug level assay;
cholesterol; urea; uric acid; creatinine; calcium carbonate*; bilirubin; D-glucose; potassium chloride;
sodium chloride; D-mannitol; cortisol; tris(hydroxymethyl)aminomethane and its hydrochloride; lithium
carbonate; VMA (4-hydroxy-3-methoxymandelic acid); bovine serum albumin, powder and 7% solution;
lead nitrate; glass filters and a quartz cuvette for spectrophotometry; a UV absorbance standard; and
quinine sulfate dihydrate for fluorescence), together with the certified purity and the weight per unit.
Footnotes to the table indicate that certain SRM's are certified for optical properties, are individually
calibrated at 0, 25, 30, and/or 37 °C, conform to NCCLS specification ACC-1, or have a melting point
certified at 29.7723 °C.)

*SRM 915, Calcium Carbonate, was used to develop the first referee method of analysis in clinical
chemistry. This work is described in NBS Special Publication 260-36, A Referee Method for the
Determination of Calcium in Serum. (See inside of back cover for ordering instructions.)
Biological Standards

These SRM's are intended for use in the calibration of apparatus and methods used in the analysis of
biological materials for major, minor, and trace elements. (Values in parentheses are not certified, but are
given for information only.)

(Table: SRM's 1566 (oyster tissue, in preparation), 1567 (wheat flour), 1568 (rice flour), 1569 (brewers
yeast), 1570 (spinach, trace elements), 1571 (orchard leaves), 1573, 1575, and 1577, giving the weight per
unit in grams and the content, in µg/g or weight percent, of some 38 elements from aluminum through
zinc in each material.)
Environmental Standards

Analyzed Gases

These SRM's are intended for the calibration of apparatus used for the measurement of various compo-
nents in gas mixtures, and in some cases for particular atmospheric pollutants. Each SRM is accurately
certified and is primarily intended to monitor and correct for long-term drifts in instruments used. Each
cylinder (except 1609) contained 870 liters at STP prior to certification, and thus contains somewhat less than
870 L (SRM 1609 contained 68 liters). All cylinders conform to the appropriate DOT specifications.

(Table: SRM's 1609, 1658 through 1687a, and 2613 through 2626, covering oxygen in nitrogen; methane,
methane-propane, and propane in air; sulfur dioxide in nitrogen; carbon dioxide in nitrogen; and nitric
oxide in nitrogen, each with its nominal concentration in µmol/mol (ppm) or mole percent.)
Analyzed Liquids and Solids

These SRM's are intended for use in the analysis of materials for elements of interest in health or environ-
mental problems. See also: Clinical SRM's, page 39, and Industrial Hygiene SRM's, page 44.

(Single-element table: SRM's 1579 (powdered lead base paint), 1620 through 1624 (sulfur in residual and
distillate fuel oils), 1630 (trace mercury in coal), 1636 through 1638 (lead in reference fuels), and 1641a
and 1642a (mercury in water), giving the unit size and the certified concentration of lead, sulfur, or
mercury in weight percent, µg/g, or ng/mL, as appropriate.)

(Multi-element table: SRM's 1632a (trace elements in coal, bituminous), 1633a (trace elements in coal fly
ash), 1634 (trace elements in fuel oil), 1635 (trace elements in coal, subbituminous), 1643a (trace elements
in water), 1645 (river sediment), 1646 (estuarine sediment), and 1648 (urban particulate), giving the unit
size and the certified trace-element concentrations.)
Industrial Hygiene Standards

Organic Solvents on Charcoal

These SRM's consist of charcoal tubes to which have been added known quantities of the specified organic
solvent. Each SRM consists of eight tubes, two each of four solvent levels (except 2661a). SRM 2661a consists
of nine tubes, three each of three solvent levels. Each tube is color coded for both the solvent and the solvent
level.

(Table: SRM's 2661a through 2667, covering solvents that include benzene, p-dioxane, 1,2-dichloroethane,
and trichloroethylene, each with its solvent color code and the certified solvent level, in mg per tube, at
the color-coded levels I through IV.)

Freeze-Dried Urine

These SRM's consist of two bottles of freeze-dried human urine, one containing a low and one an elevated
level of the element certified.

(Table: SRM's 2671 and 2672, giving the certified low and elevated levels in mg/L when reconstituted with
50 mL of water.)

Materials on Filter Media

These SRM's consist of potentially hazardous materials deposited on filters to be used to determine the
levels of these materials in industrial atmospheres.

(Table: SRM's 2675, 2676a, and 2679, covering beryllium on filter media, metals on filter media (cadmium,
lead, manganese, and zinc), and quartz and clay, giving the quantity certified, in µg per filter, at levels
I through IV.)
Trace Element Standards

The SRM's listed below were designed for trace chemical analysis, specifically for calibrating instruments
and checking analytical techniques and procedures used to determine trace elements in various inorganic
matrices. In addition, many SRM's certified for chemical composition have one or more constituents certified at
or below the 100 µg/g level.

(Table: trace elements in glass, SRM's 607, 610-611, 612-613, 614-615, and 616-617, giving nominal
concentrations, in ppm, of 36 elements from boron through zinc; values in parentheses are not certified.)

In addition to the 36 elements listed above, the Glass SRM's contain the following 25 elements: As, Be, Bi,
Cs, Cl, F, Ge, Hf, Hg, Li, Lu, Mg, Nb, P, Pr, Se, S, Te, Tb, Tm, Sn, W, V, Y, and Zr.
CERTIFIED PHYSICAL PROPERTIES STANDARDS

Ion Activity Standards

These SRM's are intended for use in the preparation of solutions for the calibration of specific-ion
electrodes. This includes the pH and pD measuring systems.

pH Standards

These SRM's are furnished as crystals for the preparation of solutions of known hydrogen ion con-
centration for calibrating and checking the performance of commercially available pH materials and
instruments. They are furnished with certificates giving directions for preparation of the solutions and
tables of pH values at various temperatures.

SRM's 186Ic and 186IIc, 191 and 192, and 922 and 923 are certified for use in admixture only. For an
equimolar (0.025 molal) mixture of SRM's 186Ic and 186IIc, a pH(S) of 6.863 at 25 °C is obtained. Direc-
tions also are furnished for the preparation of a physiological reference solution from 186Ic and 186IIc
having a pH(S) of 7.415 at 25 °C.

(Table: SRM's 185e (potassium acid phthalate), 186Ic and 186IIc (potassium dihydrogen phosphate and
disodium hydrogen phosphate), 187b (borax), 188 (potassium hydrogen tartrate), 189 (potassium
tetroxalate), 191 and 192 (sodium bicarbonate and sodium carbonate), and 922 and 923 (tris(hydroxy-
methyl)aminomethane and its hydrochloride), giving the pH(S) at 25 °C, with certified values of 4.004,
6.863, 7.415, 9.183, 3.557, 1.679, 10.01, and 7.699, and the weight per unit in grams.)

pD Standards

These SRM's are furnished as crystals for preparation of solutions of known deuterium-ion concentra-
tion for the calibration and correction of pH indicating equipment to indicate pD data. SRM's 2186I and
2186II, and 2191 and 2192, are certified for use in admixtures only.

(Table: SRM's 2186I and 2186II (potassium dihydrogen phosphate and disodium hydrogen phosphate) and
2191 and 2192 (sodium bicarbonate and sodium carbonate), with pD(S) values of 7.41 and 10.74,
respectively, and a weight per unit of 30 grams each.)

Ion-Selective Electrodes

These SRM's are certified for the calibration of ion-selective electrodes and have conventional ionic
activities based on the Stokes-Robinson hydration theory for ionic strengths greater than 0.1 mole per liter.

(Table: SRM's 2201 (sodium chloride; certified for pNa and pCl), 2202 (potassium chloride; pK and pCl),
and 2203 (potassium fluoride; pF), with weights per unit of 125, 160, and 125 grams, respectively.)
Primary, Working, and Secondary Standard Chemicals

These SRM's are high-purity chemicals defined as primary, working, and secondary standards in
accordance with recommendations of the Analytical Chemistry Section of the International Union of Pure and
Applied Chemistry [Ref. Analyst 90, 251 (1965)]. These definitions are as follows:

Primary Standard:
a commercially available substance of purity 100 ± 0.02 percent (purity 99.98+ percent).

Working Standard:
a commercially available substance of purity 100 ± 0.05 percent (purity 99.95+ percent).

Secondary Standard:
a substance of lower purity which can be standardized against a primary grade standard.

(Table: SRM's 17a, 40h, 41b, 83c, 84h, 136c, 350, 723, 944, 949e, 950b, 951, 960, 984, 987, and 999,
comprising sucrose, dextrose (D-glucose), arsenic trioxide, acid potassium phthalate, potassium
dichromate, benzoic acid, tris(hydroxymethyl)aminomethane, plutonium metal, uranium oxide (U3O8),
boric acid, uranium metal, rubidium chloride, strontium carbonate, and potassium chloride, giving for
each the weight per unit, the certified use (polarimetric, reductometric, acidimetric, oxidimetric,
basimetric, assay, and/or isotopic value), and the purity. Footnotes: sucrose moisture <0.01 percent,
reducing substances <0.02 percent, ash 0.001 percent; dextrose moisture 0.07 percent, ash 0.002 percent.)
Hydrocarbon Blends

Four standard hydrocarbon blends are available for calibration of mass spectrometers and gas chromato-
graphic procedures used in the analysis of gasolines, naphthas, and blending stocks. The even-numbered SRM
596 is a virgin naphtha, and the odd-numbered SRM's 593, 597, and 599 are representative of typical catalyti-
cally cracked naphthas in the C7 and C8 paraffin and cycloparaffin series.

Each SRM is supplied in a unit of ten sealed ampoules. Each ampoule contains 0.03 mL of the blend. Each
ampoule is intended to provide material for only one calibration analysis so that possible fractionation of
components will be avoided.

For individual components present in the mixtures in the amount of 10 percent or less (by volume), the
limits of error in composition are not greater than ±0.01 percent, and for components present in more than
10 percent, the limits of error are not greater than ±0.10 percent.

(Table: SRM's 593, 596, 597, and 599 (blends 2, 5, 6, and 8; ten ampoules per unit), giving the nominal
volume percent of each C7 and C8 component, including n-heptane, methylhexanes, dimethylpentanes,
methylcyclohexane, ethylcyclopentane, dimethylcyclopentanes, ethylcyclohexane, dimethylcyclohexanes,
1-methyl-cis-2-ethylcyclopentane, and trimethylcyclopentanes.)
Metallo-Organic Compounds

These SRM's are intended for the preparation of solutions in oils of known and reproducible concentra-
tions of metals. Because "matrix" effects occur, it is desirable to prepare the standard solutions in oil identical or
similar to the oil being studied. Possession of an adequate collection of these metallo-organic SRM's permits the
preparation of any desired blend of known concentrations of metal in the appropriate lubricating oil. They are
used primarily for the calibration of spectrochemical equipment used in the determination of metals in
lubricating oil. This technique is used extensively in the defense program, the transportation industry, and other
industries where the consequences of failure of a moving metal part may range from inconvenient to
catastrophic.

The Certificate supplied with each SRM gives the percentage of the element of interest and directions for
preparing a solution of known concentration in lubricating oil.

SRM      Element   Constituent certified     Type (5 grams per unit)
                   (wt. percent)
1075a    Al        8.07      Aluminum 2-ethylhexanoate
1051b    Ba        28.7      Barium cyclohexanebutyrate
1063a    B         2.4       Menthyl borate
1053a    Cd        24.8      Cadmium cyclohexanebutyrate
1074a    Ca        12.5      Calcium 2-ethylhexanoate
1078b    Cr        9.6       Tris(1-phenyl-1,3-butanediono)chromium (III)
1055b    Co        14.8      Cobalt cyclohexanebutyrate
1080a    Cu        16.37     Bis(1-phenyl-1,3-butanediono)copper (II)
1079b    Fe        10.45     Tris(1-phenyl-1,3-butanediono)iron (III)
1059b    Pb        36.65     Lead cyclohexanebutyrate
1060a    Li        4.1       Lithium cyclohexanebutyrate
1061c    Mg        6.45      Magnesium cyclohexanebutyrate
1062b    Mn        13.2      Manganous cyclohexanebutyrate
1064     Hg        36.2      Mercuric cyclohexanebutyrate
1065b    Ni        13.89     Nickel cyclohexanebutyrate
1071b    P         9.48      Triphenyl phosphate
1066a    Si        14.14     Octaphenylcyclotetrasiloxane
1076     K         10.1      Potassium erucate
1077a    Ag        42.60     Silver 2-ethylhexanoate
1069b    Na        12.0      Sodium cyclohexanebutyrate
1070a    Sr        20.7      Strontium cyclohexanebutyrate
1057b    Sn        22.95     Dibutyltin bis(2-ethylhexanoate)
1052b    V         13.01     Bis(1-phenyl-1,3-butanediono)oxovanadium (IV)
1073b    Zn        16.66     Zinc cyclohexanebutyrate
Neutron Density Standard

This SRM is provided as a reference source of a cobalt-in-aluminum alloy to serve as a neutron density
monitor wire SRM. Accurate determination of thermal neutron densities is essential in irradiation tests to obtain a
basis for comparison of densities among reactors, in applying data in the design of reactors, in understanding the
mechanisms of radiation damage, and for use in neutron activation analysis. The wire is 0.5 mm in diameter and 1
meter long.

(Table: SRM 953, neutron density monitor wire (Co in Al), with a certified cobalt content of 0.116 weight
percent.)

Fission Track Glass Standards

These SRM's, at four uranium concentration levels, will aid fission track laboratories in interlaboratory
comparisons of data and in monitoring neutron flux for irradiations. The fission track glass standards are
certified for the neutron flux (n·cm⁻²·sec⁻¹) that induced uranium fission in selected wafers. The materials were
irradiated in the NBS 10 Megawatt Research Reactor at two different neutron energies. Each SRM unit contains
four unirradiated glass wafers and two irradiated wafers.

(Table: SRM's 961, 962, 963, and 964, with total uranium concentrations of 461.5, 37.38, 0.823, and
0.0721 ppm by weight, respectively, the uranium-235 atom percent of each glass, and the irradiation
times, ranging from 8 to 540 seconds, in facilities RT-3 and RT-4.)

Isotopic Reference Standards

The isotopic composition of these SRM's has been determined by mass spectrometry, by comparison with
mixtures prepared from high-purity separated isotopes. They are useful for those looking for small variations in
the isotopic composition of the elements, and for the evaluation of mass discrimination effects encountered in
the operation of mass spectrometers.

(Table: SRM's 951, 952, 975, 976, 977, 978, 979, 980, 981*, 982*, 983*, 984, 987, 989, 990, 991, and 993**,
certifying the isotopic composition of boron (including boric acid 95 percent enriched in boron-10),
chlorine, copper, bromine, silver (silver nitrate), chromium, magnesium, lead (natural, equal-atom
206/208, and radiogenic 92 percent lead-206 metals, and a lead-206 spike), rubidium, strontium
(strontium carbonate, assay and isotopic), rhenium, silicon, and uranium, with the weight per unit.)

*Sold only as a set of three: 981, 982, and 983.
**Special nuclear forms required.
Radioactivity Standards
Information concerning the SRM appears on it or its container. A Certificate containing pertinent
information on the SRM is sent under separate cover; a photocopy of the certificate is sent with the SRM.
Copies of these Certificates and information concerning the applications of these SRM's are available on
request to the NBS Office of Standard Reference Materials. These materials (except the carbon-14 con-
temporary dating standard) are shipped only by express or air freight (shipping charges collect). The prices
of SRM's may change as current stocks are depleted and are replaced. Purchasers will be billed at the prices
in effect at the time of shipment.
The stated accuracies of the older standards are, in general, an estimate of the standard deviation added
to an estimate of maximum possible systematic error. The accuracies of more recent standards are based on
the 99 percent confidence level of precision, with the same estimate of systematic error.
The International Commission on Radiation Units (ICRU) recommended definition of the activity (A)
of a quantity of a radioactive nuclide is the quotient of ΔN by Δt, where ΔN is the number of nuclear trans-
formations that occur in this quantity in time Δt: (A = ΔN/Δt). NBS uses the abbreviation ntps for nuclear
transformations per second. In this list both ntps and dps are used; the latter when dps has been used in certi-
ficates printed before 1968. The terms αps, βps, β⁺ps, K-x-rays ps, and γps are used for the emission rates of
alpha particles, beta particles, positrons, K-x-rays, and gamma-rays, respectively.
The SRM's listed below, not marked with an asterisk (*), may be ordered singly, without a license,
under the general licensing provisions of the Atomic Energy Act of 1954. Those marked by an asterisk are
available only under the special licensing provisions of the Atomic Energy Act of 1954.
NOTE: Certain radionuclides are not economical to maintain in stock because of short half lives or low
demand. When sufficient demand exists, based on letters of inquiry, these materials are prepared and those
who have expressed interest are notified of their availability. If you need any radionuclides not listed, con-
tact the Radioactivity Section, Room C114, Radiation Physics Building, National Bureau of Standards,
Washington, D.C. 20234 (Telephone: 301-921-2668).
In addition, chemically stable solutions of most radionuclides, including those no longer issued by NBS
or that are currently out of stock, may be submitted to NBS for calibration as described in "Calibration and
Related Measurement Services of the National Bureau of Standards," NBS Special Publication 250 (1978).
Requests for these tests should be submitted, with full source information for approval of suitability, to the
Radioactivity Section.
Alpha-Particle Standards
These SRM's consist of a practically weightless deposit of the nuclide on a thin platinum foil cemented
to a monel disk.
(Table: SRM's *4906 (plutonium-238, in preparation), 4904-E (americium-241), and 4907
(gadolinium-148), giving the approximate activity in ntps at the time of calibration and the uncertainty
in percent.)
Beta-Particle and Gamma-Ray Gas Standards

These SRM's contain the radionuclide in an inactive gas at a pressure of about one atmosphere in a
glass break-seal ampoule.

(Table: SRM's 4302 (argon-39) and 4308, 4935-C, and *4235 (krypton-85), giving the approximate
activity, in ntps per source or per mole, at the time of calibration and uncertainties of 1.0 to 1.5 percent.)

Alpha-Particle, Beta-Particle, Gamma-Ray, and Electron-Capture Solution Standards

These standard reference materials are contained in flame-sealed ampoules.

(Table: solution standards in the 4200, 4300, and 4900 series, covering radionuclides that include
aluminum-26, americium-243, cadmium-109, cesium-134 and -137, carbon-14 (as benzoic acid in toluene,
as n-hexadecane, and as sodium carbonate), chlorine-36, cobalt-57 and -60, europium-152, hydrogen-3
(as water and as toluene), iodine-129, iron-55, nickel-63, plutonium-239, promethium-147, selenium-75,
sodium-22, and strontium-yttrium-90, each with the approximate activity or emission rate per gram of
solution at the time of calibration, the approximate weight of solution, and the accuracy.)
Environmental Standards

(Table: SRM 4350 and RM 45b, river sediment, approximately 100 grams per unit. SRM 4350 is certified
for a number of radionuclides (including K-40, Mn-54, Co-60, Zn-65, Sr-90/Y-90, Cs-137, Eu-152, Eu-154,
Ac-228, and plutonium isotopes), with additional values given but not certified for others (including
Fe-55, Pb-214, Bi-214, Ra-226, thorium isotopes, Pa-231, U-234, and Am-241), at approximately
1 s⁻¹ g⁻¹ per radionuclide at the time of calibration (1/75). RM 45b contains the same radionuclides as
SRM 4350 at approximately equal concentrations; no values are provided.)

Low Energy Photon Sources*

(Table: SRM's 4260-B (iron-55), 4261 (cadmium-109), and 4264 (tin-121m), giving the approximate
emission rate at the time of calibration and uncertainties of 1.5 to 30 percent.)

*These SRM's consist of a thin-layer deposit of the radionuclide on a thin stainless steel or platinum foil
cemented to a monel disk.
Gamma-Ray "Point-Source" Standards

This group of Standard Reference Materials is usually prepared by depositing the radioactive material and
sealing it between two layers of polyester tape, mounted on an aluminum ring. Exceptions to this procedure are
the americium, krypton, and thorium SRM's. The americium-241 SRM's, 4211 and 4213, are prepared by
electroplating americium onto a 0.010-cm thick platinum foil, which is covered with a 0.005-cm thick aluminum
foil. The aluminum-covered source is sandwiched between two layers of 0.036-cm thick polyurethane film tape.
The krypton-85 SRM, 4212, is prepared by sealing a krypton-85 impregnated aluminum foil between two glass
disks with an epoxy adhesive. The thorium-228 SRM's, 4205 and 4206, are prepared by depositing and sealing the
radionuclide between two layers of gold foil, and this sandwich is then sealed between two double layers of
polyurethane-film tape.

(Table: fourteen point-source SRM's, numbers 4200-B through 4240, 4991-C, and 4996-B, covering
thorium-228, europium-152, americium-241, cadmium-109, krypton-85, cesium-137, niobium-94,
bismuth-207, cobalt-60, and sodium-22, giving for each the principal gamma-ray energies in MeV, the
approximate activity in ntps at the time of calibration, and the uncertainty in percent.)

Radium Gamma-Ray Solution Standards

These samples are contained in flame-sealed glass ampoules.

(Table: SRM's 4955 through 4964-B, with radium contents ranging from 0.1 to roughly 100 micrograms
and uncertainties from ±3.6 to ±0.5 percent.)

Radium Solution Standards for Radon Analysis

These samples are contained in flame-sealed glass ampoules.

(Table: SRM's 4951-C, 4950-D, and 4953-C, plus a blank solution, SRM 4952-B, giving the nominal
radium content per gram, the approximate weight of solution (about 10 to 20 grams), and the
uncertainty in percent.)
APPENDIX B
EXAMPLE OF CALCULATIONS THAT SHOULD BE PROVIDED
FOR ESTIMATES OF IMPRECISION AND BIAS
Linear Regression Analysis of Calibration Data
Five calibration standards consisting of sodium sulfate deposits on
glass fiber filters were analyzed for sulfate mass. Letting x denote the
true value and y denote the value obtained by analysis, the data (in µg
sulfate/filter) are (x,y) = (1988, 2040), (648.7, 719), (1161.5, 1170),
(221.6, 222), (1504, 1640).
For linear regression analysis of these data, the assumed model is
y = (a + bx) + e, with a linear systematic component a + bx and random error
term e having mean zero and constant variance σ². The bias parameters are a
and b; σ is called the imprecision. x̄, ȳ, S_x, and S_y denote the averages
and standard deviations of x and y, r denotes the sample correlation
coefficient, n denotes the sample size, t is the upper 100(1-α/2) percentage
point of a t distribution with (n-2) degrees of freedom, and χ²(m,c) is the
upper 100c percentage point of a chi-squared distribution with m degrees of
freedom. For the data in this example:

     n = 5               S_y = 720.97
     x̄ = 1,104.76        r² = 0.9954
     ȳ = 1,158.20        t(3, 0.975) = 3.182
     S_x = 694.75

Formulas and recommended reporting format for summarizing linear regres-
sion analyses are shown in Table B-1. Using these formulas, estimates of
bias and imprecision parameters (and their confidence intervals) for the
data set in this example are shown in Table B-2.
TABLE B-1. FORMULAS AND RECOMMENDED REPORTING FORMAT
FOR SUMMARIZATION OF LINEAR REGRESSION ANALYSES

                  Estimate of                       100(1-α)% Confidence interval
Parameter         parameter                         for parameter

Slope             b = rS_y/S_x                      b ± tσ/(S_x√(n-1))

Intercept         a = ȳ - bx̄                        a ± tσ√(1/n + x̄²/[S_x²(n-1)])

Imprecision       σ = √[S_y²(1-r²)(n-1)/(n-2)]      [√((n-2)σ²/χ²(n-2, α/2)),
                                                     √((n-2)σ²/χ²(n-2, 1-α/2))]
TABLE B-2. ESTIMATES AND CONFIDENCE INTERVALS
FOR BIAS AND IMPRECISION PARAMETERS
OF THE FIVE-POINT CALIBRATION EXAMPLE

                  Estimate of       95% Confidence interval
Parameter         parameter         for parameter

Slope             b = 1.04          (0.91, 1.16)
Intercept         a = 14.36         (-149.51, 178.23)
Imprecision       σ = 56.46         (32, 210)
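As a check on such calculations, the short sketch below applies the Table B-1
formulas directly to the five calibration points and reproduces the Table B-2
estimates and confidence intervals to within rounding of the summary
statistics. The scipy library is assumed to be available for the t and
chi-square percentage points.

    # Sketch: linear regression summary of the five-point calibration data,
    # following the formulas of Table B-1.  scipy is assumed available for
    # the t and chi-square percentage points.

    import math
    from statistics import mean, stdev
    from scipy import stats

    x = [1988, 648.7, 1161.5, 221.6, 1504]   # true values (ug sulfate/filter)
    y = [2040, 719, 1170, 222, 1640]         # analyzed values
    n = len(x)
    alpha = 0.05

    xbar, ybar = mean(x), mean(y)
    Sx, Sy = stdev(x), stdev(y)
    r = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / ((n - 1) * Sx * Sy)

    b = r * Sy / Sx                                             # slope estimate
    a = ybar - b * xbar                                         # intercept estimate
    sigma = math.sqrt(Sy**2 * (1 - r**2) * (n - 1) / (n - 2))   # imprecision estimate

    t = stats.t.ppf(1 - alpha / 2, n - 2)
    b_half = t * sigma / (Sx * math.sqrt(n - 1))
    a_half = t * sigma * math.sqrt(1 / n + xbar**2 / (Sx**2 * (n - 1)))
    s_low = math.sqrt((n - 2) * sigma**2 / stats.chi2.ppf(1 - alpha / 2, n - 2))
    s_high = math.sqrt((n - 2) * sigma**2 / stats.chi2.ppf(alpha / 2, n - 2))

    print(f"Slope       {b:.2f}   95% CI ({b - b_half:.2f}, {b + b_half:.2f})")
    print(f"Intercept   {a:.2f}  95% CI ({a - a_half:.2f}, {a + a_half:.2f})")
    print(f"Imprecision {sigma:.2f}  95% CI ({s_low:.0f}, {s_high:.0f})")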