Constitution, Bylaws
and Standards
Approved May 25, 2001
Effective July 1, 2003 unless otherwise noted
EPA/600/R-01/100

Note that the NELAC standards now have two significant dates: 1) the
date the standards were approved at the annual meeting, and 2) the date
the standards are effective and must be implemented. This is especially
important as some portions of the standards have different effective
dates. The approval date is part of the document control header on each
page. The cover of each chapter shows both the approval date and the
effective date. Changes approved for implementation at a time other
than the effective date (on the chapter cover) are noted in the chapter,
showing the approved text and its effective date.

PROGRAM POLICY
AND STRUCTURE
Approved May 25, 2001
Effective July 1, 2003 unless otherwise noted

TABLE OF CONTENTS
PROGRAM POLICY AND STRUCTURE
1.0	PROGRAM POLICY AND STRUCTURE
1.1	INTRODUCTION
1.1.1	Overview of NELAC
1.1.2	History
1.1.3	Summary of the NELAC Standards
1.1.4	General Application of NELAC Standards
1.1.5	Application of NELAC Standards to Small Laboratory Operations
1.2	OBJECTIVES
1.3	ELEMENTS
1.4	PURPOSE AND SCOPE OF NELAC
1.4.1	Purpose
1.4.2	Scope
1.4.2.1	Scope of NELAC
1.4.2.2	Applicable EPA Statutes
1.4.2.3	Exemptions
1.4.2.4	No Restriction on Legal Actions
1.5	ROLES AND RESPONSIBILITIES OF THE FEDERAL GOVERNMENT, THE STATES, AND OTHER PARTIES
1.5.1	EPA
1.5.1.1	National Environmental Laboratory Accreditation Program
1.5.2	States and Federal Agencies as Accrediting Authorities
1.5.2.1	Federal Agencies
1.5.2.2	States
1.5.2.3	Accrediting Authorities
1.5.3	Recognition
1.5.4	Joint Federal and State Roles
1.5.5	Assessor Bodies
1.5.6	Other Parties
1.6	STRUCTURE OF NELAC
1.6.1	The Board of Directors
1.6.2	The Environmental Laboratory Advisory Board
1.6.3	The Accrediting Authority Review Board
1.6.4	The Participants in NELAC
1.6.4.1	Participation of the Voting Members and Contributors
1.6.5	The Committees
1.6.5.1	The Standing Committees
1.6.5.2	The Administrative Committees
1.7	CONDUCT OF CONFERENCE BUSINESS
1.7.1	The Generation of Standards
1.7.2	Meetings
1.7.2.1	Annual Meeting
1.7.2.2	Interim Meeting
1.7.2.3	Special Meetings
1.7.2.4	Committee Meetings
1.8	ORGANIZATION OF THE ACCREDITATION REQUIREMENTS
1.8.1	Scope of Accreditation
1.8.2	Supplemental Accreditation Requirements
1.8.3	General Laboratory Requirements
1.8.4	General Field Sampling Requirements
1.8.5	Chemistry Requirements
1.8.6	Whole Effluent Toxicity Requirements
1.8.7	Microbiology Requirements
1.8.8	Radiochemistry Requirements
1.8.9	Microscopy Requirements
1.8.10	Field Measurement Requirements

APPENDIX A - GLOSSARY

LIST OF FIGURES
Figure 1-1. NELAC Structure
Figure 1-2. Flowchart for Standards Development and Implementation

1.0	PROGRAM POLICY AND STRUCTURE
Chapter One provides an overview of the history, purpose and objectives of the National
Environmental Laboratory Accreditation Conference (NELAC). The organizational structure and
function of NELAC, and the roles of the various participants, form the major portion of this chapter.
In addition, the Constitution and Bylaws, and the content of the five chapters which follow are briefly
described. Together, these six chapters and related appendices constitute the NELAC standards.
1.1	INTRODUCTION
[Effective July 1, 2001]
1.1.1 Overview of NELAC
This association shall be known as the "National Environmental Laboratory Accreditation Conference"
(NELAC) and is sponsored by the United States Environmental Protection Agency (EPA) as a
voluntary association of State and federal officials. The purpose of the organization is to foster the
generation of environmental laboratory data of known and documented quality in a cost-effective
manner through the development of nationally accepted standards for environmental laboratory
accreditation. NELAC encompasses all fields of accreditation associated with compliance with EPA
regulations. The program will be administered by State and federal accrediting authorities in a
uniform, consistent fashion nationwide.
1.1.2	History
NELAC is the result of a joint effort by EPA, other federal agencies, the States, and the private sector
that began in 1990 when EPA's Environmental Monitoring Management Council (EMMC) established
an internal work group to consider the feasibility and advisability of a national environmental
laboratory accreditation program. The work group concluded that EPA should consult with
representatives of all stakeholders by establishing a federal advisory committee. As a result, the
Committee on National Accreditation of Environmental Laboratories (CNAEL) was chartered in 1991
under the Federal Advisory Committee Act. In its final report to EMMC, CNAEL recommended that
a national program for environmental laboratory accreditation be established. In response to the
CNAEL recommendations, EPA and State representatives formed the State/EPA Focus Group that
developed a proposed framework for NELAC, modeled after the National Conference on Weights and
Measures. The Focus Group prepared a draft Constitution, Bylaws and standards, which were
published in the Federal Register in December 1994. NELAC was established on February 16, 1995
by State and federal officials with the adoption of an interim Constitution and Bylaws.
NELAC was established as a standards-setting body to support a National Environmental Laboratory
Accreditation Program (NELAP). The goal of NELAP is to foster cooperation among the current
accreditation activities of different States or other governmental agencies. NELAP seeks to unify the
existing State and federal agency standards, at minimum cost to the States, federal agencies and
accredited laboratories.
1.1.3	Summary of the NELAC Standards
The NELAC uniform standards are contained in this chapter and the following five chapters and
related appendices.

Chapter 2 contains the criteria for the proficiency testing (PT) program. Laboratory participation in
PT programs fulfills one part of the quality assessment requirements of NELAC. The PT programs
in which a laboratory must participate to become accredited are defined as well as the criteria for
samples, PT providers, and acceptance limits.
Chapter 3 describes the essential elements that are to be included in an on-site assessment and the
requirements for an accrediting authority conducting on-site assessments. The qualifications and
requirements for assessors are described as well as the program elements to ensure uniform and
consistent implementation of the NELAC standards.
Chapter 4 describes the accreditation process the laboratory must follow to be recognized as a
NELAC laboratory. The chapter defines the period of accreditation, and the process for maintaining,
awarding and revoking accreditation.
Chapter 5 and the related appendices contain the elements of the laboratory quality system. The
section provides detail concerning quality assurance/quality control requirements so that all
accrediting authorities will evaluate laboratories consistently and uniformly.
Chapter 6 defines the process and operating requirements established by NELAC for an accrediting
authority to become nationally recognized. It provides the policies and criteria that an accrediting
authority must meet to apply for and maintain recognition.
The Glossary, provided as Appendix A to Chapter 1, contains the definitions of terms that are used
throughout the standards to assure the consistency of their use and interpretation.
1.1.4	General Application of NELAC Standards
These standards are for use by accrediting authorities and others concerned with the competence
of environmental laboratories and other organizations directly involved and interested in the
standardization of environmental measurements. Note that any reference to NELAP approval or
NELAC accreditation means that the accrediting authority or laboratory meets the requirements in the
NELAC standards, and is not an endorsement by EPA.
As described in more detail in Chapter 4, an accredited organization may use the NELAC logo on
general literature. It is the ethical responsibility of an accredited organization to describe its
accredited status in a manner that does not imply accreditation in areas that are outside its actual
Scope of Accreditation. When soliciting business or reporting test results, an accredited organization
must distinguish between those tests that fall within its scope of accreditation and those that do not.
1.1.5	Application of NELAC Standards to Small Laboratory Operations
All laboratory operations subject to NELAC standards are expected to generate data of known and
documented quality and maintain the quality systems required to generate quality data. However,
NELAP recognizes that some laboratory operations have unique characteristics that
differentiate them from other operations. The NELAC standards have addressed these issues by
allowing some flexibility in meeting the requirements for personnel (Section 5.4.2, Section 5.6) and
their credentials (Section 4.1.1).
1.2 OBJECTIVES
The objectives of NELAC, as specified in Article II of the Constitution, are: to provide a national forum
for the discussion of all questions related to standards for environmental laboratory accreditation; to
provide a mechanism to establish policy and coordinate activities within NELAC; to develop a
consensus on uniform standards for laboratory accreditation, and encourage and promote uniform
standards of quality for assessment and accreditation; and to foster cooperation among environmental
laboratory accrediting authorities and regulatory officials.
1.3 ELEMENTS
Functional elements of the objectives are:
a)	To develop and improve the standards for qualifying as an accredited laboratory, for qualifying
as an accrediting authority, and for uniformly implementing the national accreditation program.
The standards address the accreditation process; on-site laboratory assessments to review the
quality systems; assessor training; proficiency testing; and oversight of accrediting authorities for
uniform interpretation of the standards.
b)	To designate the States, Territories and Possessions of the United States (hereinafter referred
to as States) and federal agencies as the accrediting authorities. These authorities may be the
assessor bodies, or may use third parties as assessor bodies to carry out in part or in whole the
assessment functions. As accrediting authorities, the States and the federal agencies shall grant
accreditation and ensure compliance with NELAC laboratory standards and criteria.
[Effective July 1, 2001]
c) To provide for recognition among the States and the federal agencies by assuring the consistent
application of the national standards. Oversight by NELAP assures uniformity among the various
accrediting authorities. The Accrediting Authority Review Board (AARB) provides a balanced
review of the program.
d) To develop model language for legislation and regulations which can be adopted by the State
legislatures and accrediting authorities.
[Effective July 1, 2001]
e) To incorporate, to the extent applicable, ISO 17025, ISO 43, and ISO 58.
1.4 PURPOSE AND SCOPE OF NELAC
1.4.1 Purpose
NELAC shall be a standards-setting body. NELAC shall, through the process described in the
Constitution and Bylaws, develop, adopt and publish uniform consensus performance standards on
which the national accreditation program shall be based. These standards will be adopted by NELAC
at its annual meeting. These uniform standards shall include, but are not limited to, quality systems,
proficiency testing, audit programs, and other key elements as established by the Standing
Committees of NELAC. It is not the purpose of NELAC to function as an assessor body, oversee or
approve assessor bodies, or administer any of the main elements of the accreditation program, other
than the development and adoption of standards.

1.4.2 Scope
[Effective July 1, 2001]
1.4.2.1	Scope of NELAC
The scope of NELAC shall encompass the necessary environmental sampling and testing to serve
the needs of the States, United States Environmental Protection Agency (EPA), and other federal
agencies involved in the generation and use of environmental data, where such generation or use is
mandated by EPA statutes and pursuant regulations. Organizations are encouraged to use the
NELAC standards for all other environmental sampling and testing.
1.4.2.2	Applicable EPA Statutes
Applicable EPA statutes include the Clean Air Act (CAA); the Comprehensive Environmental
Response Compensation and Liability Act (CERCLA); the Federal Insecticide, Fungicide and
Rodenticide Act (FIFRA); the Federal Water Pollution Control Act (Clean Water Act; CWA); the
Resource Conservation and Recovery Act (RCRA); the Safe Drinking Water Act (SDWA); and the
Toxic Substances Control Act (TSCA). The standards shall also include provisions to permit special
requirements or fields of accreditation promulgated by any of the accrediting authorities.
1.4.2.3	Exemptions
The NELAC standards apply to federal and state mandated testing. Exceptions to EPA-mandated
testing include those provided below:
a)	laboratory analyses associated with FIFRA (40 CFR Part 160) good laboratory practices (GLP),
for testing performed for studies that support applications for research or marketing permits for
pesticide products regulated by EPA under FIFRA.
b)	laboratory analyses associated with TSCA (40 CFR Part 792) good laboratory practices (GLP),
for studies relating to health effects, environmental effects and chemical fate testing as directed
under Section 4 and Section 5 of TSCA.
c)	State governmental laboratories when conducting analyses such as pesticide formulation,
efficacy and residue testing to support FIFRA compliance and enforcement activities under
pesticide cooperative agreement grants.
d)	governmental laboratories engaged solely in the analysis of forensic evidence.
1.4.2.4 No Restriction on Legal Actions
The standards shall not be implemented or administered in a way which limits the ability of local, State
or federal agencies to investigate and prosecute enforcement cases. Specifically, when engaged in
the collection and analysis of forensic evidence to support litigation, those agencies may use any
procedure that is appropriate given the nature of the investigation, subject only to the bounds of sound
scientific practice.

1.5 ROLES AND RESPONSIBILITIES OF THE FEDERAL GOVERNMENT, THE STATES, AND
OTHER PARTIES
1.5.1	EPA
EPA shall provide staff support to NELAC as provided for in the Bylaws and agreed to by EPA. EPA
shall assist NELAC by publishing all proposed and final standards.
EPA also participates in joint activities with other federal and State agencies, as described below.
1.5.1.1 National Environmental Laboratory Accreditation Program
EPA shall establish and administer the National Environmental Laboratory Accreditation Program
(NELAP), and shall staff an office to oversee the implementation of NELAC standards. The purpose
of this oversight is to ensure a high degree of standardization and coordination among the different
accrediting authorities.
NELAP performs the following functions in support of NELAC:
a)	evaluating and approving the implementation of NELAC standards by accrediting authorities;
b)	establishing and maintaining a national database on environmental laboratories which contains
information on the status of accrediting authorities, current status of NELAC accredited
laboratories, and status of providers of proficiency test samples;
c)	where conflict of interest may occur in an accrediting authority, accrediting that authority's
principal laboratory if requested. See Chapter 6, section 6.2.2 d) and e);
d)	accrediting EPA laboratories;
e)	reporting to NELAC on the evaluation of the conformance of State and federal accreditation
program activities to NELAC standards;
f)	reporting to NELAC on results of evaluations of proficiency testing sample providers and assessor
training programs; and
g)	approving supplemental accreditation requirements proposed by accrediting authorities (see
Section 1.8.2).
1.5.2	States and Federal Agencies as Accrediting Authorities
In order to be considered a NELAP approved accrediting authority, the individual State or federal
program must adopt the NELAC standards, utilize assessors trained according to the requirements
of NELAC, and be evaluated by the EPA oversight office as being an agency whose accreditation and
assessment program meets all of the requirements of NELAC. Failure in any one of these areas would
preclude a State or federal program from being recognized by NELAP.
1.5.2.1 Federal Agencies
To operate as accrediting authorities, or to obtain NELAC accreditation for their environmental
monitoring laboratories, federal agencies shall conform to the NELAC standards.

1.5.2.2	States
The authority of the States to adopt the NELAC standards is manifest in the authority granted to their
administrative agencies by State legislatures. State governments shall be the principal accrediting
authorities.
1.5.2.3	Accrediting Authorities
An accrediting authority can be either a) any federal department/agency with responsibility for
operating mandated environmental monitoring programs which require laboratory testing, or b) any
State which requires laboratory testing in conformance with at least one of the EPA programs listed
within the scope of NELAC (see Section 1.4.2). If a State chooses not to adopt the NELAC
standards, laboratories in that State may obtain accreditation from any other accrediting authority.
[Effective July 1, 2001]
A primary accrediting authority is one which ensures directly that the laboratory is in conformance with
the NELAC standards. A secondary accrediting authority is one which, through recognition, accepts
the accreditation of a primary accrediting authority.
1.5.2.3.1 Responsibilities of Primary Accrediting Authorities
Once a State or federal department/agency has been approved by NELAP as being an entity whose
accreditation and assessment program meets all of the requirements of NELAC, it will be a primary
accrediting authority, and it will have full responsibility for:
a)	using the NELAC standards as the basis for assessing the qualifications of laboratories applying
for initial or continuing NELAC accreditation;
b)	ensuring conformance by the laboratories it accredits with the national standards established by
NELAC;
c)	granting interim and/or full accreditation to applicant laboratory organizations through the review
and approval of applications, performance of on-site assessments, evaluation of results on
proficiency testing samples, and enforcement of all applicable laws and rules relating to
accreditation; and
d)	submitting the names and appropriate accreditation material to EPA for inclusion in the national
laboratory database.
[Effective July 1, 2001]
Federal laboratories within a State may be accredited by the State accrediting authority or by a federal
accrediting authority. A State accrediting authority is the primary accrediting authority for all non-
federal NELAP accredited laboratories in that State. However, if the State accrediting authority does
not grant NELAP accreditation for testing in conformance with a particular field of accreditation (see
section 1.8), laboratories may obtain primary accreditation for that particular field of accreditation from
any other accrediting authority.

In addition, a primary accrediting authority may delegate assessment activities to a third party
(assessor body). If any of these assessment activities are delegated to a third party, the accrediting
authority maintains responsibility for ensuring compliance with the standards established by NELAC.
1.5.2.3.2 Responsibilities of Secondary Accrediting Authorities
A secondary accrediting authority must be approved by NELAP as being an entity whose
accreditation and assessment program meets all of the requirements of NELAC for a secondary
accrediting authority.
[Effective July 1, 2001]
A secondary accrediting authority may require laboratories to submit an application, may issue
certificates of accreditation, and will exercise its legal authority for enforcement of all applicable laws
and rules. However, it must accept, through recognition, the laboratory accreditations granted by a
primary accrediting authority, and must not replicate any of that authority's assessment functions.
1.5.2.3.3 Accreditation Fees
Accrediting authorities may adopt and impose laboratory accreditation fees.
[Effective July 1, 2001]
1.5.3 Recognition
Recognition means that an accrediting authority will accept the accreditation status of a laboratory
issued by another NELAP accrediting authority. This principle of recognition is an element of the
national accreditation standard to which all accrediting authorities are held. In accepting the
accreditation status of a laboratory through recognition, the accrediting authority assumes the
responsibilities of a secondary accrediting authority as stated in Section 1.5.2.3.2. A State acting as
a secondary accrediting authority is not required to accept the accreditation of a particular laboratory
if a law, or a decision resulting from a legal action, has the legal effect of precluding that State from
granting any accreditation to that laboratory.
Recognition among the environmental laboratory accreditation authorities is necessary to the success
of a national program. The essential ingredient of recognition is uniformity from one accrediting
authority to another. The mechanisms to assure this uniformity (e.g., uniform national performance
standards, thorough and consistent inspections, and comparable decisions on accreditation status
when deficiencies are uncovered) are necessary to ensure that recognition is equitable.
Federal accrediting authorities shall serve as the accrediting authority only for governmental
laboratories. Non-governmental laboratories shall not claim either primary or secondary accreditation
by a federal agency, even if the laboratory is performing analyses under contract to that agency.

1.5.4	Joint Federal and State Roles
NELAC shall be the joint responsibility of EPA, the States, and the other federal agencies. As
provided in the following section on the structure of NELAC and in the NELAC Bylaws, EPA, the
States, and the other federal agencies share responsibilities of governance, analysis and
establishment of policy and NELAC technical standards.
1.5.5	Assessor Bodies
An assessor body, operating under written agreement with an accrediting authority, may perform
specified functions of the assessment process. These functions may include: the review of the
laboratories' documentation regarding facilities, personnel, use of approved methods, and quality
assurance procedures; and conduct of on-site assessments, including review of performance in the
analysis of proficiency test samples. The assessor body reports to the accrediting authority under
which it is operating. The assessor body will provide full documentation to the accrediting authority.
Only the accrediting authority may determine if a laboratory has met the NELAC standards, may issue
certificates of accreditation, may make any decisions on the granting and withdrawal of a laboratory's
accreditation status, and may take responsibility for the accreditation process.
1.5.6	Other Parties
All other interested parties including, but not limited to, the laboratory industry, clients of the laboratory
industry, environmental or other public interest groups, private industry, third party assessors, and the
general public, may participate in NELAC. In this role, these other parties may bring technical and
policy issues to the attention of NELAC, its Board of Directors, or its committees and subcommittees.
It is anticipated that these issues shall be brought to NELAC in the form of reports, presentations,
discussion material, or other forms of documentation for presentation at the NELAC annual, interim,
or committee/subcommittee meetings.
1.6 STRUCTURE OF NELAC
The structure of NELAC is shown in Figure 1-1. NELAC is composed of a Board of Directors, a
House of Representatives, a House of Delegates, Contributors, and a number of committees. There
are nine elected officials of NELAC: the Chair; the Chair-Elect; the immediate Past Chair; and six
members at large. The Standing Committees and Administrative Committees are appointed by the
Chair. The activities of the Standing and Administrative Committees are overseen by the Board of
Directors.
NELAC will meet twice a year: an annual meeting at which final action is taken on all issues, and an
interim meeting, about six months prior to the annual meeting, at which committees meet to receive,
consider and deliberate on issues and to propose and draft standards or policies for adoption at the
annual meeting.
NELAC shall also consider advice and comment provided by the Environmental Laboratory Advisory
Board (ELAB) chartered under the Federal Advisory Committee Act and the Accrediting Authority
Review Board (AARB).
1.6.1 The Board of Directors
The Board of Directors consists of the NELAC Chair, the Chair-Elect, immediate Past Chair, six
members elected at large from the active membership (to serve 3-year staggered terms), a NELAC
Director, and an Executive Secretary. The NELAP Director is the ex officio Director of NELAC. The
Executive Secretary is an EPA employee.

The Board of Directors serves as a policy and coordinating body in matters of national and
international significance and makes interim policy decisions when necessary between annual
meetings. Such policies shall have effective and expiration dates and/or shall be referred to the
appropriate committee for potential incorporation into the standards by a NELAC vote. The Board of
Directors has the overall responsibility and authority for the supervisory, administrative and procedural
duties associated with NELAC. The Board of Directors will charge the committees with issues they
must address or take under consideration. Comments on the standards should be directed to the
committees through their respective chairs.
1.6.2	The Environmental Laboratory Advisory Board
The Environmental Laboratory Advisory Board (ELAB), chartered under the Federal Advisory
Committee Act, consists of members appointed by EPA, is composed of a balance of non-State and
non-federal representatives from the environmental laboratory community, and is chaired by an ELAB
member. The ELAB advises EPA and NELAC on matters affecting the interests of the regulated
laboratories and other interested parties. The recommendations of the ELAB shall be presented to
the Chairs of the standing committees, the Board of Directors and to the EPA.
1.6.3	The Accrediting Authority Review Board
The Accrediting Authority Review Board (AARB) shall be an independent body composed of five
voting members and one non-voting member. Each member shall be appointed for a five-year term.
a)	The non-voting member shall be a representative of the USEPA and appointed by the NELAP
Director. The appointment should be rotated among the EPA Regions and EPA Headquarters.
b)	The five voting members shall consist of one federal accrediting authority official and four state
accrediting authority officials, of which at least three must be from NELAP-recognized state
accrediting authorities.
1)	The state accrediting authority officials should be from different EPA Regions.
2)	The appointments must be made in such a manner that the correct mix of membership is
maintained at all times. Any AARB member appointed prior to July 1, 1999 will remain an
AARB member even though the correct mix of membership may not be attained until July 1,
2004.
c)	Appointments to the AARB are made by the NELAP Director after consultation with the NELAC
Board of Directors. The Director will solicit nominees from the NELAC stakeholders and present
them to the Board of Directors. Nominations are to be submitted to the NELAP Director at least three
months prior to the NELAC annual meeting.
d)	Voting members of the AARB shall not be NELAP staff, members of the NELAC Board of Directors,
or members of a NELAC standing committee. The AARB annually selects one of its members to
serve as its chair.
e)	The AARB has responsibilities to:
1) monitor NELAP to assure that EPA is following the NELAC standards for recognizing
accrediting authorities;
2)	serve as a review board for accrediting authorities that have been denied NELAP recognition
or have had such recognition revoked (see Chapter 6, section 10), and provide advice to
the NELAP Director, who will make the final decision;
3)	report on its activities to the NELAC Board of Directors at each annual meeting;
4)	conduct an annual assessment of the NELAP process for recognizing accrediting authorities
in accordance with the NELAC standards.
1.	The AARB shall report its findings at the general opening session of each NELAC annual
meeting; and
2.	The report of the annual assessment shall be provided for posting on the NELAC web
site; and
5)	provide advice on issues referred by the NELAP Director, which may include matters raised
by entities other than the accrediting authorities.
1.6.4 The Participants in NELAC
The participants consist of two groups, i.e., Voting Members and Contributors.
Membership is limited to officials who are in the employ of the Government of the United States and
the States, and who are actively engaged in environmental programs or accreditation of
environmental laboratories. State and federal participants being compensated by the private sector
to inspect environmental laboratories or as consultants are considered to have a conflict of interest
and are ineligible for Voting Membership but may participate as Contributors. The Voting Member
may vote and is eligible to serve on all committees and the Board of Directors. At the annual meeting
the Voting Members are divided into a House of Representatives and a House of Delegates.
The House of Representatives is composed of one officially designated representative from each
State, one representative from each of eight EPA Assistant/Associate Administrators, and one
representative from each EPA Region. Each other cabinet level federal department or independent
agency (as defined in the Constitution) with environmental laboratory accreditation, certification or
evaluation activities may appoint one official to the House of Representatives.
The House of Delegates is composed of all other State and federal environmental officials. The size
of the House of Delegates is not limited.
Contributors are all other interested parties and groups. They include, but are not limited to,
laboratory personnel, industry representatives, environmental groups, the general public, laboratory
associations, industry associations, accreditation associations and retired Voting Members. The
Contributors may not vote, but can make presentations, comments or input at all stages of the
standards and procedures making process, and do have the ability to enter the substantive debate
on the floor of the meeting as it occurs. Contributors are eligible to serve as non-voting participants
on all committees.
1.6.4.1 Participation of the Voting Members and Contributors
Contributors, as well as Voting Members, have the right to appear before the standing committees
as they consider proposed standards and procedures related to the national accreditation program
and to debate the substantive issues before NELAC as such discussion occurs during the meeting.
Appearance before the committees will be in accordance with procedures approved by the Board of
Directors and Voting Membership.
1.6.5 The Committees
Two types of committee are associated with NELAC: Standing Committees and Administrative
Committees. Each committee has five Voting Members including the chair and five Contributors who
may not vote. Except for the Nominating Committee, the Voting Members of each committee annually
select a chair from among themselves. All committees report to NELAC through the Board of
Directors. Following each annual meeting, the Board of Directors will make available an updated
roster of the Board of Directors, NELAC officers and committee participants and chairs.
New Standing Committees:
The Board of Directors shall establish a new standing committee if the following conditions exist:
a)	an ad hoc group appointed by a NELAC Chair has been studying an issue which is likely to require
continuing attention by NELAC;
b)	the ad hoc group has reached a consensus and is ready to develop standards;
c)	once the standards are implemented, they are likely to need evaluation and revision in the future;
d)	no NELAC committee exists to deal with the issue;
e)	the topic is of broad scope and has impact on a significant portion of the laboratory community;
f)	the Program Policy and Structure Committee has reviewed the proposal and has recommended that
the new standing committee be created; and
g)	the NELAC Voting Members have approved the creation of the committee.
1.6.5.1 The Standing Committees
The participants of each committee serve for five years, with one Voting Member and one Contributor
being appointed each year. There are eight Standing Committees:
Program Policy and Structure Committee
Accrediting Authority Committee
Quality Systems Committee
Proficiency Testing Committee
On-site Assessment Committee
Accreditation Process Committee
Regulatory Coordination Committee
Field Activities Committee
The Standing Committees shall receive input regarding standards and test procedures, then process
this input into resolutions which shall be put before the Voting Membership at the annual meeting.
These resolutions will be made available not less than 45 calendar days prior to the annual meeting.
All resolutions shall be presented to the Voting Membership at the annual meeting for discussion and
ballot. The committees may also receive input via comments and presentations at the interim and
annual meetings. The committees shall draft resolutions which shall be made available not later than
30 calendar days prior to either the interim or annual meetings. The committees shall prepare and
arrange agenda items for interim meetings and annual meetings to be made available 30 calendar
days prior to the meeting.

[Effective July 1, 2001]
1.6.5.1.1	Program Policy and Structure Committee
This committee generates the Constitution and Bylaws of NELAC, and interprets the intent and
meaning of the Constitution and Bylaws, presents amendments, proposes changes in organizational
structure, and defines roles and responsibilities as appropriate, for approval of the Voting
Membership. This committee develops modifications to the scope, structure, and requirements of the
tiers and fields of accreditation.
1.6.5.1.2	Accrediting Authority Committee
This committee develops the standards for use by EPA to oversee compliance by State and federal
accrediting authorities with NELAC standards. This committee considers matters concerning
implementation of recognition among accrediting authorities.
1.6.5.1.3	Quality Systems Committee
This committee develops and keeps current uniform standards for quality systems in testing
operations. The elements of the quality system include organizational structure, responsibilities,
procedures, processes and resources (e.g., facilities, staff, equipment) for implementing quality
management in testing operations.
1.6.5.1.4	Proficiency Testing Committee
This committee develops standards for the proficiency testing samples, develops criteria for selection
of the providers of the samples, and develops and updates protocols for the use of proficiency test
samples and data in the accreditation of laboratories.
1.6.5.1.5	On-Site Assessment Committee
This committee generates procedures for the on-site assessments, and publishes standard check-lists
based on these procedures. This committee also establishes the frequency of inspection, and the
minimum education, experience, and training requirements of the assessors.
1.6.5.1.6	Accreditation Process Committee
This committee generates and develops procedures for the administrative aspects of the accreditation
process of environmental laboratories, for use by the accrediting authorities, including the
requirements for accreditation, procedures for changes in accreditation status, roles and
responsibilities of laboratories, and appeal processes.
1.6.5.1.7	Regulatory Coordination Committee
This committee provides the Standing Committees with current information on regulations and laws
that impact laboratory testing and accreditation. The Regulatory Coordination Committee is also
responsible for the development of model language for state legislation and regulations that reflect
the findings and actions of NELAC.

1.6.5.1.8 Field Activities Committee
This committee develops and maintains uniform standards for field measurement and sampling, and
coordinates the development of these standards with other standing committees.
1.6.5.2 The Administrative Committees
Administrative Committees have varying terms. The duties are outlined below. The term of service
shall be three years; two Voting Members and two Contributors will be appointed in each of two years,
and one Voting Member and one Contributor in the third year, except for the Nominating Committee
(see below).
1.6.5.2.1	Nominating Committee
The chair is the NELAC Past Chair. Four Voting Members and five Contributors shall be appointed
annually to serve one year. This committee presents nominees for all elective offices at the annual
meeting. The names of these nominees shall appear in the report of the Nominating Committee and
be published in the meeting announcement.
1.6.5.2.2	Membership and Outreach Committee
This committee initiates membership invitations, publicizes NELAC to prospective participants,
coordinates and resolves participants' concerns, establishes credentialing criteria and resolves
credentialing conflicts of Voting Members.
This committee solicits and develops informational materials to promote understanding and
appreciation of the importance of the NELAC objectives.
This committee promotes a spirit of cooperation and timely dialogue between NELAC and other
organizations and federal agencies.
1.7 CONDUCT OF CONFERENCE BUSINESS
1.7.1	The Generation of Standards
The process for the generation and adoption of standards by a State accrediting authority is shown
in Figure 1-2. The standards for the accreditation of laboratories begin with recommendations made
within or to the committees. Committees shall propose standards in the form of resolutions on which
the Voting Membership shall vote. Standards proposed by the committees are publicized on the
NELAC electronic bulletin board by EPA not later than 45 calendar days prior to the date of the
meeting at which they will be considered.
Proposed amendments from the floor to specific standards and proposals offered by the committee
for adoption by NELAC shall be allowed in the manner described in the Constitution and Bylaws.
Amendments to the report describing committee activities over the year will not be allowed without
the concurrence of the chair of the subject committee and the concurrence of the Chair of NELAC.
1.7.2	Meetings
1.7.2.1 Annual Meeting
An annual meeting of NELAC shall be held to conduct business including, but not limited to, election
of officers, consideration of issues for presentation to the membership for voting, receiving reports
from committees, task groups, or other sources, and conducting other business of NELAC. All final
action on resolutions or proposals shall take place at the annual meeting.
The Board of Directors shall determine the place and dates for the annual meeting, and shall publish
this information on the NELAC electronic bulletin board at least 90 calendar days prior to the annual
meeting.
A completed registration for the annual meeting shall serve as the application for participation as
Voting Member or Contributor. The registration form must be completed by all potential participants,
whether or not attending the annual meeting. Prior to the annual meeting, the Executive Secretary
shall certify the names of the Voting Members and their alternates of the House of Representatives
to the Board of Directors. The Nominating Committee shall present, to the Board of Directors,
nominees for all elective offices for the annual meeting. The names and qualifications of the
nominees shall be published in the annual meeting announcement.
The following deadlines will apply in preparing and submitting material for the annual meeting:
a)	Sixty calendar days prior to the date of the annual meeting, each of the standing committees shall
present to the Board of Directors a summary of the issues and matters considered by the
committees over the course of the year. This report shall discuss all matters which the committee
considered since its last report, including how the committee disposed of the issues it considered.
The report shall also contain draft standards for consideration by NELAC.
b)	Committees shall prepare and arrange agenda items and resolutions for the annual meeting.
These, and other resolutions received by the Board of Directors will be made available not less
than 45 calendar days prior to the meeting.
c)	Standards proposed by the committees for consideration at the annual meeting shall be
publicized on the electronic bulletin board not less than 45 calendar days prior to the annual
meeting.
As soon as possible, but no later than 90 calendar days after the annual meeting, the Board of
Directors shall make available an updated roster of the Board of Directors, NELAC officers, committee
members and chairs, and minutes and findings of the meeting to the participants. EPA shall publish
the revised standards as soon as possible, but no later than 90 calendar days after the annual
meeting. Changes in organization and/or procedures of NELAC proposed at the annual meeting shall
not be acted upon until the annual meeting following the one at which they were proposed.
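
For illustration only, the short sketch below (not part of the NELAC standards) computes the calendar-day
deadlines described above relative to an assumed annual meeting date; the function name and the example
date are hypothetical choices, with this revision's approval date used as the example meeting date.

from datetime import date, timedelta

# Illustrative sketch only: the calendar-day deadlines of Section 1.7.2.1,
# computed relative to an annual meeting date.
def annual_meeting_deadlines(annual_meeting: date) -> dict:
    return {
        "committee summaries due to the Board of Directors": annual_meeting - timedelta(days=60),
        "agenda items and resolutions made available": annual_meeting - timedelta(days=45),
        "proposed standards posted on the electronic bulletin board": annual_meeting - timedelta(days=45),
        "updated roster, minutes, and revised standards published": annual_meeting + timedelta(days=90),
    }

# Example, using the approval date of this revision as the meeting date.
for item, due in annual_meeting_deadlines(date(2001, 5, 25)).items():
    print(f"{due.isoformat()}  {item}")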
1.7.2.2 Interim Meeting
The interim meeting, at which time committees meet to receive, consider and debate issues, and
propose and draft standards or policies for the annual meeting, shall be scheduled at least six months
prior to the annual meeting.
The Board of Directors shall determine the place and dates for the interim meeting, and shall publish
this information on the NELAC electronic bulletin board at least 90 calendar days prior to the interim
meeting.
Committees shall prepare and arrange agenda items for the interim meeting. The agenda shall be
approved by the Board of Directors and will be made available not less than 30 calendar days prior
to the date of the meeting.

Conclusions and findings of the interim meeting shall be provided to the participants not later than 90
calendar days following the interim meeting.
1.7.2.3	Special Meetings
The NELAC Chair is authorized to call a meeting of the Board of Directors at any time deemed
necessary by the Chair to be in the best interests of NELAC. Announcements of the meetings and
meeting summaries or reports shall be made available to the participants.
1.7.2.4	Committee Meetings
Committees of NELAC are authorized to hold meetings at times other than the annual or interim
meeting. Announcements of the meetings and meeting summaries or reports shall be made available
to the participants.
1.8 ORGANIZATION OF THE ACCREDITATION REQUIREMENTS
1.8.1 Scope of Accreditation
Prior to NELAP initial accreditation and to maintain continuing accreditation, laboratories must meet
all relevant EPA regulatory requirements, including quality assurance/quality control requirements.
Laboratories must also meet the general requirements found in Chapter 5 and the specific quality
control requirements for the type of testing being performed, as found in Appendix D of Chapter 5.
For laboratory testing, accreditation may be granted in conformance with a Field of Accreditation
tiered approach as follows:
Matrix — Technology/Method — Analyte/Analyte Group.
When adopted by the Conference, for Field Sampling, accreditation will be granted in conformance
with a Field of Accreditation tiered approach as follows:
Matrix — Field Sampling Method — Analyte/Analyte Group.
Technology/method is a specific arrangement of analytical instruments and detection systems, and/or
preparation techniques combined with a test method as defined in the glossary. Examples of
technologies are GC/ECD, ICP/MS, etc. Technology groupings will be published on the NELAC
Website. The tables will be amended from time to time as deemed appropriate by the Program Policy
and Structure Committee.
Matrix is a description of sample type. Matrices include 1) Drinking Water, 2) Non-Potable Water (to
include all aqueous samples that are not public drinking water, e.g. RCRA water samples, treatment
plant additives, etc.), 3) Solid and Chemical Materials (to include soils, sediments, other solids and
non-aqueous liquids), 4) Biological Tissues (not as yet defined in the scope of NELAC) and 5) Air and
Emissions (to include ambient air and stack emissions). Other more specific matrices are used
elsewhere in the standards.
Analyte/Analyte Group indicates that a laboratory may be accredited by individual analyte or for a
group of analytes. If accredited by analyte group, the laboratory must perform a Demonstration of
Capability (DOC) for each analyte, and the laboratory must perform all required QC and satisfactorily
meet the PT requirements as defined in Chapter 2. It is possible that PT samples may not be
available for all analytes. Accrediting authorities may grant accreditation by analyte group. All
accrediting authorities accrediting by analyte group must use the same analyte groups, which will be
determined by the Program Policy and Structure Committee and published on the NELAC web site.
Typical examples of Fields of Accreditation using the tiered approach, including PBMS examples, are:
Drinking Water — HPLC-UV/EPA 555 — Pentachlorophenol
Non-Potable Water — GC-MS/EPA 625 — PAHs
Solid and Chemical Materials — ICP-AES/EPA 6010 — Arsenic
Drinking Water — GC-ECD/EPA 505 — Atrazine
Drinking Water — CVAA (with EPA 1631 extraction)/PBMS — Mercury
Non-Potable Water — Headspace GC-MS/PBMS — Tetraethyl Lead
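
To make the tiered structure concrete, the sketch below (illustrative only, not part of the NELAC
standards) represents a Field of Accreditation as a simple three-part record; the class and attribute
names are hypothetical, and the example entries are transcribed from the list above.

from dataclasses import dataclass

# Hypothetical representation of the tiered Field of Accreditation:
# Matrix -- Technology/Method -- Analyte/Analyte Group.
@dataclass(frozen=True)
class FieldOfAccreditation:
    matrix: str             # e.g., "Drinking Water", "Non-Potable Water"
    technology_method: str  # technology combined with the test method (or PBMS)
    analyte: str            # individual analyte or analyte group

# Example entries transcribed from the list above.
EXAMPLES = [
    FieldOfAccreditation("Drinking Water", "HPLC-UV/EPA 555", "Pentachlorophenol"),
    FieldOfAccreditation("Non-Potable Water", "GC-MS/EPA 625", "PAHs"),
    FieldOfAccreditation("Solid and Chemical Materials", "ICP-AES/EPA 6010", "Arsenic"),
    FieldOfAccreditation("Drinking Water", "GC-ECD/EPA 505", "Atrazine"),
]

for foa in EXAMPLES:
    print(f"{foa.matrix} -- {foa.technology_method} -- {foa.analyte}")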
The following example shows the tiered approach applied to a laboratory seeking accreditation in
hazardous waste organic testing under the auspices of RCRA. The laboratory must meet all the
requirements listed in general laboratory (NELAC Chapter 5), chemistry (NELAC Chapter 5, Appendix
D.1), the RCRA regulations (40 CFR 261), and the method(s) used (e.g., SW-846 5030/8260). In all
cases, a NELAC accredited laboratory must be accredited for the specific method it uses. In some
cases the regulations mandate the method to be used (e.g., 40 CFR 261 specifies SW-846 Method
1311, TCLP). In other cases the regulations provide guidance for the methods which can be used
(e.g., 40 CFR 264, Appendix IX, suggests applicable methods). Finally, in some situations the
regulations provide no guidance as to the methods to be used (e.g., 40 CFR 268 lists analytes required
to be measured, with no guidance on methods). In those cases where the test method is not
mandated by regulation, the laboratory must be accredited for the specific method used, as
documented in the laboratory's SOP (see Chapter 5). This method must meet the relevant start-up,
calibration, and on-going validation and QC requirements specified in Chapter 5. The tiered approach
allows for the incorporation of performance based measurement systems (PBMS) by substituting
PBMS for the specified analytical methods when allowed under EPA regulations.
Additional accrediting authorities may recognize a laboratory's primary accreditation for certain tiers
without additional review and on-site assessment.
For example, under a tiered approach:
1.	A laboratory's home state (State A) only provides accreditation for Drinking Water. As
primary accrediting authority, State A accredits the laboratory for the Field of Accreditation
Drinking Water — GC-ECD/EPA 505 — Atrazine.
2.	The laboratory then applies to a second state (State B) to be its primary accrediting authority
for the Field of Accreditation
Non-Potable Water — GC-ECD/EPA 612 — 1,2-dichlorobenzene.
3.	State B recognizes the technology GC-ECD, since that technology was accredited by State
A: i.e., State A has examined the instrumentation, checked run logs, interviewed the
analyst(s) operating that instrument, etc.
4. To accredit the laboratory for the requested Field of Accreditation, State B may only require
the SOP (for Method 612), the DOC, other QC data and satisfactory PT results (where PTs
are available, see Chapter 2) for the analyte 1,2-dichlorobenzene. State B may obtain these
documents from the laboratory and PT providers as appropriate, review them and approve
them without the need for an on-site assessment. If there is any concern about the laboratory's
performance, the NELAC standards allow any accrediting authority to conduct announced
or unannounced on-site assessments at any time.
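
The sketch below restates the decision logic of the example above in code; it is illustrative only, the
record and function names are hypothetical, and, as noted in item 4, any accrediting authority may still
conduct announced or unannounced on-site assessments at any time.

from dataclasses import dataclass

@dataclass(frozen=True)
class FieldOfAccreditation:
    matrix: str
    technology_method: str  # e.g., "GC-ECD/EPA 505"
    analyte: str

def technology(foa: FieldOfAccreditation) -> str:
    """Return the technology portion of the tier, e.g. 'GC-ECD' from 'GC-ECD/EPA 505'."""
    return foa.technology_method.split("/")[0]

def review_needed(already_accredited, requested: FieldOfAccreditation) -> str:
    # Hypothetical restatement of the example: if the requested technology has
    # already been assessed by another primary accrediting authority, a document
    # review (SOP, DOC, other QC data, PT results) may suffice; otherwise an
    # on-site assessment would be expected.
    if any(technology(f) == technology(requested) for f in already_accredited):
        return "document review: SOP, DOC, QC data, PT results"
    return "on-site assessment"

# State A (primary accrediting authority) has accredited the GC-ECD technology.
state_a = [FieldOfAccreditation("Drinking Water", "GC-ECD/EPA 505", "Atrazine")]
# The laboratory asks State B to act as primary for a new field using GC-ECD.
requested = FieldOfAccreditation("Non-Potable Water", "GC-ECD/EPA 612", "1,2-dichlorobenzene")
print(review_needed(state_a, requested))  # -> document review: SOP, DOC, QC data, PT results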
The procedures and conditions for interim accreditation are described in Chapter 4.
[Effective July 1, 2001]
1.8.2	Supplemental Accreditation Requirements
In addition, a category of supplemental accreditation requirements is designated for additional
methods or analytes required by an accrediting authority. Supplemental accreditation requirements
shall be reserved for methods or analytes that are not required under any of the EPA programs that
are part of NELAC, and shall not be used to modify any NELAC standards for analytes or methods.
Any supplemental accreditation requirements essential to meet the specific needs of an accrediting
authority would be added at the method-specific or analyte level, and must be approved by NELAP
and made available to all NELAC participants. Exceptions to this requirement may be necessary (e.g.,
national security concerns) and will be processed as waivers by the NELAP Director.
1.8.3	General Laboratory Requirements
The general requirements are applicable to all laboratory applicants regardless of their size, volume
of business, or field of accreditation. The organizational structure, or procedures used by applicant
laboratory organizations to meet these general requirements may differ as a function of size or scope
of testing of an organization. Under the tiered approach the general requirements shall include the
elements outlined in Chapter 5.
The following applicable requirements are presented in Chapter 5 (Quality Systems): Organization
and Management (5.4); Quality System - Establishment, Audits, Essential Quality Controls and Data
verification (5.5); Personnel (5.6); Physical Facilities - Accommodation and Environment (5.7);
Equipment and Reference Materials (5.8); Measurement Traceability and Calibration (5.9); Test
Methods and Standard Operating Procedures (5.10); Sample Handling, Sample Acceptance Policy
and Sample Receipt (5.11); Records (5.12); Laboratory Report Format and Contents (5.13);
Subcontracting Analytical Samples (5.14); Outside Support Services and Supplies (5.15); and
Complaints (5.16).
1.8.4 General Field Sampling Requirements
(To be developed)

[Effective July 1, 2001]
1.8.5 Chemistry Requirements
The following applicable requirements are presented in Section D.1 of Appendix D of Chapter 5
(Quality Systems): Positive and Negative Controls (D.1.1); Analytical Variability/Reproducibility
(D.1.2); Method Evaluation (D.1.3); Detection Limits (D.1.4); Data Reduction (D.1.5); Quality of
Standards and Reagents (D.1.6); Selectivity (D.1.7); and Constant and Consistent Test Conditions
(D.1.8).
1.8.6	Whole Effluent Toxicity Requirements
The following applicable requirements are presented in Section D.2 of Appendix D of Chapter 5
(Quality Systems): Positive and Negative Controls (D.2.1); Variability and/or Reproducibility (D.2.2);
Accuracy (D.2.3); Test Sensitivity (D.2.4); Selection of Appropriate Statistical Analysis Methods
(D.2.5); Selection and Use of Reagents and Standards (D.2.6); Selectivity (D.2.7); and Constant and
Consistent Test Conditions (D.2.8).
1.8.7	Microbiology Requirements
The following applicable requirements are presented in Section D.3 of Appendix D of Chapter 5
(Quality Systems): Positive and Negative Controls (D.3.1); Test Variability/Reproducibility (D.3.2);
Method Evaluation (D.3.3); Test Performance (D.3.4); Data Reduction (D.3.5); Quality of Standards,
Reagents and Media (D.3.6); Selectivity (D.3.7); and Constant and Consistent Test Conditions
(D.3.8).
[Effective July 1, 2001]
1.8.8 Radiochemistry Requirements
The following applicable requirements are presented in Section D.4 of Appendix D of Chapter 5
(Quality Systems): Negative and Positive Controls (D.4.1); Analytical Variability/Reproducibility
(D.4.2); Method Evaluation (D.4.3); Radiation Measurement System Calibration (D.4.4); Detection
Limits (D.4.5); Data Reduction (D.4.6); Quality of Standards and Reagents (D.4.7); and Constant and
Consistent Test Conditions (D.4.8).
1.8.9 Microscopy Requirements
(To be developed)

[Effective July 1, 2001]
1.8.10 Field Measurement Requirements
(To be developed)

[Figure: organizational chart of the National Environmental Laboratory Accreditation Conference. The
Board of Directors, advised by the Accrediting Authority Review Board and ELAB, oversees the Standing
Committees (Accreditation Process, Accrediting Authority, Field Activities, On-site Assessment,
Proficiency Testing, Program Policy and Structure, Quality Systems, Regulatory Coordination) and the
Administrative Committees (Membership and Outreach, Nominating). The Voting Members comprise the House
of Representatives (one representative from each State, from each EPA Assistant/Associate Administrator
and EPA Region, and from each federal agency) and the House of Delegates (State officials and federal
agency officials). Contributors include the general public, laboratories, regulated industry,
environmental groups, laboratory/industry associations, assessor bodies, and retired Voting Members.]
Figure 1-1. NELAC Structure

[Figure: flowchart of the standards development and implementation process. A committee proposes
standards or changes to standards; the proposed standards are published by EPA; input is received and
draft standards are prepared at the interim meeting; committees present the draft standards as
resolutions at the annual meeting. If the House of Representatives and House of Delegates approve the
standards, the approved standards are published by EPA. If a State and/or federal agency then adopts
the standards and participates in NELAP for the relevant field of testing, laboratories in that State
seek accreditation from the primary accrediting authority in that State; otherwise, laboratories in
that State seek accreditation from any primary accrediting authority.]
Figure 1-2. Flowchart for Standards Development and Implementation

-------
PROGRAM POLICY AND STRUCTURE
APPENDIX A
GLOSSARY

-------
NELAC
Program Policy and Structure
Revision 14
May 25, 2001
Page 1A-1 of 12
APPENDIX A - GLOSSARY
Acceptance Criteria: specified limits placed on characteristics of an item, process, or service
defined in requirement documents. (ASQC)
Accreditation: the process by which an agency or organization evaluates and recognizes a
laboratory as meeting certain predetermined qualifications or standards, thereby accrediting the
laboratory. In the context of the National Environmental Laboratory Accreditation Program (NELAP),
this process is a voluntary one. (NELAC)
Accrediting Authority: the Territorial, State, or federal agency having responsibility and
accountability for environmental laboratory accreditation and which grants accreditation
(NELAC) [1.5.2.3]
Accrediting Authority Review Board (AARB): five voting members from Federal and State
Accrediting Authorities and one non-voting member from USEPA, appointed by the NELAP Director,
in consultation with the NELAC Board of Directors, for the purposes stated in 1.6.3.e. (NELAC) [1.6.3]
Accuracy: the degree of agreement between an observed value and an accepted reference value.
Accuracy includes a combination of random error (precision) and systematic error (bias) components
which are due to sampling and analytical operations; a data quality indicator. (QAMS)
Assessor Body: the organization that actually executes the accreditation process, i.e., receives and
reviews accreditation applications, reviews QA documents, reviews proficiency testing results,
performs on-site assessments, etc., whether EPA, the State, or contracted private party. (NELAC)
Analyst: the designated individual who performs the "hands-on" analytical methods and associated
techniques and who is the one responsible for applying required laboratory practices and other
pertinent quality controls to meet the required level of quality. (NELAC)
Applicant Laboratory or Applicant: the laboratory or organization applying for NELAP
accreditation. (NELAC)
Assessment: the evaluation process used to measure or establish the performance, effectiveness,
and conformance of an organization and/or its systems to defined criteria (to the standards and
requirements of NELAC). (NELAC)
Assessment Criteria: the measures established by NELAC and applied in establishing the extent
to which an applicant is in conformance with NELAC requirements. (NELAC)
Assessment Team: the group of people authorized to perform the on-site inspection and proficiency
testing data evaluation required to establish whether an applicant meets the criteria for NELAP
accreditation. (NELAC)
Assessor: one who performs on-site assessments of accrediting authorities and laboratories'
capability and capacity for meeting NELAC requirements by examining the records and other physical
evidence for each one of the tests for which accreditation has been requested. (NELAC)
Audit: a systematic evaluation to determine the conformance to quantitative and qualitative
specifications of some operational function or activity. (EPA-QAD)

-------
NELAC
Program Policy and Structure
Revision 14
May 25, 2001
Page 1A-2 of 12
Batch: environmental samples that are prepared and/or analyzed together with the same process
and personnel, using the same lot(s) of reagents. A preparation batch is composed of one to 20
environmental samples of the same NELAC-defined matrix, meeting the above mentioned criteria and
with a maximum time of 24 hours between the start of processing of the first and last sample in the
batch. An analytical batch is composed of prepared environmental samples (extracts, digestates
or concentrates) which are analyzed together as a group. An analytical batch can include prepared
samples originating from various environmental matrices and can exceed 20 samples. (NELAC
Quality Systems Committee)
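For illustration only (this sketch is not part of the standard), the following Python fragment groups hypothetical samples of a single quality system matrix into preparation batches, applying the 20-sample limit and the 24-hour processing window described above. The sample records and field names are invented.

    # Group samples (same matrix, same process, same reagent lots assumed)
    # into preparation batches of no more than 20 samples whose processing
    # starts within a 24-hour window. All data are hypothetical.
    from datetime import datetime, timedelta

    samples = [
        {"id": f"S{i:02d}", "start": datetime(2001, 5, 25, 8) + timedelta(hours=i)}
        for i in range(30)
    ]  # hypothetical aqueous samples, sorted by processing start time

    batches, current = [], []
    for s in samples:
        if current and (len(current) == 20 or
                        s["start"] - current[0]["start"] > timedelta(hours=24)):
            batches.append(current)
            current = []
        current.append(s)
    if current:
        batches.append(current)
    print([len(b) for b in batches])  # e.g., [20, 10]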
Blank: a sample that has not been exposed to the analyzed sample stream in order to monitor
contamination during sampling, transport, storage or analysis. The blank is subjected to the usual
analytical and measurement process to establish a zero baseline or background value and is
sometimes used to adjust or correct routine analytical results. Blanks include:
Equipment Blank: a sample of analyte-free media which has been used to rinse common
sampling equipment to check effectiveness of decontamination procedures. (NELAC)
Field Blank: blank prepared in the field by filling a clean container with pure de-ionized water and
appropriate preservative, if any, for the specific sampling activity being undertaken. (EPA
OSWER)
Instrument Blank: a clean sample (e.g., distilled water) processed through the instrumental steps
of the measurement process; used to determine instrument contamination. (EPA-QAD)
Method Blank: a sample of a matrix similar to the batch of associated samples (when available)
that is free from the analytes of interest and is processed simultaneously with and under the same
conditions as samples through all steps of the analytical procedures, and in which no target
analytes or interferences are present at concentrations that impact the analytical results for
sample analyses. (NELAC)
Reagent Blank: (method reagent blank): a sample consisting of reagent(s), without the target
analyte or sample matrix, introduced into the analytical procedure at the appropriate point and
carried through all subsequent steps to determine the contribution of the reagents and of the
involved analytical steps. (QAMS)
Blind Sample: a sub-sample for analysis with a composition known to the submitter. The
analyst/laboratory may know the identity of the sample but not its composition. It is used to test the
analyst's or laboratory's proficiency in the execution of the measurement process. (NELAC)
Calibration: to determine, by measurement or comparison with a standard, the correct value of each
scale reading on a meter, instrument, or other device. The levels of the applied calibration standard
should bracket the range of planned or expected sample measurements. (NELAC)
Calibration Curve: the graphical relationship between the known values, such as concentrations,
of a series of calibration standards and their instrument response. (NELAC)
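As an illustrative sketch only (the standard does not prescribe a fitting procedure), a linear calibration curve can be represented as a least-squares line through the calibration standards, which is then used to convert an instrument response to a concentration. The concentrations and responses below are hypothetical.

    # Fit a straight calibration line to hypothetical standards and use it
    # to quantitate an unknown from its instrument response.
    import numpy as np

    conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])           # standard concentrations
    response = np.array([0.02, 0.98, 2.05, 4.95, 10.10])  # instrument responses
    slope, intercept = np.polyfit(conc, response, 1)       # least-squares fit
    unknown_conc = (3.40 - intercept) / slope              # response of 3.40
    print(f"slope={slope:.3f}  intercept={intercept:.3f}  result={unknown_conc:.2f}")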
Calibration Method: a defined technical procedure for performing a calibration. (NELAC)
Calibration Standard: a substance or reference material used to calibrate an instrument. (QAMS)
Certified Reference Material (CRM): a reference material one or more of whose property values
are certified by a technically valid procedure, accompanied by or traceable to a certificate or other
documentation which is issued by a certifying body. (ISO Guide 30 - 2.2)

-------
NELAC
Program Policy and Structure
Revision 14
May 25, 2001
Page 1A-3 of 12
Chain of Custody Form: record that documents the possession of the samples from the time of
collection to receipt in the laboratory. This record generally includes: the number and types of
containers; the mode of collection; collector; time of collection; preservation; and requested analyses.
(NELAC)
Clean Air Act: the enabling legislation in 42 U.S.C. 7401 et seq., Public Law 91-604, 84 Stat. 1676,
Pub. L. 95-95, 91 Stat. 685, and Pub. L. 95-190, 91 Stat. 1399, as amended, empowering EPA to
promulgate air quality standards and to monitor and enforce them. (NELAC)
Comprehensive Environmental Response, Compensation and Liability Act
(CERCLA/Superfund): the enabling legislation in 42 U.S.C. 9601-9675 et seq., as amended by the
Superfund Amendments and Reauthorization Act of 1986 (SARA), 42 U.S.C. 9601 et seq., to eliminate
the health and environmental threats posed by hazardous waste sites. (NELAC)
Confidential Business Information (CBI): information that an organization designates as having
the potential of providing a competitor with inappropriate insight into its management, operation or
products. NELAC and its representatives agree to safeguard identified CBI and to maintain all
information identified as such in full confidentiality.
Confirmation: verification of the identity of a component through the use of an approach with a
different scientific principle from the original method. These may include, but are not limited to:
Second column confirmation
Alternate wavelength
Derivatization
Mass spectral interpretation
Alternative detectors or
Additional cleanup procedures.
(NELAC)
Conformance: an affirmative indication or judgement that a product or service has met the
requirements of the relevant specifications, contract, or regulation; also the state of meeting the
requirements. (ANSI/ASQC E4-1994)
Contributor: a participant in NELAC who is not a Voting Member. Contributors include
representatives of laboratories, manufacturers, industry, business, consumers, academia, laboratory
associations, laboratory accreditation associations, counties, municipalities, and other political
subdivisions, other federal and state officials not engaged in environmental activities, and other
persons who are interested in the objectives and activities of NELAC. (NELAC)[Art III, Const]
Corrective Action: the action taken to eliminate the causes of an existing nonconformity, defect or
other undesirable situation in order to prevent recurrence. (ISO 8402)
Data Audit: a qualitative and quantitative evaluation of the documentation and procedures
associated with environmental measurements to verify that the resulting data are of acceptable quality
(i.e., that they meet specified acceptance criteria). (NELAC)
Data Reduction: the process of transforming raw data by arithmetic or statistical calculations,
standard curves, concentration factors, etc., and collation into a more useable form. (EPA-QAD)
Deficiency: an unauthorized deviation from acceptable procedures or practices, or a defect in an
item. (ASQC)

-------
NELAC
Program Policy and Structure
Revision 14
May 25, 2001
Page 1A-4 of 12
Delegate: any environmental official of the States or the Federal government not sitting in the House
of Representatives, who is eligible to vote in the House of Delegates. (NELAC)
Demonstration of Capability: a procedure to establish the ability of the analyst to generate
acceptable accuracy. (NELAC)
Denial: to refuse to accredit in total or in part a laboratory applying for initial accreditation or
resubmission of initial application. (NELAC)[4.4.1]
Detection Limit: the lowest concentration or amount of the target analyte that can be identified,
measured, and reported with confidence that the analyte concentration is not a false positive value.
See Method Detection Limit. (NELAC)
Document Control: the act of ensuring that documents (and revisions thereto) are proposed,
reviewed for accuracy, approved for release by authorized personnel, distributed properly and
controlled to ensure use of the correct version at the location where the prescribed activity is
performed. (ASQC)
Environmental Laboratory Advisory Board (ELAB): a Federal Advisory Committee, with members
appointed by EPA and composed of a balance of non-state, non-federal representatives, from the
environmental laboratory community, and chaired by an ELAB member. (NELAC)[1.6.2]
Environmental Monitoring Management Council (EMMC): an EPA Committee consisting of EPA
managers and scientists, organized into a Policy Council, a Steering Group, ad hoc Panels, and work
groups addressing specific objectives, established to address EPA-wide monitoring issues. (NELAC)
Federal Insecticide, Fungicide and Rodenticide Act (FIFRA): the enabling legislation under
7 U.S.C. 135 et seq., as amended, that empowers the EPA to register insecticides, fungicides, and
rodenticides. (NELAC)
Federal Water Pollution Control Act (Clean Water Act, CWA): the enabling legislation under 33
U.S.C. 1251 et seq., Public Law 92-500, 86 Stat. 816, that empowers EPA to set discharge limitations,
write discharge permits, monitor, and bring enforcement action for non-compliance. (NELAC)
[Effective July 1, 2001]
Field of Accreditation: (previously Field of Testing) NELAC's approach to accrediting laboratories
by matrix, technology/method and analyte/analyte group. Laboratories requesting accreditation for
a matrix-technology/method-analyte/analyte group combination or for an updated/improved method
are required to submit only that portion of the accreditation process not previously addressed (see
NELAC, section 1.8). (NELAC)
Field of Proficiency Testing: NELAC's approach to offering proficiency testing by matrix,
technology, and analyte/analyte group.
Finding: an assessment conclusion that identifies a condition having a significant effect on an item
or activity. An assessment finding is normally a deficiency and is normally accompanied by specific
examples of the observed condition. (NELAC)
Governmental Laboratory: as used in these standards, a laboratory owned by a Federal, state, or
tribal government; includes government-owned contractor-operated laboratories. (NELAC).

-------
NELAC
Program Policy and Structure
Revision 14
May 25, 2001
Page 1A-5 of 12
Holding Times (Maximum Allowable Holding Times): the maximum times that samples may be
held prior to analysis and still be considered valid or not compromised. (40 CFR Part 136)
Inspection: an activity such as measuring, examining, testing, or gauging one or more
characteristics of an entity and comparing the results with specified requirements in order to establish
whether conformance is achieved for each characteristic. (ANSI/ASQC E4-1994)
Interim Accreditation: temporary accreditation status for a laboratory that has met all accreditation
criteria except for a pending on-site assessment which has been delayed for reasons beyond the
control of the laboratory. (NELAC)
Internal Standard: a known amount of standard added to a test portion of a sample as a reference
for evaluating and controlling the precision and bias of the applied analytical method. (NELAC)
Laboratory: a body that calibrates and/or tests. (ISO 25)
Laboratory Control Sample (however named, such as laboratory fortified blank, spiked blank,
or QC check sample): a sample matrix, free from the analytes of interest, spiked with verified known
amounts of analytes or a material containing known and verified amounts of analytes. It is generally
used to establish intra-laboratory or analyst specific precision and bias or to assess the performance
of all or a portion of the measurement system. (NELAC)
Laboratory Duplicate: aliquots of a sample taken from the same container under laboratory
conditions and processed and analyzed independently. (NELAC)
Legal Chain of Custody Protocols: procedures employed to record the possession of samples from
the time of sampling until analysis, performed at the special request of the client. These
protocols include the use of a Chain of Custody Form that documents the collection, transport, and
receipt of compliance samples by the laboratory. In addition, these protocols document all
handling of the samples within the laboratory. (NELAC)
Manager (however named): the individual designated as being responsible for the overall operation,
all personnel, and the physical plant of the environmental laboratory. A supervisor may report to the
manager. In some cases, the supervisor and the manager may be the same individual. (NELAC)
Matrix: the substrate of a test sample.
Field of Accreditation Matrix: these matrix definitions shall be used when accrediting a laboratory
(see Field of Accreditation).
Drinking Water: any aqueous sample that has been designated a potable or potential potable
water source.
Non-Potable Water: any aqueous sample excluded from the definition of Drinking Water
matrix. Includes surface water, groundwater, effluents, water treatment chemicals, and TCLP
or other extracts.
Solid and Chemical Materials: includes soils, sediments, sludges, products and by-products
of an industrial process that results in a matrix not previously defined.
Biological Tissue: any sample of a biological origin such as fish tissue, shellfish, or plant
material. Such samples shall be grouped according to origin.

-------
NELAC
Program Policy and Structure
Revision 14
May 25, 2001
Page 1A-6 of 12
Air and Emissions: whole gas or vapor samples including those contained in flexible or rigid
wall containers and the extracted concentrated analytes of interest from a gas or vapor that
are collected with a sorbent tube, impinger solution, filter, or other device. (NELAC)
Quality System Matrix: These matrix definitions are an expansion of the field of accreditation
matrices and shall be used for purposes of batch and quality control requirements (see Appendix
D of Chapter 5). These matrix distinctions shall be used:
Aqueous: any aqueous sample excluded from the definition of Drinking Water matrix or
Saline/Estuarine source. Includes surface water, groundwater, effluents, and TCLP or other
extracts.
Drinking Water: any aqueous sample that has been designated a potable or potential potable
water source.
Saline/Estuarine: any aqueous sample from an ocean or estuary, or other salt water source
such as the Great Salt Lake.
Non-aqueous Liquid: any organic liquid with <15% settleable solids.
Biological Tissue: any sample of a biological origin such as fish tissue, shellfish, or plant
material. Such samples shall be grouped according to origin.
Solids: includes soils, sediments, sludges and other matrices with >15% settleable solids.
Chemical Waste: a product or by-product of an industrial process that results in a matrix not
previously defined.
Air and Emissions: whole gas or vapor samples including those contained in flexible or rigid
wall containers and the extracted concentrated analytes of interest from a gas or vapor that
are collected with a sorbent tube, impinger solution, filter, or other device. (NELAC)
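Purely as an illustration (not part of the standard), the decision logic behind several of the quality system matrix definitions above can be sketched as a simple classification function; the field names, and the restriction to only a few of the matrices, are assumptions of this example.

    # Assign a quality system matrix from a few hypothetical descriptive
    # fields, using the >15% settleable solids distinction given above.
    def quality_system_matrix(aqueous, potable_source, saline, settleable_solids_pct):
        if aqueous:
            if potable_source:
                return "Drinking Water"
            return "Saline/Estuarine" if saline else "Aqueous"
        if settleable_solids_pct > 15:
            return "Solids"
        return "Non-aqueous Liquid"

    print(quality_system_matrix(aqueous=True, potable_source=False,
                                saline=False, settleable_solids_pct=0.0))  # Aqueous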
Matrix Spike (spiked sample or fortified sample): a sample prepared by adding a known mass
of target analyte to a specified amount of matrix sample for which an independent estimate of target
analyte concentration is available. Matrix spikes are used, for example, to determine the effect of the
matrix on a method's recovery efficiency. (QAMS)
Matrix Spike Duplicate (spiked sample or fortified sample duplicate): a second replicate matrix
spike prepared in the laboratory and analyzed to obtain a measure of the precision of the recovery
for each analyte. (QAMS)
May: denotes permitted action, but not required action. (NELAC)
Method: see Test Method
Method Detection Limit: the minimum concentration of a substance (an analyte) that can be
measured and reported with 99% confidence that the analyte concentration is greater than zero and
is determined from analysis of a sample in a given matrix containing the analyte. (40 CFR Part 136,
Appendix B)
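As a hedged illustration of the cited procedure (40 CFR Part 136, Appendix B estimates the MDL from replicate low-level spiked analyses as the one-sided 99% Student's t value times their standard deviation), the sketch below uses hypothetical replicate results and the SciPy t distribution.

    # Estimate an MDL from hypothetical replicate spiked-sample results.
    from statistics import stdev
    from scipy.stats import t

    replicates = [1.9, 2.1, 2.3, 1.8, 2.2, 2.0, 2.4]  # hypothetical results, ug/L
    s = stdev(replicates)                      # standard deviation of the replicates
    t99 = t.ppf(0.99, df=len(replicates) - 1)  # one-sided 99% Student's t value
    print(f"MDL = {t99 * s:.3f} ug/L")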
Must: denotes a requirement that must be met. (Random House College Dictionary)

-------
NELAC
Program Policy and Structure
Revision 14
May 25, 2001
Page 1A-7 of 12
National Accreditation Database: the publicly accessible database listing the accreditation status
of all laboratories participating in NELAP. (NELAC)
National Institute of Standards and Technology (NIST): an agency of the US Department of
Commerce's Technology Administration that is working with EPA, States, NELAC, and other public
and commercial entities to establish a system under which private sector companies and interested
States can be accredited by NIST to provide NIST-traceable proficiency testing (PT) to those
laboratories testing drinking water and wastewater. (NIST)
National Environmental Laboratory Accreditation Conference (NELAC): a voluntary organization
of State and Federal environmental officials and interest groups whose primary purpose is to establish
mutually acceptable standards for accrediting environmental laboratories. A subset of NELAP.
(NELAC)
National Environmental Laboratory Accreditation Program (NELAP): the overall National
Environmental Laboratory Accreditation Program of which NELAC is a part. (NELAC)
National Voluntary Laboratory Accreditation Program (NVLAP): a program administered by NIST
that is used by providers of proficiency testing to gain accreditation for all compounds/matrices for
which NVLAP accreditation is available, and for which the provider intends to provide NELAP PT
samples. (NELAC)
Negative Control: measures taken to ensure that a test, its components, or the environment do not
cause undesired effects, or produce incorrect test results. (NELAC)
NELAC Standards: the plan of procedures for consistently evaluating and documenting the ability
of laboratories performing environmental measurements to meet nationally defined standards
established by the National Environmental Laboratory Accreditation Conference. (NELAC)
NELAP Recognition: the determination by the NELAP Director that an accrediting authority meets
the requirements of the NELAP and is authorized to grant NELAP accreditation to laboratories.
(NELAC)
Non-governmental Laboratory: any laboratory not meeting the definition of the governmental
laboratory. (NELAC)
Performance Audit: the routine comparison of independently obtained qualitative and quantitative
measurement system data with routinely obtained data in order to evaluate the proficiency of an
analyst or laboratory. (NELAC)
Performance Based Measurement System (PBMS): a set of processes wherein the data quality
needs, mandates or limitations of a program or project are specified and serve as criteria for selecting
measurement processes which will meet those needs in a cost-effective manner. (NELAC)
Positive Control: measures taken to ensure that a test and/or its components are working properly
and producing correct or expected results from positive test subjects. (NELAC)
Precision: the degree to which a set of observations or measurements of the same property,
obtained under similar conditions, conform to themselves; a data quality indicator. Precision is usually
expressed as standard deviation, variance or range, in either absolute or relative terms. (NELAC)
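For illustration only, the common expressions of precision named above can be computed from a set of hypothetical replicate measurements as follows.

    # Absolute and relative precision of hypothetical replicate results.
    from statistics import mean, stdev

    replicates = [10.2, 9.8, 10.1, 10.4, 9.9]
    s = stdev(replicates)                       # standard deviation
    print(s, s ** 2,                            # variance
          max(replicates) - min(replicates),    # range
          100 * s / mean(replicates))           # relative standard deviation, %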
Preservation: refrigeration and/or reagents added at the time of sample collection (or later) to
maintain the chemical and/or biological integrity of the sample. (NELAC)

-------
NELAC
Program Policy and Structure
Revision 14
May 25, 2001
Page 1A-8 of 12
Primary Accrediting Authority: the agency or department designated at the Territory, State or
Federal level as the recognized authority with responsibility and accountability for granting NELAC
accreditation for a specified field of testing. (NELAC)[1.5.2.3]
Proficiency Testing: a means of evaluating a laboratory's performance under controlled conditions
relative to a given set of criteria through analysis of unknown samples provided by an external source.
(NELAC)[2.1]
Proficiency Testing Oversight Body/Proficiency Testing Provider Accreditor (PTOB/PTPA):
an organization with technical expertise, administrative capacity and financial resources sufficient to
implement and operate a national program of PT provider evaluation and oversight that meets the
responsibilities and requirements established by NELAC standards. (NELAC)
Proficiency Testing Program: the aggregate of providing rigorously controlled and standardized
environmental samples to a laboratory for analysis, reporting of results, statistical evaluation of the
results and the collective demographics and results summary of all participating laboratories.
(NELAC)
Proficiency Testing Study Provider: any person, private party, or government entity that meets
stringent criteria to produce and distribute NELAC PT samples, evaluate study results against
published performance criteria and report the results to the laboratories, primary accrediting
authorities, PTOB/PTPA, and NELAP. (NELAC)
Proficiency Test Sample (PT): a sample, the composition of which is unknown to the analyst and
is provided to test whether the analyst/laboratory can produce analytical results within specified
acceptance criteria. (QAMS)
Protocol: a detailed written procedure for field and/or laboratory operation (e.g., sampling, analysis)
which must be strictly followed. (EPA-QAD)
Quality Assurance: an integrated system of activities involving planning, quality control, quality
assessment, reporting and quality improvement to ensure that a product or service meets defined
standards of quality with a stated level of confidence. (QAMS)
Quality Assurance [Project] Plan (QAPP): a formal document describing the detailed quality
control procedures by which the quality requirements defined for the data and decisions pertaining
to a specific project are to be achieved. (EPA-QAD)
Quality Control: the overall system of technical activities whose purpose is to measure and control
the quality of a product or service so that it meets the needs of users. (QAMS)
Quality Control Sample: an uncontaminated sample matrix spiked with known amounts of analytes
from a source independent from the calibration standards. It is generally used to establish
intra-laboratory or analyst specific precision and bias or to assess the performance of all or a portion
of the measurement system. (EPA-QAD)
Quality Manual: a document stating the management policies, objectives, principles, organizational
structure and authority, responsibilities, accountability, and implementation of an agency,
organization, or laboratory, to ensure the quality of its product and the utility of its product to its users.
(NELAC)
Quality System: a structured and documented management system describing the policies,
objectives, principles, organizational authority, responsibilities, accountability, and implementation

-------
NELAC
Program Policy and Structure
Revision 14
May 25, 2001
Page 1A-9 of 12
plan of an organization for ensuring quality in its work processes, products (items), and services. The
quality system provides the framework for planning, implementing, and assessing work performed by
the organization and for carrying out required QA and QC. (ANSI/ASQC E-41994)
Quantitation Limits: levels, concentrations, or quantities of a target variable (e.g., target analyte)
that can be reported at a specified degree of confidence. (NELAC)
Range: the difference between the minimum and the maximum of a set of values. (EPA-QAD)
Raw Data: any original factual information from a measurement activity or study recorded in a
laboratory notebook, worksheets, records, memoranda, notes, or exact copies thereof that are
necessary for the reconstruction and evaluation of the report of the activity or study. Raw data may
include photography, microfilm or microfiche copies, computer printouts, magnetic media, including
dictated observations, and recorded data from automated instruments. If exact copies of raw data
have been prepared (e.g., tapes which have been transcribed verbatim, dated and verified accurate
by signature), the exact copy or exact transcript may be submitted. (EPA-QAD)
[Effective July 1, 2001]
Recognition: previously known as reciprocity. The mutual agreement of two or more parties (i.e.,
States) to accept each other's findings regarding the ability of environmental testing laboratories in
meeting NELAC standards. (NELAC)[1.5.3]
Reference Material: a material or substance one or more properties of which are sufficiently well
established to be used for the calibration of an apparatus, the assessment of a measurement method,
or for assigning values to materials. (ISO Guide 30-2.1)
Reference Method: a method of known and documented accuracy and precision issued by an
organization recognized as competent to do so. (NELAC)
Reference Standard: a standard, generally of the highest metrological quality available at a given
location, from which measurements made at that location are derived. (VIM-6.08)
Reference Toxicant: the toxicant used in performing toxicity tests to indicate the sensitivity of a test
organism and to demonstrate the laboratory's ability to perform the test correctly and obtain consistent
results (see Chapter 5, Appendix D, section 2.1.f). (NELAC)
Replicate Analyses: the measurements of the variable of interest performed identically on two or
more sub-samples of the same sample within a short time interval. (NELAC)
Requirement: denotes a mandatory specification; often designated by the term "shall". (NELAC)
Resource Conservation and Recovery Act (RCRA): the enabling legislation under 42 USC 6901
et seq. (1976), that gives EPA the authority to control hazardous waste from the "cradle-to-grave",
including its generation, transportation, treatment, storage, and disposal. (NELAC)
Revocation: the total or partial withdrawal of a laboratory's accreditation by the accrediting authority.
(NELAC)[4.4.3]

-------
NELAC
Program Policy and Structure
Revision 14
May 25, 2001
Page 1A-10 of 12
Safe Drinking Water Act (SDWA): the enabling legislation, 42 USC 300f et seq. (1974), (Public Law
93-523), that requires the EPA to protect the quality of drinking water in the U.S. by setting maximum
allowable contaminant levels, monitoring, and enforcing against violations. (NELAC)
Sample Tracking: procedures employed to record the possession of the samples from the time of
sampling until analysis, reporting, and archiving. These procedures include the use of a Chain of
Custody Form that documents the collection, transport, and receipt of compliance samples by the
laboratory. In addition, access to the laboratory is limited and controlled to protect the integrity of the
samples. (NELAC)
Secondary Accrediting Authority: the Territorial, State or federal agency that grants NELAC
accreditation to laboratories, based upon their accreditation by a NELAP-recognized Primary
Accrediting Authority. See also Recognition and Primary Accrediting Authority. (NELAC)[1.5.2.3]
Selectivity: (Analytical chemistry) the capability of a test method or instrument to respond to a target
substance or constituent in the presence of non-target substances. (EPA-QAD)
Sensitivity: the capability of a method or instrument to discriminate between measurement
responses representing different levels (e.g., concentrations) of a variable of interest. (NELAC)
Shall: denotes a requirement that is mandatory whenever the criterion for conformance with the
specification requires that there be no deviation. This does not prohibit the use of alternative
approaches or methods for implementing the specification so long as the requirement is fulfilled.
(ANSI)
Should: denotes a guideline or recommendation whenever noncompliance with the specification is
permissible. (ANSI)
Spike: a known mass of target analyte added to a blank sample or sub-sample; used to determine
recovery efficiency or for other quality control purposes. (NELAC)
Standard: the document describing the elements of laboratory accreditation that has been developed
and established within the consensus principles of NELAC and meets the approval requirements of
NELAC procedures and policies. (ASQC)
Standard Operating Procedures (SOPs): a written document which details the method of an
operation, analysis or action whose techniques and procedures are thoroughly prescribed and which
is accepted as the method for performing certain routine or repetitive tasks. (QAMS)
Standardized Reference Material (SRM): a certified reference material produced by the U.S.
National Institute of Standards and Technology or other equivalent organization and characterized
for absolute content, independent of analytical method. (EPA-QAD)
Statistical Minimum Significant Difference (SMSD): the minimum difference between the control
and a test concentration that is statistically significant; a measure of test sensitivity or power. The
power of a test depends in part on the number of replicates per concentration, the significance level
selected, e.g., 0.05, and the type of statistical analysis. If the variability remains constant, the
sensitivity of the test increases as the number of replicates is increased. (NELAC)
Supervisor (however named): the individual(s) designated as being responsible for a particular area
or category of scientific analysis. This responsibility includes direct day-to-day supervision of
technical employees, supply and instrument adequacy and upkeep, quality assurance/quality control

-------
NELAC
Program Policy and Structure
Revision 14
May 25, 2001
Page 1A-11 of 12
duties and ascertaining that technical employees have the required balance of education, training and
experience to perform the required analyses. (NELAC)
Surrogate: a substance with properties that mimic the analyte of interest. It is unlikely to be found
in environmental samples and is added to them for quality control purposes. (QAMS)
Suspension: temporary removal of a laboratory's accreditation for a defined period of time, which
shall not exceed six months, to allow the laboratory time to correct deficiencies or areas of non-
compliance with the NELAC standards. (NELAC)[4.4.2]
Technical Director: individual(s) who has overall responsibility for the technical operation of the
environmental testing laboratory. (NELAC)
Technology: a specific arrangement of analytical instruments, detection systems, and/or preparation
techniques.
Test: a technical operation that consists of the determination of one or more characteristics or
performance of a given product, material, equipment, organism, physical phenomenon, process or
service according to a specified procedure. The result of a test is normally recorded in a document
sometimes called a test report or a test certificate. (ISO/IEC Guide 2-12.1, amended)
Test Method: an adoption of a scientific technique for a specific measurement problem, as
documented in a laboratory SOP or published by a recognized authority. (NELAC)
Testing Laboratory: a laboratory that performs tests. (ISO/IEC Guide 2-12.4)
Test Sensitivity/Power: the minimum significant difference (MSD) between the control and test
concentration that is statistically significant. It is dependent on the number of replicates per
concentration, the selected significance level, and the type of statistical analysis (see Chapter 5,
Appendix D, section 2.4.a). (NELAC)
Tolerance Chart: a chart in which the plotted quality control data are assessed against a tolerance level
(e.g., +/-10% of a mean) based on the precision level judged acceptable to meet overall quality/data
use requirements, instead of statistical acceptance criteria (e.g., +/- 3 sigma) (applies to
radiobioassay laboratories). (ANSI)
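As an illustrative sketch only, assessing quality control data against a fixed tolerance level (here an assumed +/-10% of the target mean) rather than statistically derived limits can be expressed as:

    # Flag hypothetical QC recoveries against a +/-10% tolerance level.
    target_mean = 100.0
    tolerance = 0.10 * target_mean
    for value in [98.0, 102.5, 95.0, 111.0, 100.3]:
        ok = abs(value - target_mean) <= tolerance
        print(f"{value:6.1f}% -> {'within' if ok else 'outside'} tolerance")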
Toxic Substances Control Act (TSCA): the enabling legislation in 15 USC 2601 et seq., (1976),
that provides for testing, regulating, and screening all chemicals produced or imported into the United
States for possible toxic effects prior to commercial manufacture. (NELAC)
Traceability: the property of a result of a measurement whereby it can be related to appropriate
standards, generally international or national standards, through an unbroken chain of comparisons.
(VIM-6.12)
United States Environmental Protection Agency (EPA): the federal governmental agency with
responsibility for protecting public health and safeguarding and improving the natural environment
(i.e., the air, water, and land) upon which human life depends. (US-EPA)
Validation: the process of substantiating specified performance criteria. (EPA-QAD)
Verification: confirmation by examination and provision of evidence that specified requirements have
been met. (NELAC)

-------
NELAC
Program Policy and Structure
Revision 14
May 25, 2001
Page 1A-12 of 12
NOTE: In connection with the management of measuring equipment, verification provides a
means for checking that the deviations between values indicated by a measuring instrument and
corresponding known values of a measured quantity are consistently smaller than the maximum
allowable error defined in a standard, regulation or specification peculiar to the management of
the measuring equipment.
The result of verification leads to a decision either to restore in service, to perform adjustment,
to repair, to downgrade, or to declare obsolete. In all cases, it is required that a written trace of
the verification performed shall be kept on the measuring instrument's individual record.
Voting Member: officials in the employ of the Government of the United States, and the States, the
Territories, the Possessions of the United States, or the District of Columbia and who are actively
engaged in environmental regulatory programs or accreditation of environmental laboratories.
(NELAC)
Work Cell: a well-defined group of analysts that together perform the method analysis. The
members of the group and their specific functions within the work cell must be fully documented.
(NELAC)
Sources:
40 CFR Part 136
American Society for Quality Control (ASQC), Definitions of Environmental Quality Assurance Terms,
1996
American National Standards Institute (ANSI), Style Manual for Preparation of Proposed American
National Standards, Eighth Edition, March 1991
ANSI/ASQC E4, 1994
ANSI N42.23-1995, Measurement and Associated Instrument Quality Assurance for
Radiobioassay Laboratories
International Standards Organization (ISO) Guides 2, 30, 8402
International Vocabulary of Basic and General Terms in Metrology (VIM): 1984. Issued by BIPM, IEC,
ISO and OIML
National Institute of Standards and Technology (NIST)
National Environmental Laboratory Accreditation Conference (NELAC), July 1998 Standards
Random House College Dictionary
US EPA Quality Assurance Management Section (QAMS), Glossary of Quality Assurance Terms,
8/31/92 and 12/6/95
US EPA Quality Assurance Division (QAD)
Webster's New World Dictionary of the American Language

-------
National Environmental
Laboratory Accreditation
Conference

-------
Note that the NELAC standards now have two significant dates: 1) the
date the standards were approved at the annual meeting, and 2) the date
the standards are effective and must be implemented. This is especially
important as some portions of the standards have different effective
dates. The approval date is part of the document control header on each
page. The cover of each chapter shows both the approval date and the
effective date. Changes approved for implementation at a time other than
the effective date (on the chapter cover) are noted in the chapter,
showing the approved text and its effective date.

-------
NELAC
Proficiency Testing
Revision 16
May 25, 2001
Page i of v
TABLE OF CONTENTS
PROFICIENCY TESTING
2.0	PROFICIENCY TESTING PROGRAM: INTERIM STANDARDS		1
2.1	INTRODUCTION, SCOPE, AND APPLICABILITY 		1
2.1.1	Purpose		2
2.1.2	Goals		2
2.1.3	Fields of Proficiency Testing 		2
2.2	MAJOR PT GROUPS AND THEIR RESPONSIBILITIES 		2
2.2.1	Proficiency Testing Study Providers 		4
2.2.2	Proficiency Testing Oversight Body (PTOB)/Proficiency Test Provider Accreditor
(PTPA)		4
2.2.3	Laboratories		4
2.2.4	Accrediting Authorities (AA)		4
2.3	REQUIREMENTS FOR PT PROVIDERS 		4
2.3.1	PT Provider Accreditation 		4
2.3.2	On-site Inspection of PT Providers 		5
2.3.3	Sample Requirements and Design 		5
2.3.3.1	Sample Analytes		5
2.3.3.2	PT Provider Sample Testing		5
2.3.4	PT Study Data Analysis 		5
2.3.4.1 Data Acceptance Criteria 		5
2.3.5	Generation of Study Reports		5
2.3.6	Provider Conflict of Interest		5
2.3.7	Disapproval of PT Providers 		6
2.3.8	PTOB/PTPA Listing of PT Providers		6
2.4	LABORATORY ENROLLMENT IN PROFICIENCY TESTING PROGRAM(S) 		6
2.4.1	Required Level of Participation 		6
2.4.2	Requesting Accreditation		6
2.4.3	Reporting Results 		6
2.5	REQUIREMENTS FOR LABORATORY TESTING OF PT STUDY SAMPLES		6
2.5.1	Restrictions on Exchanging Information 		6
2.5.2	Maintenance of Records 		7
2.6	EVALUATION OF PROFICIENCY TESTING RESULTS		7
2.7	PT CRITERIA FOR LABORATORY ACCREDITATION		8
2.7.1	Result Categories 		8
2.7.2	Initial or Continuing PT Studies		8
2.7.3	Supplemental PT Studies 		8
2.7.4	Failed Studies and Corrective Action		9
2.7.5	Second Failed Study 		9
2.7.6	Scheduling of PT Studies 	 10
2.7.7	Withdrawal from PT Studies 	 10
2.7.8	Process for Handling Questionable PT Samples 	 10
Appendix A - PT PROVIDER APPROVAL CRITERIA 	 A-1

-------
NELAC
Proficiency Testing Program
Revision 16
May 25, 2001
Page ii of v
A.0 SCOPE 	A-1
A.1 APPROVAL PROCESS	A-1
A.2 QUALITY SYSTEM REQUIREMENTS 	A-1
A.3 PROVIDER FACILITIES AND PERSONNEL 	A-1
A.4 SAMPLE FORMULATION REVIEW	A-2
A.4.1 Release of Information 	A-2
A.5 PROVIDER CONFLICT-OF-INTEREST REQUIREMENTS	A-2
A.5.1 Ban on Distribution of Samples 	A-2
A.6 CONFIDENTIALITY OF PT STUDY DATA	A-2
A.7 DATA REVIEW AND EVALUATION 	A-3
A.8 COMPLAINTS & CORRECTIVE ACTION	A-3
A.9 LOSS OF PROVIDER APPROVAL	A-3
A.9.1 Periodic Review of PT Providers 	A-3
A.9.2	Revocation of Approval	A-3
A.10	NOTIFICATION OF SAMPLE INTEGRITY	A-4
Appendix B - PT SAMPLE DESIGN & ACCEPTANCE GUIDELINES 	B-1
B.0	INTRODUCTION	B-1
B.1 SAMPLE FORMULATION APPROVAL	B-1
B.1.1	Adequacy of the Sample Formulation	B-1
B.1.2 PT Sample Composition	B-1
B.1.3 PT Sample Matrix	B-2
B.1.4 PT Sample Composition for Solid Matrices 	B-2
B.2 VERIFICATION OF ASSIGNED VALUE	B-2
B.2.1 Relative Standard Deviation of Verification Analysis 	B-2
B.2.2 Quality Control Check of the Assigned Value	B-2
B.3 HOMOGENEITY TESTING	B-2
B.3.1 Homogeneity Testing Procedure 	B-2
B.3.2 Suitable Homogeneity Testing Procedures 	B-3
B.4 STABILITY TESTING 	B-3
B.5	DATA REPORTING BY PT PROVIDERS	B-3
B.5.1 Verification and Homogeneity Reports	B-3
B.5.2 Laboratory Data and Stability Reports 	B-3
Appendix C - PT ACCEPTANCE CRITERIA AND PT PASS/FAIL CRITERIA 	C-1
C.0	PURPOSE, SCOPE, AND APPLICABILITY 	C-1

-------
NELAC
Proficiency Testing
Revision 16
May 25, 2001
Page iii of v
C.1 ANALYTE ACCEPTANCE LIMITS		C-1
C.1.1 Analyte Acceptance Limit Categories 		C-1
C.1.1.1 Drinking Water, Waste Water, and Ambient Water Analytes with USEPA
Established Acceptance Limits		C-1
C.1.1.2 Analytes with Acceptance Limits Established by the NELAC Standing Committee
on Proficiency Testing 		C-1
C.1.1.3	Experimental Data: Analytes without Promulgated Acceptance Limits or
Established Regression Equations		C-1
C.2 ACCEPTABLE PT RESULTS FOR CHEMICAL ANALYTES IN POTABLE WATER AND
NON-POTABLE WATER PT SAMPLES 		C-2
C.3 NOT ACCEPTABLE PT RESULTS FOR POTABLE WATER AND NON-POTABLE WATER
PT SAMPLES		C-2
C.4 ADDITIONAL REQUIREMENTS FOR PT PROVIDERS 		C-2
C.4.1 Additional Matrix/Analyte Groups		C-2
C.5.0	NELAC PT Study Pass/Fail Criteria		C-3
C.5.1 Analyte Group PT Studies 		C-3
C.5.2 Promulgated USEPA Pass/fail Criteria 		C-3
C.5.3	Pass/fail Criteria For Analyte Group PT Samples 		C-3
Appendix D - PROFICIENCY TESTING OVERSIGHT BODY/
PROFICIENCY TEST PROVIDER ACCREDITOR 		D-1
D.0	PURPOSE, SCOPE, AND APPLICABILITY 		D-1
D.1 TECHNICAL AND ADMINISTRATIVE QUALIFICATIONS 		D-1
D.2 PTOB/PTPA RESPONSIBILITIES REGARDING INITIAL ASSESSMENT OF PT
PROVIDERS		D-1
D.2.1	Development of Standard Operating Procedures and Forms		D-1
D.2.1.1	SOP(s) for the Assessment Process 		D-2
D.2.1.2 Initial Application		D-2
D.2.1.3 SOP(s) for On-site Inspections and Checklist(s) 		D-2
D.2.2 Initial Application Review and On-site Inspections 		D-2
D.3 PTOB/PTPA RESPONSIBILITIES REGARDING APPROVAL OF PT PROVIDERS ....	D-3
D.4 PTOB/PTPA RESPONSIBILITIES FOR ONGOING OVERSIGHT OF PT PROVIDERS .	D-3
D.5 DEVELOPMENT AND MAINTENANCE OF A COMPREHENSIVE PT DATABASE		D-4
D.6 COMPLAINTS AND CORRECTIVE ACTION 		D-4
D.7 LIST OF APPROVED PT PROVIDERS		D-4
D.8 SPONSORSHIP OF ANNUAL NELAC PROFICIENCY TESTING CAUCUS 		D-4
D.9 PTOB/PTPA ETHICS		D-4
D.10 CONFIDENTIALITY		D-5

-------
NELAC
Proficiency Testing Program
Revision 16
May 25, 2001
Page iv of v
Appendix E - MICROBIOLOGY	E-1
E.0 PURPOSE	E-1
E.1 SAMPLES 	E-1
E.1.1 SDWA Samples 	E-1
E.1.2 CWA Samples 	E-1
E.2 SAMPLE PREPARATION AND QUALITY CONTROL 	E-1
E.3	SCORING 	E-2
E.3.1 Qualitative Analyses, SDWA Samples	E-2
E.3.2	Quantitative Analyses	E-2
E.3.2.1	Requirement for Quantitative Data Set Size 	E-2
Appendix F - ENVIRONMENTAL TOXICOLOGY 	 F-1
F.0	PURPOSE, SCOPE, AND APPLICABILITY 	 F-1
F.1 RATIONALE	 F-1
F.2 LABORATORY ENROLLMENT IN PROFICIENCY TESTING PROGRAMS 	 F-1
F.2.1	Required Level of Participation	 F-1
F.2.2 Requirements for Laboratory Testing of PT Study Samples 	 F-1
F.3 PT CRITERIA FOR LABORATORY ACCREDITATION 	 F-1
F.3.1 Initial and Continuing Accreditation	 F-1
F.4	Fields of Accreditation 		F-2
F.4.1 Whole Effluent Toxicity (WET) Method Codes		F-2
F.4.2 Test Conditions for Sediment Toxicity (Solid Phase)		F-2
F.4.2.1	Sediment Toxicity PT Samples 		F-2
F.4.3	Test Conditions for Soil Toxicity	 F-3
F.4.3.1 Soil Toxicity PT Samples 	 F-3
Appendix G - RADIOCHEMISTRY	G-1
G.0	PURPOSE	G-1
G.1 PROFICIENCY TESTING PROVIDER LICENSING	G-1
G.2 SDWA SAMPLE DESIGN 	G-1
G.2.1	ASSIGNED VALUES	G-1
G.3 SCORING	G-1
G.4	STUDY TIMETABLES	G-1
Appendix H - PERFORMANCE TESTING REQUIREMENTS FOR FIELD AIR MEASUREMENT 	H-1
H.0	INTRODUCTION: PURPOSE, SCOPE, AND APPLICABILITY	H-1
H.1 Proficiency Testing for Field Air Measurement 	H-1

-------
NELAC
Proficiency Testing
Revision 16
May 25, 2001
Page v of v
H.2 ACCEPTANCE LIMITS	 H-2
H.2.1 Analyte Acceptance Limit Categories 	 H-2
H.2.1.1 Analytes with USEPA Established Acceptance Limits (Prepared ± fixed
percentage or Mean ± 2 standard deviations)	 H-2
H.2.1.2 Analytes with acceptance limits derived from regression equations established by
the NELAC Standing Committee on Proficiency Testing 	 H-2
H.2.1.3 Experimental Data: Analytes without promulgated acceptance limits or
established regression equations 	 H-2
H.3 ACCEPTABLE PT RESULTS FOR CHEMICAL ANALYTES IN FIELD AIR PT
MEASUREMENTS	 H-3
H.4 NOT ACCEPTABLE PT RESULTS FOR SOURCE AND AMBIENT PT SAMPLES 	 H-3
H.5 NELAC PT STUDY PASS/FAIL CRITERIA 	 H-3
H.5.1 Interdependent Analyte PT Samples	 H-3
H.5.2 Non-interdependent Analyte PT Samples	 H-4
H.5.3 Promulgated USEPA Pass/fail Criteria 	 H-4
H.5.4 Pass/fail Criteria For Interdependent Analyte PT Samples	 H-4
H.5.5 Pass/fail Criteria For Non-Interdependent Analyte PT Samples 	 H-4
FIGURES
Figure 2-1	NELAP Proficiency Testing 	 3

-------
NELAC
Proficiency Testing
Revision 16
May 25, 2001
Page 1 of 10
2.0	PROFICIENCY TESTING PROGRAM: INTERIM STANDARDS
For fields of accreditation for which proficiency testing (PT) samples are not available from a NELAP
Proficiency Testing Oversight Body (PTOB)/Proficiency Test Provider Accreditor (PTPA) (e.g.,
National Institute of Standards and Technology (NIST)) accredited PT Provider, a Primary Accrediting
Authority may accept PT results from non-accredited PT Providers. In these cases, the Secondary
Accrediting Authority shall accept the decision of the Primary Accrediting Authority.
2.1	INTRODUCTION, SCOPE, AND APPLICABILITY
This chapter and the associated appendices define the major participating organizations and
components of the NELAC PT Program. In addition to complying with the requirements of this
chapter, any person, private party or government entity seeking to participate as a NELAP-designated
PTOB/PTPA-approved PT Provider shall also comply with the requirements of the applicable
Appendices A (PT Provider Approval Criteria), B (PT Sample Design and Acceptance Guidelines),
C (Proficiency Testing Acceptance Criteria), D (Proficiency Testing Oversight Body/Proficiency Test
Provider Accreditor), E (Microbiology), and F (Environmental Toxicology). The criteria set forth in
these standards shall be used by laboratories and PT Providers for the purposes of obtaining or
maintaining NELAP accreditation or NELAP approval.
In addition to complying with the requirements of this chapter and appendices, any entity seeking to
participate as a NELAP-designated PTOB/PTPA-approved PT Provider shall also comply with all
applicable requirements of "National Standards for Water Proficiency Testing Studies, Criteria
Document", U.S. Environmental Protection Agency or other NELAC documents that define analytes,
analyte numbers, concentrations, and acceptance criteria as required in Section C.1.1.2.
Proficiency testing (PT) is defined for the purpose of this chapter as a means of evaluating a
laboratory's performance under controlled conditions relative to a given set of criteria through analysis
of unknown samples provided by an external source. PT is not the sole criterion for determining
accreditation status. Additional essential elements of the overall NELAP accreditation process,
including the on-site assessment, are discussed in other chapters of the NELAC standards. The PT
program is intended to cover all types of federal and State environmental analyses. However, the
body of the PT standard applies primarily to chemistry.
The major components of the NELAC PT program include:
a)	multiple PT Providers who shall meet stringent criteria to become approved by a Proficiency
Testing Oversight Body (PTOB)/Proficiency Test Provider Accreditor (PTPA), as described in
Section 2.3 and Appendix A;
b)	specific requirements for the design of PT samples and studies, to ensure that all samples
provide a consistent, fair and known challenge to laboratories seeking accreditation from a
NELAP-approved Accrediting Authority, as described in Section 2.3 and Appendix B;
c)	specifically defined acceptable/not acceptable criteria for evaluating PT sample results, as
described in Section 2.3 and Appendix C;
d)	initial approval and ongoing oversight of PT Providers by a Proficiency Testing Oversight Body
(PTOB)/Proficiency Test Provider Accreditor (PTPA), Section 2.3 and Appendix D;
e)	specific requirements for laboratories participating in PTOB/PTPA-approved PT programs, as
described in Sections 2.4, 2.5, and 2.7; and,
f)	oversight of all PT program activities by the PTOB(s)/PTPA(s), as described in Section 2.2.2.

-------
NELAC
Proficiency Testing
Revision 16
May 25, 2001
Page 2 of 10
2.1.1	Purpose
The PT program incorporates several practical purposes, which include:
a)	the production and supply of test samples that are procedure-sensitive; that is, the samples
challenge the critical components of each analytical procedure, ranging from initial sample
preparation to final data analysis;
b)	the production and supply of test samples that are as similar to real-world samples as is
reasonably possible; it is further expected that the PT samples shall be representative of
materials analyzed for environmental regulatory programs, agencies, and communities;
c)	a program which is affordable by all participants;
d)	the yielding of PT data that are technically defensible on the basis of the type and quality of the
samples provided; and,
e)	the preparation of samples such that the identification and quantitation of analytes in the samples
pose equivalent difficulty and challenge regardless of the manner in which the samples are
designed and manufactured by the PT Providers, e.g., samples prepared for analysis by a
drinking water or wastewater method would pose equal challenge whether prepared as whole
volume or as a concentrate in ampules.
2.1.2	Goals
The PT program incorporates several practical goals, which include:
a)	the generation of data at a quality level required by environmental and regulatory programs;
b)	the generation of data, at a minimum, comparable in quality to that of currently certified and/or
accredited laboratories; and
c)	the improvement of the overall performance of laboratories over time.
2.1.3	Fields of Proficiency Testing
The PT program is organized by fields of proficiency testing. The following elements collectively
define fields of proficiency testing:
a)	matrix type,
b)	technology/method, and
c)	analyte/analyte group.
Current NELAC fields of proficiency testing are located on the NELAC Website.
Note: Laboratories are permitted to analyze and report multiple method-specific results for the same
analytes from one PT sample. If a laboratory reports more than one method per technology per study
for a field of proficiency testing, an unacceptable result by any method would be considered a failed
study for that technology.
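As an illustration only of the rule in the note above, the evaluation can be sketched as follows; the technologies, method names, and reported result categories shown are hypothetical.

    # A technology fails the study if any reported method result for that
    # technology in the study is not acceptable.
    reported = {
        "ICP-MS": {"Method A": "Acceptable"},
        "GC/MS":  {"Method B": "Acceptable", "Method C": "Not Acceptable"},
    }
    for technology, by_method in reported.items():
        passed = all(r == "Acceptable" for r in by_method.values())
        print(technology, "passed study" if passed else "failed study")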
2.2 MAJOR PT GROUPS AND THEIR RESPONSIBILITIES
The PT program structure incorporates five major groups with separate and distinct roles and
responsibilities. The groups are NELAC, the PTOB/PTPA, the PT Providers, the testing laboratories,

-------
NELAC
Proficiency Testing
Revision 16
May 25, 2001
Page 3 of 10
and the Primary Accrediting Authorities (AA). The lines of interaction among these groups are shown
in Figure 2-1.
[Diagram: lines of interaction among the Standard-Setting Authority (NELAC), the Primary Accrediting Authorities (States/EPA), the PTOB/PTPA, the PT Providers, and the Laboratories (private sector, non-profits, and/or States).]
Figure 2-1. NELAP Proficiency Testing

-------
NELAC
Proficiency Testing
Revision 16
May 25, 2001
Page 4 of 10
The NELAC Standing Committee on Proficiency Testing is responsible for Chapter 2 and related
appendices. This includes:
a)	establishing which analytes are included in the NELAC PT program,
b)	establishing the concentration ranges for each analyte,
c)	establishing acceptance criteria to be used to evaluate PT results, and
d)	maintaining a comprehensive list of NELAC fields of proficiency testing.
2.2.1	Proficiency Testing Study Providers
The PT Providers shall produce and distribute PT samples, evaluate study results against published
performance criteria, and report the results to the laboratories, the respective Primary Accrediting
Authorities, the appropriate PTOB/PTPA, and NELAP. The PT Provider shall meet the requirements
of Appendix A, manufacture samples that meet the requirements of Appendix B, and score sample
results in accordance with the requirements of Appendix C.
2.2.2	Proficiency Testing Oversight Body (PTOB)/Proficiency Test Provider Accreditor
(PTPA)
The PTOB/PTPA establishes and implements a program to accredit PT Providers and to monitor
accredited providers to ensure that their studies and practices meet all applicable standards. The
PTOB/PTPA shall meet the requirements of Appendix D. Organizations meeting the requirements of
this standard and its appendices, as determined by any NELAP-recognized Accrediting Authority, may
be nominated by the committee to the NELAC Board of Directors to be designated as a PTOB/PTPA.
2.2.3	Laboratories
Laboratories that seek to obtain or maintain accreditation shall perform analyses of PT samples for
each field of proficiency testing as defined in Section 2.1.3. PT samples shall be obtained from
NELAP-designated PTOB/PTPA-approved PT Providers. The laboratory shall obtain PT samples
from any so approved PT Provider. The results of the analyses shall be submitted to the PT Provider
for scoring.
2.2.4	Accrediting Authorities (AA)
The Primary Accrediting Authorities shall make all decisions regarding a laboratory's accreditation
status. They are responsible for taking action to make these determinations including ensuring that
laboratories seeking or holding their accreditations have participated in the PT program. Accrediting
authorities shall accept for the purposes of initial and continuing accreditation, PT results from any
NELAP-designated PTOB/PTPA-approved PT Provider that meets the requirements of this standard.
2.3 REQUIREMENTS FOR PT PROVIDERS
This section and associated Appendix A describe the criteria that all PT Providers shall meet in order
to be approved by the PTOB/PTPA as PT Providers. A PTOB/PTPA shall grant approval to PT
Providers on a field-of-proficiency testing basis, as described in Section 2.1.3.
2.3.1 PT Provider Accreditation
For all compounds/matrices for which NIST National Voluntary Laboratory Accreditation Program
(NVLAP) accreditation is available, the PT Provider must be accredited by NIST NVLAP. The
Provider's NIST NVLAP Scope of Accreditation shall cover the specific PT samples being supplied
to the laboratories. For all other programs and compounds for which NIST/NVLAP accreditation is
not available, a provider of PT samples for NELAC accreditation must be accredited by a Proficiency

-------
NELAC
Proficiency Testing
Revision 16
May 25, 2001
Page 5 of 10
Testing Oversight Body (PTOB)/PTPA that meets the NELAC PTOB/PTPA requirements contained
in this Chapter and associated appendices. The names of PTOB/PTPA organizations that meet the
NELAC requirements are communicated to the NELAC Standing Committee on Proficiency Testing
and the NELAC Board of Directors. A listing of organizations that meet the NELAC PTOB/PTPA
requirements is available from the Chair of NELAC.
2.3.2	On-site Inspection of PT Providers
A PTOB/PTPA shall conduct an on-site inspection of any organization seeking to participate as a PT
Provider, as described in Appendix D. The PTOB/PTPA shall determine whether the provider meets
the applicable requirements described in this chapter and Appendices A, B, and C. Approval of a PT
Provider shall be the responsibility of a PTOB/PTPA. A PTOB/PTPA shall conduct ongoing oversight
of the PT Providers as necessary to ensure conformance with all applicable standards.
2.3.3	Sample Requirements and Design
This section and associated Appendix B describe PT sample design and acceptance criteria. The
matrices of all PT samples shall, to the extent possible, resemble the matrices for which the laboratory
seeks to obtain or maintain accreditation. Samples may not be reused in any subsequent NELAC PT
study except as described in Section 2.7.3.
2.3.3.1	Sample Analytes
The PT Provider shall prepare each sample lot such that the prepared concentration of each analyte
in each lot is unique. The required group of analytes covering each field of proficiency testing shall
be determined by the NELAC Standing Committee on Proficiency Testing and shall be evaluated and
updated, as necessary.
2.3.3.2	PT Provider Sample Testing
The PT Provider shall design, manufacture, and test the samples for homogeneity, stability, and
verification of assigned values as required by Appendix B. This testing shall verify that the quality of
all samples is acceptable for use in each field of proficiency testing.
2.3.4	PT Study Data Analysis
This section and associated Appendix C describe the criteria to be used by PT Providers when
scoring and evaluating NELAC PT sample results.
2.3.4.1 Data Acceptance Criteria
PT Providers shall use the data acceptance criteria described in Appendix C to evaluate laboratories'
PT data so that each laboratory's performance is judged fairly and consistently.
2.3.5	Generation of Study Reports
Each PT Provider shall evaluate the data and issue a report within 21 calendar days of the close of
each study.
2.3.6	Provider Conflict of Interest
Each PT Provider shall certify that it is free of any organizational conflict of interest. A PT Provider
shall never split a sample lot and offer these samples for sale as known-value check samples before
the unknown samples are used in a PT study. In addition, each PT Provider shall follow procedures
and have systems in place that maintain confidentiality and security of all assigned values through
the closing date of each study. All records shall be retained for a period of five years.
2.3.7	Disapproval of PT Providers
A PT Provider's approval may be subject to revocation per the procedures outlined in Appendix A,
Section A.9.2.
2.3.8	PTOB/PTPA Listing of PT Providers
PTOBs/PTPAs shall maintain a list of approved PT Providers. PTOBs/PTPAs shall evaluate, update,
and publish this list as specified in Appendix D.
2.4	LABORATORY ENROLLMENT IN PROFICIENCY TESTING PROGRAM(S)
2.4.1	Required Level of Participation
To be accredited initially and to maintain accreditation, a laboratory shall participate in two single-
blind, single-concentration PT studies, where available, per year for each field of proficiency testing
for which it seeks or wants to maintain accreditation. Laboratories must obtain PT samples from a
PTOB/PTPA-approved PT Provider. Each laboratory shall participate in at least two PT studies for
each field of proficiency testing per year unless a different frequency for a given program is defined
in the appendices. Section 2.5 describes the time period in which a laboratory shall analyze the PT
samples and report the results. Data and laboratory evaluation criteria are discussed in Sections 2.6
and 2.7 of this chapter.
2.4.2	Requesting Accreditation
At the time each laboratory applies for accreditation, it shall notify the Primary Accrediting Authority
of the field(s) of testing for which it seeks accreditation and shall participate in the appropriate
PT studies. For all fields of proficiency testing, including those for which PT samples are not
available, the laboratory shall ensure the reliability of its testing procedures by maintaining a total
quality management system that meets all applicable requirements of Chapter Five of the NELAC
standards.
2.4.3	Reporting Results
Each laboratory shall authorize the PT Provider to release all accreditation and remediation results
and acceptable/not acceptable status directly to the Primary Accrediting Authority, NELAP and the
PTOB/PTPA, in addition to the laboratory.
2.5	REQUIREMENTS FOR LABORATORY TESTING OF PT STUDY SAMPLES
The samples shall be analyzed and the results returned to the PT Provider no later than 45 calendar
days from the scheduled study shipment date. The laboratory's management and all analysts shall
ensure that all PT samples are handled (i.e., managed, analyzed, and reported) in the same manner
as real environmental samples, utilizing the same staff, procedures, equipment, facilities, frequency
of analysis, and the same methods as used for routine analysis of that analyte.
2.5.1 Restrictions on Exchanging Information
Laboratories shall comply with the following restrictions on the transfer of PT samples and
communication of PT sample results prior to the time the results of the study are released:
a)	A laboratory shall not send any PT sample, or a portion of a PT sample, to another laboratory for
any analysis for which it seeks accreditation, or is accredited;
b)	A laboratory shall not knowingly receive any PT sample or portion of a PT sample from another
laboratory for any analysis for which the sending laboratory seeks accreditation, or is accredited;
c)	Laboratory management or staff shall not communicate with any individual at another laboratory
(including intracompany communication) concerning the PT sample; and
d)	Laboratory management or staff shall not attempt to obtain the assigned value of any PT sample
from their PT Provider.
2.5.2 Maintenance of Records
The laboratory shall maintain copies of all written, printed, and electronic records, including but not
limited to bench sheets, instrument strip charts or printouts, data calculations, and data reports,
resulting from the analysis of any PT sample for five years or for as long as is required by the
applicable regulatory program, whichever is greater. These records shall include a copy of the PT
study report forms used by the laboratory to record PT results. All of these laboratory records shall
be made available to the assessors of the Primary Accrediting Authority during on-site audits of the
laboratory.
2.6 EVALUATION OF PROFICIENCY TESTING RESULTS
PT Providers shall evaluate results from all PT studies using NELAC-mandated acceptance criteria
described in Appendix C. The NELAC Standing Committee on Proficiency Testing shall provide, and
update as necessary, the data acceptance criteria that all PT Providers shall use for all PT studies.
Each result shall be scored on an acceptable/not acceptable basis. The PT Provider shall provide
the participant laboratories and the Primary Accrediting Authority a report showing at a minimum:
a.	Provider information:
-	Provider name and NIST/NVLAP accreditation number, in the header.
b.	Laboratory information:
-	Laboratory name and address (location) of the laboratory, in the header. Note: This is not
the address of the corporate headquarters but the address of the actual laboratory completing
the testing.
-	Primary Accrediting Authority ID or USEPA ID, if applicable, in the header.
-	Name, title, and telephone number of the laboratory point of contact, in the header or cover letter.
c.	Study information:
-	Study number and study type, in the header.
-	Shipment date and closing date of the study, in the header.
-	Date of amended report, if applicable, in the header.
d.	Report information:
-	Analyte name for each analyte included in the standard.
-	Method description.
-	Laboratory value as reported.
-	Assigned values and acceptance values reported to three significant figures.
-	The acceptable/not acceptable status.
-	A "No evaluation" score for reported values containing alpha characters.
-	An indication of "Not reported" when an analyte within a PT sample is left blank.
-	An indication of the length of the report, presented as either Page X of Y or the total number
of pages, with each page consecutively numbered.
This report shall be sent no later than 21 calendar days from the study closing date. Upon request
by either the Primary Accrediting Authorities or laboratories, the PT Provider shall make available a
report listing the total number of participating laboratories and the number of laboratories scoring not
acceptable for each analyte. The PT Providers shall not disclose specific laboratory results or
evaluations to any other parties without the written release of the laboratory.
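
For illustration only, and not as part of this standard, the following Python sketch models the minimum report content listed in this section as a simple record structure and shows how the per-analyte status values of item d could be derived. All class, field, and function names are hypothetical.

from dataclasses import dataclass
from typing import Optional

@dataclass
class StudyReportHeader:
    provider_name: str
    nvlap_accreditation_no: str
    laboratory_name: str
    laboratory_address: str              # the testing laboratory itself, not corporate headquarters
    accrediting_authority_id: Optional[str]
    study_number: str
    study_type: str
    shipment_date: str
    closing_date: str
    amended_date: Optional[str] = None   # only present when an amended report is issued

@dataclass
class AnalyteResult:
    analyte: str
    method: str
    reported_value: str                  # kept as text so alpha characters can be detected
    assigned_value: float                # reported to three significant figures
    acceptance_low: float
    acceptance_high: float

    def status(self) -> str:
        # Score one reported value per item d above.
        if not self.reported_value.strip():
            return "Not reported"
        try:
            value = float(self.reported_value)
        except ValueError:
            return "No evaluation"       # the reported value contains alpha characters
        if self.acceptance_low <= value <= self.acceptance_high:
            return "Acceptable"
        return "Not acceptable"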
2.7 PT CRITERIA FOR LABORATORY ACCREDITATION
2.7.1	Result Categories
The criteria described in this section apply individually to each field of proficiency testing, as defined
by the laboratory seeking to obtain or maintain accreditation in its accreditation request. These criteria
apply only to the PT portion of the overall accreditation standard, and the Primary Accrediting
Authority shall consider PT results along with the other elements of the NELAC standards when
determining a laboratory's accreditation status. The Primary Accrediting Authority ultimately makes
all decisions regarding the accreditation status of the laboratory. There are two PT result categories:
"acceptable" and "not acceptable."
2.7.2	Initial or Continuing PT Studies
A laboratory seeking to obtain or maintain accreditation shall successfully complete two initial or
continuing PT studies for each requested field of proficiency testing within the most recent three
rounds attempted. For a laboratory seeking to obtain accreditation, the most recent three rounds
attempted shall have occurred within 18 months of the laboratory's application date. Successful
performance is described in Appendix C. When a laboratory has been granted accreditation status,
it shall continue to complete PT studies for each field of proficiency testing and maintain a history of
at least two acceptable PT studies for each field of proficiency testing out of the most recent three.
For initial accreditation, the laboratory must successfully analyze two sets of PT studies; the analyses
must be separated by at least 15 calendar days, measured from the closing date of one study to the
shipment date of the other study for the same field of proficiency testing. For continuing accreditation,
completion dates of successive proficiency rounds for a given field of proficiency testing shall be
approximately six months apart. Failure to meet the semiannual schedule is regarded as a failed
study.
Initial or continuing PT Studies must meet all applicable criteria described in this chapter and
associated appendices.
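
For illustration only, and not as part of this standard, the following Python sketch shows one way the "two acceptable out of the most recent three studies" history rule and the 15-calendar-day initial-study spacing described above could be checked. The function names and the study-record format are hypothetical.

from datetime import date

def meets_pt_history(recent_results: list) -> bool:
    # recent_results: acceptability (True/False) of the most recent three studies
    # attempted for one field of proficiency testing, oldest first.
    return len(recent_results) >= 3 and sum(recent_results[-3:]) >= 2

def meets_initial_spacing(first_closing: date, second_shipment: date) -> bool:
    # Initial accreditation: at least 15 calendar days from the closing date of
    # one study to the shipment date of the other study.
    return (second_shipment - first_closing).days >= 15

# Example: two acceptable studies out of the most recent three, spaced 20 days apart.
print(meets_pt_history([True, False, True]))                         # True
print(meets_initial_spacing(date(2001, 3, 1), date(2001, 3, 21)))    # True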
2.7.3	Supplemental PT Studies
A NELAP-accredited laboratory may elect to participate in supplemental PT studies when the
laboratory desires to add field(s) of proficiency testing to its scope or when the laboratory fails an
initial or continuing PT study and wishes to re-establish its history of successful performance.
These additional studies are not distinguished from the initial or continuing PT studies except as
described in this section.
Supplemental PT studies must be separated by at least 15 calendar days, measured from the closing
date of one study to the shipment date of the next study for the same field of proficiency testing. For
supplemental studies, laboratories report to their PT Provider results for all analytes for which they
are demonstrating corrective action or requesting an expansion of their existing accreditation.
2.7.3.1	Supplemental PT Studies for Demonstrating Corrective Action
A laboratory that has attained NELAP accreditation is required to maintain acceptable performance
in PT studies conducted on a semiannual schedule. If an accredited laboratory fails to maintain a
record of passing two out of the most recent three PT studies, it may be subject to loss of
accreditation for one or more fields of accreditation in its current scope of accreditation. A laboratory
that is out of compliance with this PT requirement may choose to participate in a Supplemental PT
Study for Demonstrating Corrective Action. Corrective Action PT samples must meet the following
criteria.
a.	The standard must be obtained from a PT Provider that meets the accreditation requirements
of NELAC.
b.	The standard must be from a lot that has been demonstrated to have met all of the design,
testing, and verification requirements of Chapter 2 and associated appendices. PT samples from
previously released NELAC-compliant PT studies may be used in Corrective Action PT Studies.
c.	The PT Provider cannot supply the laboratory with a sample that has previously been sent to the
laboratory. The original sample tracking ID must be masked and the sample tracking ID shall
be unique.
d.	For corrective action supplemental studies, the assigned values for all analytes requested by the
laboratory must not be equal to zero with the exception of the qualitative PCB group and
qualitative microbiology.
All other aspects of Supplemental PT Studies for Demonstrating Corrective Action, including scoring
and distribution of final reports, must meet all other requirements of the NELAC PT program.
2.7.3.2	Supplemental PT Studies for Expanding an Accredited Laboratory's Scope of
Accreditation
A laboratory that has attained NELAC accreditation may add fields of accreditation to its current scope
of accreditation. As part of the request to expand its scope of accreditation, the laboratory is required
to submit to its Primary Accrediting Authority the results of participation in two successful PT studies.
The laboratory may use the results of a PT study that meets the requirements of either Section 2.7.2
or 2.7.3.1. After the laboratory is granted accreditation for the requested field(s) of proficiency
testing, the laboratory is required to participate in regular semiannual PT studies.
2.7.4	Failed Studies and Corrective Action
Whenever a laboratory fails a study, it shall determine the cause for the failure and take any
necessary corrective action. It shall then document in its own records and provide to the Primary
Accrediting Authority both the investigation and the action taken. If a laboratory fails two out of the
three most recent studies for a given field of proficiency testing, its performance is considered
unacceptable under the NELAC PT standard for that field. The laboratory shall then meet the
requirements of initial accreditation as described in Section 2.7.2, Initial or Continuing PT Studies.
2.7.5	Second Failed Study
The PT Provider reports laboratory PT performance results to the Primary Accrediting Authority at the
same time that it reports the results to the laboratory. If a laboratory fails a second study out of the
most recent three, as described in Section 2.7.4, the Primary Accrediting Authority shall take action,
pursuant to Chapter Four, within 60 calendar days to determine the accreditation status of all methods
for the unacceptable analyte(s) for that program and matrix.
2.7.6	Scheduling of PT Studies
A Primary Accrediting Authority may specify the months in which laboratories within its authority are
required to participate in NELAC PT programs. If the Primary Accrediting Authority chooses to specify
the months, then it shall adhere to the required semiannual schedule. If the Primary Accrediting
Authority does not specify the months, then the laboratory shall determine the semiannual schedule.
2.7.7	Withdrawal from PT Studies
A laboratory may withdraw from a PT study for an analyte(s) or for the entire study if the laboratory
notifies both the PT Provider and the Primary Accrediting Authority before the closing date of the PT
study. This does not exempt the laboratory from participating in the semiannual schedule.
2.7.8	Process for Handling Questionable PT Samples
There may be occasions in which the PT Provider has shipped one or more samples for NELAP
accreditation that do not meet the quality control requirements of Appendix B, and the provider has
not notified all affected laboratories or Accrediting Authorities in a timely manner as described in
Section A.10 of this standard. In this case, an AA, upon review of summary data or other relevant
documentation, may choose not to use the results for the affected analyte(s)/matrices to support the
accreditation status of the laboratories. In order to justify not using the results, the AA shall first
contact the PT Provider and attempt to resolve the situation. If after notifying the PT Provider, the AA
still chooses to pursue a complaint against the provider, the AA shall submit a written complaint to the
Accrediting Authority Review Board (AARB). The AARB shall evaluate the complaint. If the complaint
is determined to be valid, then the AA shall submit the written complaint to the PTOB/PTPA which
initially accredited the provider for the particular analyte(s) and matrices. The AA shall follow all
procedures for filing complaints as specified by the PTOB/PTPA. The AA may determine that the
affected laboratories shall either wait until the next regularly scheduled PT round to analyze
another PT sample for that field of accreditation, or may require the laboratories to obtain and analyze
a supplemental sample and repeat the test.

-------
PROFICIENCY TESTING
APPENDIX A
PT PROVIDER APPROVAL CRITERIA
Appendix A - PT PROVIDER APPROVAL CRITERIA
A.0 SCOPE
This appendix describes the responsibilities and requirements a proficiency testing (PT) provider shall
meet in order to be a Proficiency Testing Oversight Body (PTOB)/Proficiency Test Provider
Accreditor (PTPA)-approved PT Provider. In order for a PT Provider to participate in the NELAC PT
program, a provider shall be approved by a PTOB/PTPA. The criteria provided below are designed
to ensure the integrity and technical excellence of the NELAC PT program while allowing all qualified
providers to participate in the program.
A.1 APPROVAL PROCESS
The process for approval of a PT Provider includes a biennial on-site inspection by a PTOB/PTPA
to ensure that the technical criteria of this appendix are being met. At the discretion of the
PTOB/PTPA, the PT Provider may be requested to confirm their ability to perform analyses within the
required limits through participation in a proficiency testing program operated by the PTOB/PTPA, or
through the analysis of unknown samples provided by the PTOB/PTPA. Providers are also required
to submit the results of PT programs operated for NELAC to the PTOB/PTPA for review and
evaluation. The PT Provider agrees to accept the findings and decisions of the PTOB/PTPA as final.
A.2 QUALITY SYSTEM REQUIREMENTS
The manufacturing quality system used by the PT Provider shall meet the requirements of both
International Organization for Standardization (ISO) 9001 for the design, production, testing, and
distribution of performance evaluation samples and the requirements of ISO Guide 34, Quality System
Guidelines for the Production of Reference Materials. The design and operation of the PT Provider's
proficiency testing program shall meet the requirements of ISO Guide 43, Proficiency Testing by
Interlaboratory Comparisons. The testing facilities used to support the verification, homogeneity, and
stability testing required in Appendix B of this document shall meet the requirements of both ISO
Guide 25, General Requirements for the Competency of Testing and Calibration Laboratories and
Chapter Five, Quality Systems, of the NELAC standards. The ability to meet the ISO 9001 quality
system requirement may be fulfilled through registration of the PT Provider's quality system to
American National Standards Institute (ANSI) standards by a Registrar Accreditation Board (RAB)-
accredited registrar. However, a biennial on-site inspection by the PTOB/PTPA demonstrating
continuing conformance is required.
A.3 PROVIDER FACILITIES AND PERSONNEL
Each provider is required to have systems in place to produce, test, distribute, and provide data
analysis and reporting functions for any series of samples for which they are requesting approval.
Similarly, the provider shall have in place sufficient technical staff, instrumentation, and computer
capabilities as may be required by the PTOB/PTPA to support the production, distribution, analysis,
data collection, data analysis, and reporting functions of the samples. No portion of the production,
testing, distribution, data collection, data analysis, or data reporting functions may be outside the
control of the PT Provider for any particular study, since it is essential that the confidentiality of the
samples be maintained throughout the PT study. For the purposes of this requirement, "control" can
mean ownership or that the subcontracted service is performed under an agreement which
specifically ensures the ability of the provider to access and restrict the distribution of information
related to these services. Any subcontracted services shall be assessed by a PTOB/PTPA and meet
the same criteria as the PT Provider.
A.4 SAMPLE FORMULATION REVIEW
The PT Provider shall demonstrate to the PTOB/PTPA, by the submission of appropriate data, that
the sample formulation for which the PT Provider is seeking approval shall permit participating
laboratories to generate results that fall within the sample acceptance ranges established by the
NELAC Standing Committee on Proficiency Testing and meet the criteria of the "National Standards
for Water Proficiency Testing Studies, Criteria Document" (USEPA).
A.4.1 Release of Information
In support of the requirement in Section A.4, PTOBs/PTPAs shall treat all sample formulation
information submitted to them for review as the proprietary information of the PT Provider submitting
the information. Such formulation information shall not be released by a PTOB/PTPA without the prior
written consent of the PT Provider.
A.5 PROVIDER CONFLICT-OF-INTEREST REQUIREMENTS
PT Providers seeking approval shall document to the satisfaction of the PTOB/PTPA that they do not
have a conflict of interest with any laboratory seeking, or having, NELAP accreditation. PT Providers
shall notify the PTOB/PTPA of any actual or potential organizational conflicts of interest, including but
not limited to:
a)	Any financial interest in a laboratory seeking, or having, NELAP accreditation;
b)	The sharing of personnel, facilities or instrumentation with a laboratory seeking, or having,
NELAP accreditation.
The PT Provider is also required to inform all internal and contract personnel who perform work on
NELAC PT samples of their obligation to report personal and organizational conflicts of interest to the
PTOB/PTPA. The provider shall have a continuing obligation to identify and report any actual or
potential conflicts of interest arising during the performance of work in support of NELAC PT
programs. If an actual or potential organizational conflict of interest is identified during performance
of work in support of NELAC PT programs, the PT Provider shall immediately make a full disclosure
to the PTOB/PTPA. The disclosure shall include a description of any action which the provider has
taken or proposes to take, after consultation with the PTOB/PTPA, to avoid, mitigate or neutralize the
actual or potential conflict of interest. The PTOB/PTPA may reevaluate a PT Provider's approval
status as a result of unresolved conflict of interest situations. Any conflict of interest disputes between
the PT Provider and the PTOB/PTPA may be appealed to NELAP for a final determination.
A.5.1 Ban on Distribution of Samples
PT Providers shall not sell, distribute, or provide samples used in the NELAC PT program prior to the
conclusion of the study for which they were designed. Providers shall not sell, distribute, or provide
samples of identical formulation and concentration to those samples which they are currently using
in a NELAC study.
A.6 CONFIDENTIALITY OF PT STUDY DATA
The PT Provider shall demonstrate to the PTOB/PTPA that it has systems in place to ensure that the
confidentiality of data associated with NELAC PT samples and programs is not compromised. PT
Providers shall not release the assigned value of any sample currently being used in a NELAC PT
study prior to the conclusion of the study.
A.7 DATA REVIEW AND EVALUATION
The NELAP-designated PTOB/PTPA shall review the data from every PT Provider's studies to ensure
that acceptance limits used to evaluate laboratories are consistent with national standards as
established by NELAC. The PTOB/PTPA shall also evaluate the performance of the PT Providers
by monitoring and reporting, to both the providers and the NELAC Standing Committee on
Proficiency Testing, the pass/fail rates of all providers on all samples tested. A PTOB/PTPA is
required to investigate any PT Provider whose pass/fail rate is statistically different from the national
average.
A.8 COMPLAINTS & CORRECTIVE ACTION
Written complaints received by the PT Provider regarding technical or procedural aspects of the
studies it conducts shall be submitted to the PTOB/PTPA within 30 calendar days of receipt of the
complaint. The PT Provider shall resolve the complaint to the satisfaction of the PTOB/PTPA. The
PTOB/PTPA is the sole judge of the adequacy of the corrective action taken by the PT Provider. The
PTOB/PTPA shall provide NELAP with an annual summary of all PT Provider complaints received
during the prior year.
A.9 LOSS OF PROVIDER APPROVAL
PT Providers who fail to meet the requirements of these standards may be subject to loss of their
approval as a NELAC PT Provider. Providers may lose approval to provide individual sample sets
based upon review of PT study data by a PTOB/PTPA as required in Appendix A, Section A.7.
Similarly, PT Providers who fail to meet the requirements of Appendix A, Sections A.2 through A.6,
on a continuous basis may lose their approval as a PTOB/PTPA-approved PT Provider for all
samples.
A.9.1 Periodic Review of PT Providers
A PTOB/PTPA may, at any time, review the performance of any approved PT Provider against these
standards. Based upon this review, the PTOB/PTPA may decide that the approval status of a PT
Provider be revoked, adjusted, limited, or otherwise changed based upon failure to meet one or more
of the specified requirements.
A.9.2 Revocation of Approval
Should a PTOB/PTPA propose to revoke or suspend a provider's approval for failure to meet the
requirements of these standards, the PTOB/PTPA shall inform the provider of the reasons for the
proposed revocation or suspension and the procedures for appeal of such a decision. The due
process rights of the provider shall be protected during any revocation or suspension proceedings.
The final decision on the revocation or suspension of a provider's approval to supply PT samples for
NELAP accreditation resides with the Director of NELAP. If the provider loses PTOB/PTPA
approval, it shall lose NELAP approval to supply samples for the NELAC PT program.
A.10 NOTIFICATION OF SAMPLE INTEGRITY
The provider is responsible for notifying, within 30 calendar days of the study closing date, all
laboratories and Primary Accrediting Authorities when a particular analyte is determined not to meet
the requirements of Appendix B or is deemed of unacceptable quality for NELAC purposes.

-------
PROFICIENCY TESTING
APPENDIX B
PT SAMPLE DESIGN
& ACCEPTANCE GUIDELINES
Appendix B - PT SAMPLE DESIGN & ACCEPTANCE GUIDELINES
B.0 INTRODUCTION
An integral element of the NELAC PT program standards is the assurance of PT samples which are
of high quality, well documented, homogeneous, and stable. To meet the goals of NELAC, the PT
samples used in the program shall also offer all laboratories a consistent challenge. All PT samples
shall meet all applicable specifications of these standards.
B.1 SAMPLE FORMULATION APPROVAL
The PT Provider shall demonstrate the adequacy of sample formulation to the satisfaction of the
PTOB/PTPA. The criteria for formulation adequacy are that the sample shall provide the laboratories
under test with a challenge equivalent to that of similar samples for the same parameters from other
providers, and that the sample shall exhibit laboratory acceptance rates, measured as the provider's
percentage pass/fail performance, consistent with other samples used in the program for the same
parameters.
B.1.1 Adequacy of the Sample Formulation
The testing and verification protocol required to establish sample equivalency shall be agreed to by
both the PT Provider and the PTOB/PTPA on a case-by-case basis. It is the responsibility of the PT
Provider to demonstrate the adequacy of sample formulation to the satisfaction of the PTOB/PTPA.
B.1.2 PT Sample Composition
PT Providers may choose to leave one or more specific analyte(s) out of a PT sample, yet shall still
include those analyte(s) in the PT study to be counted and scored with the present analytes; the value
assigned to these unspiked analytes is zero. The guidelines in this section apply only to PT samples
that contain analyte groups as defined in the NELAC Field of Proficiency Testing tables located on
the NELAC website. Analytes from different groups may not be combined when determining the
minimum number of analytes that must be present in a sample. Although a PT Provider may choose
not to include every analyte, a minimum number of analytes shall be present in every PT sample.
The PT Provider shall prepare samples according to the following criteria:
a)	PT samples that are to be scored for one to ten analytes must include all of these analytes.
b)	PT samples that are to be scored for ten to twenty analytes must include at least ten of these
analytes or 80% of the total, whichever number is greater.
c)	PT samples that are to be scored for more than twenty analytes must include at least sixteen
of these analytes, or 60% of the total analytes, whichever number is greater.
d)	If, following (b) or (c) above, the percentage of the total number of analytes in the sample is a
fraction, the fraction shall be rounded up to the next whole number. For example: 16 analytes
x 0.80 = 12.8, which rounds up to 13 analytes in the sample.
e)	PT Providers shall use a random selection process to determine which parameters will be
assigned zero values within any given PT sample.
All other PT samples must contain all the analytes of interest within the concentration ranges as
required by this standard.
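
For illustration only, and not as part of this standard, the following Python sketch restates criteria (a) through (e) above: it computes the minimum number of analytes that must be present in an Analyte Group PT sample and randomly selects the analytes that receive assigned values of zero. All names are hypothetical.

import random

def minimum_spiked_count(total_scored_analytes: int) -> int:
    n = total_scored_analytes
    if n <= 10:
        return n                          # criterion (a): include every scored analyte
    if n <= 20:
        return max(10, (8 * n + 9) // 10) # criterion (b): at least 10 or 80%, fractions round up per (d)
    return max(16, (6 * n + 9) // 10)     # criterion (c): at least 16 or 60%, fractions round up per (d)

def choose_unspiked(analytes: list) -> set:
    # Criterion (e): random selection of the analytes that receive assigned values of zero.
    n_zero = len(analytes) - minimum_spiked_count(len(analytes))
    return set(random.sample(analytes, n_zero))

# Worked example from criterion (d): 16 scored analytes -> 80% of 16 = 12.8 -> 13 spiked.
print(minimum_spiked_count(16))           # 13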
B.1.3 PT Sample Matrix
Refer to the NELAC Glossary for definition of matrices. Note: PT samples are not currently available
for all matrices. Refer to the NELAC field of proficiency testing lists for sample availability.
B.1.4 PT Sample Composition for Solid Matrices
Soil PT samples shall be well-characterized natural soil and cannot contain 100% sand.
B.2 VERIFICATION OF ASSIGNED VALUE
All PT samples used for obtaining or maintaining NELAP accreditation shall be analyzed by the PT
Provider prior to shipment to the laboratories to ensure suitability for use in the program. The
assigned value of the sample shall be used to establish acceptance criteria, and it shall be verified
by analysis. PT Providers shall verify the assigned value by direct analysis against National Institute
of Standards and Technology (NIST) Standard Reference Materials (SRM), if a suitable NIST SRM
is available for use. If a NIST SRM is not available then verification shall be performed against an
independently prepared calibration material. An independently prepared calibrant is one prepared
from a separate raw material source, or one prepared and documented by a source external to the
provider.
B.2.1 Relative Standard Deviation of Verification Analysis
The method used by the PT Provider for verification analysis shall have a relative standard deviation
of not more than 50% of the relative standard deviation predicted at the assigned value by the
laboratory acceptance criteria being used by NELAC for each parameter. The relative standard
deviation of the provider's verification method shall be established by a method validation study, and
the suitability for use shall be approved by the NELAP designated Proficiency Testing Oversight Body
(PTOB)/Proficiency Test Provider Accreditor (PTPA).
B.2.2 Quality Control Check of the Assigned Value
The assigned value for every parameter in all PT samples shall be verified by analysis. The assigned
value of the analyte is verified if the mean of the verification analyses is within 1.5 standard
deviations, calculated as described in Section C.1.1.1 or C.1.1.2, of either a) the assigned value
if an unbiased verification method is used or b) the mean value for the analyte as calculated in
Sections C.1.1.1 or C.1.1.2 if a biased method is used. The standard deviation of the verification
analyses also shall be less than one standard deviation as calculated in Sections C.1.1.1 or C.1.1.2.
For analytes that are evaluated using fixed percentages as defined in Section C.1.1.1, standard
deviations are calculated by assuming that the fixed percentage is equal to two standard deviations.
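
For illustration only, and not as part of this standard, the following Python sketch restates the quality control check above. The acceptance standard deviation would be calculated as described in Section C.1.1.1 or C.1.1.2; here it is simply passed in, and all names are hypothetical.

from statistics import mean, stdev
from typing import Optional

def assigned_value_verified(verification_results: list,
                            assigned_value: float,
                            acceptance_sd: float,
                            unbiased_method: bool = True,
                            group_mean: Optional[float] = None) -> bool:
    # The mean of the verification analyses must be within 1.5 acceptance standard
    # deviations of the assigned value (unbiased method) or of the Section C.1.1.1/
    # C.1.1.2 mean (biased method); the standard deviation of the verification
    # analyses must also be less than one acceptance standard deviation.
    target = assigned_value if unbiased_method else group_mean
    mean_ok = abs(mean(verification_results) - target) <= 1.5 * acceptance_sd
    sd_ok = stdev(verification_results) < acceptance_sd
    return mean_ok and sd_ok

def sd_from_fixed_percentage(assigned_value: float, fixed_fraction: float) -> float:
    # For analytes with fixed-percentage limits, the fixed percentage is treated
    # as two standard deviations.
    return (fixed_fraction * assigned_value) / 2.0

print(sd_from_fixed_percentage(10.0, 0.20))                          # 1.0
print(assigned_value_verified([9.8, 10.1, 10.2, 9.9], 10.0, 1.0))    # True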
B.3 HOMOGENEITY TESTING
PT sample homogeneity is essential to ensuring that all laboratories are treated fairly. Therefore, the
purpose of the homogeneity testing procedure is to establish at the 95% confidence level that all
samples distributed to the laboratories have the same assigned value for every parameter to be
evaluated. Homogeneity testing is required on all PT samples prior to sample shipment to the
laboratories.
B.3.1 Homogeneity Testing Procedure
The homogeneity of the samples shall be established using a generally accepted statistical
procedure. The procedure selected by the PT Provider shall be capable of evaluating the relative
consistency of each analyte across the production run, and shall be performed on the final packaged
samples. The procedure shall establish at the 95% confidence level that the assigned value is
consistent across the production run. Samples, or parameters, which fail to pass the homogeneity
testing criteria cannot be used in the NELAC PT program to evaluate laboratories.
B.3.2 Suitable Homogeneity Testing Procedures
A suitable homogeneity testing procedure shall be capable of comparing the between-sample and
within-sample standard deviations across the PT Provider's packaging run, and shall ensure comparability
with 95% confidence. Suitable homogeneity testing procedures are available in both ISO Guide 35
for the Certification of Reference Materials and in the ISO Reference Material Committee (REMCO)-
Association of Official Analytical Chemists (AOAC) Harmonized Protocol for the Proficiency Testing
of Analytical Laboratories. However, the homogeneity testing procedure used by the PT Provider
shall be approved for use by the PTOB/PTPA.
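
For illustration only, and not as part of this standard, the following Python sketch shows a generic between-sample versus within-sample comparison (a one-way analysis of variance) of the kind described above. It assumes the scipy library is available; the actual procedure used by a PT Provider must be approved by the PTOB/PTPA, and this sketch is not that procedure.

from scipy.stats import f_oneway    # assumed available; any equivalent one-way ANOVA would do

def appears_homogeneous(replicate_sets, alpha: float = 0.05) -> bool:
    # replicate_sets: replicate measurements of one analyte from several containers
    # selected across the packaging run, e.g. [[10.1, 10.0], [9.8, 10.0], [10.2, 10.1]].
    # Returns True when the between-container variation is not statistically
    # significant at the 95% confidence level.
    _, p_value = f_oneway(*replicate_sets)
    return p_value > alpha

print(appears_homogeneous([[10.1, 10.0], [9.8, 10.0], [10.2, 10.1]]))    # True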
B.4 STABILITY TESTING
The samples used in the NELAC PT program shall be verified as stable for the period of each study.
Therefore, the stability of all samples and parameters shall be established by the PT Provider
following the close of data submission from the laboratories. The samples are considered stable for
the period of the study if the mean analytical value as determined after the study for each parameter
falls within the 95% confidence interval calculated from the prior-to-shipment verification testing used
to establish the assigned value. The testing procedure used for stability testing shall be approved for
use by the PTOB/PTPA.
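
For illustration only, and not as part of this standard, the following Python sketch restates the stability criterion above. The 95% confidence interval would normally be computed from the prior-to-shipment verification data; here it is passed in directly, and all names are hypothetical.

from statistics import mean

def stable_for_study(post_study_results: list,
                     verification_ci_low: float,
                     verification_ci_high: float) -> bool:
    # True when the post-study mean for a parameter falls within the 95% confidence
    # interval established by the prior-to-shipment verification testing.
    return verification_ci_low <= mean(post_study_results) <= verification_ci_high

print(stable_for_study([9.9, 10.2, 10.0], 9.8, 10.3))    # True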
B.5 DATA REPORTING BY PT PROVIDERS
The results of sample assigned value verification, homogeneity, and stability testing shall be available
to the participating laboratories. All data developed by the provider in support of verification testing,
homogeneity testing, and stability analysis shall be provided to any laboratory participating in the
program upon request after the close of the study. Providers shall supply PT data to the Primary
Accrediting Authorities, as per Section 2.6, in a format acceptable to the Primary Accrediting
Authority.
B.5.1 Verification and Homogeneity Reports
The data developed by the PT Provider in support of verification and homogeneity testing shall be
supplied in summary format to the PTOB/PTPA in an electronic format to be determined by the
PTOB/PTPA. Verification and homogeneity data shall be supplied to the PTOB/PTPA prior to sample
distribution to the laboratories.
B.5.2 Laboratory Data and Stability Reports
All summary data from the laboratories and the results of stability testing shall be provided to the
PTOB/PTPA in an electronic format to be determined by the PTOB/PTPA within 30 calendar days of
the close of the study.

-------
PROFICIENCY TESTING
APPENDIX C
PT ACCEPTANCE CRITERIA
AND
PT PASS/FAIL CRITERIA
Appendix C - PT ACCEPTANCE CRITERIA AND PT PASS/FAIL CRITERIA
C.0 PURPOSE, SCOPE, AND APPLICABILITY
This appendix defines the criteria to be used by any entity which seeks to participate as a NELAP-
designated PTOB/PTPA-approved Proficiency Test Provider for scoring the results obtained from the
analyses of samples in any NELAC PT study. The PT Providers shall submit all laboratories'
performance rating(s) to the Primary Accrediting Authority, as described in Chapter Two of the
NELAC standards, to be used as a tool for determining a laboratory's accreditation status. PT
acceptance limits and pass/fail criteria are established on a field of proficiency testing basis.
C.1 ANALYTE ACCEPTANCE LIMITS
Acceptance limits are established for each analyte as described in this appendix. The tables
containing all analyte acceptance limits established by the NELAC Standing Committee on Proficiency
Testing and from the USEPA Criteria Document shall be posted on the NELAC Website and reviewed
annually by the NELAC Standing Committee on Proficiency Testing.
C.1.1 Analyte Acceptance Limit Categories
Acceptance limits are separated into two categories. Results for analytes with acceptance limits
determined as described in Sections C.1.1.1 and C.1.1.2 shall be used in the determination of a
laboratory's field of proficiency testing pass/fail evaluation. Results for analytes with acceptance limits
determined as described in Section C.1.1.3 shall not be used as part of the field of proficiency testing
acceptable/not acceptable evaluation.
C.1.1.1 Drinking Water, Waste Water, and Ambient Water Analytes with USEPA Established
Acceptance Limits
PT Providers shall utilize the proficiency test acceptance limits that have been established by USEPA
in the "National Standards for Water Proficiency Testing, Criteria Document" where they apply. The
"National Standards for Water Proficiency Testing, Criteria Document" is incorporated into this
appendix by reference.
C.1.1.2 Analytes with Acceptance Limits Established by the NELAC Standing Committee on
Proficiency Testing
For analytes not included in the "National Standards for Water Proficiency Testing, Criteria
Document," Proficiency Test Providers shall use acceptance limits established by the NELAC
Standing Committee on Proficiency Testing; these limits shall be made available to PTOB/PTPA-approved
PT Providers by the PT Committee Chair or the Director of NELAP. Data from sources such as
USEPA Performance Evaluation (PE) studies, interlaboratory results from professional organizations
such as ASTM, other Proficiency Test Providers, and commercial and non-profit organizations shall be
used to establish the evaluation criteria. All evaluation criteria shall be approved by the NELAC
Standing Committee on Proficiency Testing prior to use by a PTOB/PTPA-approved PT Provider.
C.1.1.3 Experimental Data: Analytes without Promulgated Acceptance Limits or Established
Regression Equations
For those analytes not included in categories C.1.1.1 or C.1.1.2, e.g., newly regulated analytes, or
analytes in a matrix that have not been fully evaluated in interlaboratory studies, NELAC acceptance
limits shall be established only after interlaboratory data has been collected for a minimum of one
year unless the NELAC Standing Committee on Proficiency Testing determines that sufficient data
have been collected in less time. The data obtained during the one-year period shall be referred to
as "experimental data". The NELAC Standing Committee on Proficiency Testing shall derive
regression equations to be used to establish acceptance limits for analytes in the experimental
category after sufficient data have been collected. The laboratory shall receive a copy of its own
experimental data from the PT Provider at the conclusion of the PT study.
C.2 ACCEPTABLE PT RESULTS FOR CHEMICAL ANALYTES IN POTABLE WATER AND NON-
POTABLE WATER PT SAMPLES
A laboratory's PT analyte result is acceptable when it falls within the regulatory promulgated
acceptance limits (Section C.1.1.1). For Section C.1.1.2 analytes, PT Providers shall use the PT
sample's verified assigned value and the applicable regression equations to determine the mean and standard
deviation. Acceptance limits shall be set at the mean ± two standard deviations for potable water
analytes and the mean ± three standard deviations for non-potable water analytes. A result is
acceptable when it falls within these derived acceptance limits.
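
For illustration only, and not as part of this standard, the following Python sketch derives acceptance limits for a Section C.1.1.2 analyte from a given mean and standard deviation, per the rule above. The mean and standard deviation themselves would come from the committee-approved regression equations; all names are hypothetical.

def acceptance_limits(mean_value: float, std_dev: float, potable: bool):
    # +/- 2 standard deviations for potable water analytes,
    # +/- 3 standard deviations for non-potable water analytes.
    k = 2.0 if potable else 3.0
    return (mean_value - k * std_dev, mean_value + k * std_dev)

def result_is_acceptable(result: float, limits) -> bool:
    low, high = limits
    return low <= result <= high

limits = acceptance_limits(10.0, 0.5, potable=True)    # (9.0, 11.0)
print(result_is_acceptable(10.8, limits))              # True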
C.3 NOT ACCEPTABLE PT RESULTS FOR POTABLE WATER AND NON-POTABLE WATER
PT SAMPLES
A laboratory's result for any analyte is considered unacceptable if it meets any of the following criteria:
a)	the result falls outside the acceptance limits;
b)	the laboratory reports a result for an analyte not present in the PT sample (i.e., a false positive);
or,
c)	the laboratory does not withdraw from a study as described in Section 2.7.7, and fails to submit
its results to the PT Provider on or before the deadline for the PT study.
C.4 ADDITIONAL REQUIREMENTS FOR PT PROVIDERS
PT Providers shall examine all data sets for bimodal distributions and/or situations where results from
a given method have disproportionately large failure rates, and shall report such anomalies to the
Proficiency Testing Oversight Body/Proficiency Test Provider Accreditor. If a bimodal or multimodal
distribution is found and acceptance criteria are calculated using robust statistical analysis, data
should be scored by method-specific robust statistical analysis. All proficiency test data are to be
submitted to the PTOB/PTPA in the format specified by the PTOB/PTPA and shall be reviewed
annually by the NELAC Standing Committee on Proficiency Testing for the purpose of revising existing and establishing new
evaluation criteria.
C.4.1 Additional Matrix/Analyte Groups
Additional matrices and/or analytes may be added to the NELAC PT fields of accreditation at the
request of any Accrediting Authority, USEPA program office, or PTOB/PTPA-approved PT Provider.
The request for the addition of an analyte must include at a minimum ten sets of interlaboratory data
on the analyte in the particular matrix. Each data set must contain a minimum of twenty valid data
points. The NELAC Standing Committee on Proficiency Testing shall review the data and develop
an initial set of laboratory acceptance limits based upon the needs of the Accrediting Authorities,
USEPA, and the laboratories. Laboratory acceptance limits developed by the PT Committee on any
new matrix/analyte combinations shall be reviewed annually by the PT Committee. The purpose of
this annual review is to ensure that the limits represent the actual capabilities of the laboratories. For
any additional matrix or analyte groups added to the NELAC field of proficiency testing by the NELAC
PT Committee, laboratories shall complete two successful PT studies within 12 months of the date
the additional groups were added.
C.5.0 NELAC PT Study Pass/Fail Criteria
NELAC PT studies are designed to meet the requirements of Chapter 2 and associated appendices.
Once data acceptability has been determined as described in Sections C.1 through C.3 of this
appendix, the laboratory's PT "Pass" or "Fail" evaluation is determined as described in this section.
Pass/Fail criteria are used when groups of analytes are evaluated as a unit for the laboratory's initial
demonstration of proficiency.
C.5.1 Analyte Group PT Studies
Analyte Group PT Studies are those that are analyzed using methods in which the ability to correctly
identify and quantitate a series of analytes is indicative of the laboratory's ability to correctly determine
the presence or absence of similar analytes. Analyte groups for proficiency testing are defined in the
NELAC Field of Proficiency Testing tables located on the NELAC website.
C.5.2 Promulgated USEPA Pass/fail Criteria
In all cases, promulgated EPA pass/fail criteria, e.g., drinking water volatiles as listed in 40 CFR
141.61(a), subsection (m)(1), will be used as NELAC PT pass/fail criteria as applicable. The criteria
described in Section C.5.3 shall be used in the absence of promulgated USEPA pass/fail guidelines.
C.5.3 Pass/fail Criteria For Analyte Group PT Samples
Proficiency testing pass/fail evaluations for Analyte Group PT studies shall be determined as follows.
To receive a score of "Pass", a laboratory must produce "Acceptable" results as defined in Section
C.1 for 80% of the analytes in an Analyte Group PT Study. Greater than 20% "Not Acceptable"
results shall result in the laboratory receiving a score of "Fail" for that group of analytes. For example,
a laboratory must report all "Acceptable" results for an Analyte Group PT Study containing 1-4
analytes, may report no more than one "Not Acceptable" result for a study containing 5-9 analytes,
and no more than two "Not Acceptable" results for a study containing 10-14 analytes. A "Not Acceptable" result for the
same analyte in two out of three consecutive PT studies shall also result in the laboratory receiving
a score of "Fail" for that analyte. The PCB analyte group is exempt from the 80% pass/fail criteria.
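
For illustration only, and not as part of this standard, the following Python sketch applies the 80% rule above to a list of per-analyte scores. The PCB analyte group exemption, the consecutive-study rule for a single analyte, and the promulgated criteria of Section C.5.2 are not modeled; all names are hypothetical.

def analyte_group_score(statuses: list) -> str:
    # statuses: per-analyte "Acceptable" / "Not Acceptable" scores for one study.
    # "Fail" when strictly more than 20% of the scored analytes are "Not Acceptable";
    # integer arithmetic avoids floating-point edge cases at exactly 20%.
    not_acceptable = sum(1 for s in statuses if s == "Not Acceptable")
    return "Fail" if 5 * not_acceptable > len(statuses) else "Pass"

# Examples matching the text: no misses allowed for 1-4 analytes, one for 5-9, two for 10-14.
print(analyte_group_score(["Acceptable"] * 3 + ["Not Acceptable"]))        # Fail (1 of 4)
print(analyte_group_score(["Acceptable"] * 8 + ["Not Acceptable"]))        # Pass (1 of 9)
print(analyte_group_score(["Acceptable"] * 12 + ["Not Acceptable"] * 2))   # Pass (2 of 14)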

-------
PROFICIENCY TESTING
APPENDIX D
PROFICIENCY TESTING
OVERSIGHT BODY/
PROFICIENCY TEST PROVIDER
ACCREDITOR
Appendix D - PROFICIENCY TESTING OVERSIGHT BODY/
PROFICIENCY TEST PROVIDER ACCREDITOR
D.0 PURPOSE, SCOPE, AND APPLICABILITY
This appendix defines the qualifications, scope of responsibilities and requirements for a NELAP
designated Proficiency Testing Oversight Body (PTOB)/Proficiency Test Provider Accreditor (PTPA)
as defined in Section 2.2.2 of the NELAC document. In addition to complying with the requirements
of this appendix, a PTOB/PTPA, for this oversight function, shall comply with the applicable
requirements described in Chapter 2 and associated Appendices A (PT Provider Approval Criteria),
B (PT Sample Design and Acceptance Guidelines), and C (PT Acceptance Criteria and PT Pass/Fail
Criteria). Organizations meeting the requirements of this standard and its appendices, as determined
by any NELAC-recognized Accrediting Authority, may be nominated to the NELAC Board of Directors
to be listed as a NELAP PTOB/PTPA.
D.1 TECHNICAL AND ADMINISTRATIVE QUALIFICATIONS
An organization shall demonstrate to the NELAC Standing Committee on Proficiency Testing by the
submission of a current Statement of Qualifications that it has the technical expertise, administrative
capacity, and financial resources sufficient to implement and operate a national program of PT
Provider evaluation and oversight. In the event that the organization is not a nationally or
internationally recognized authority, the NELAC Standing Committee on Proficiency Testing reserves
the right to request further documentation detailing the organization's qualifications. The organization
shall meet the following general requirements:
a)	Demonstrate the capability to manage and evaluate complex environmental reference materials
in a variety of matrices;
b)	Demonstrate expertise in statistical applications as related to large interlaboratory performance
evaluation programs;
c)	Demonstrate the capability to conduct on-site audits of PT Providers;
d)	Demonstrate the capability to conduct technical reviews of Initial Applications;
e)	Demonstrate a knowledge and understanding of ISO 9001, ISO Guides 34 and 43, and Chapter
Two of the NELAC standards, including Appendices A, B, and C.
D.2 PTOB/PTPA RESPONSIBILITIES REGARDING INITIAL ASSESSMENT OF PT
PROVIDERS
PTOB/PTPA responsibilities are described in this section. The primary responsibility of a PTOB/PTPA
is the oversight and ongoing monitoring and evaluation of the PT Providers. The oversight activities
of a PTOB/PTPA shall be designed to ensure that the PT Provider meets the requirements specified
in Chapter Two and Appendices A, B and C. Any variations from these requirements shall be
approved by the NELAC Standing Committee on Proficiency Testing prior to a body being approved
as a NELAC PTOB/PTPA. All activities described herein shall be conducted by a PTOB/PTPA.
D.2.1 Development of Standard Operating Procedures and Forms
PTOBs/PTPAs shall develop the Standard Operating Procedures (SOPs) necessary to conduct the
PT Provider evaluation process. These documents shall be based upon the requirements of Chapter
Two of the NELAC standards and the associated Appendices A, B, and C. The NELAC Standing
Committee on Proficiency Testing has the authority to review and approve, as necessary, the SOPs
developed by a PTOB/PTPA.
D.2.1.1 SOP(s) for the Assessment Process
The PTOB/PTPA shall develop and implement SOP(s) including but not limited to: the initial
application submittal and review process, on-site inspection, submittal of final reports to NELAP, the
procedures for determining that a PT Provider's approval be revoked, the procedures for appealing
approval determinations, and any other procedures deemed necessary by NELAC.
D.2.1.2 Initial Application
A PTOB/PTPA shall develop the initial application to be submitted by organizations applying
for approval as PT Providers of NELAC samples. The application shall include questions regarding
the qualifications of the organization seeking approval. In addition to completing the initial application
process, a PTOB/PTPA shall require that the PT Provider submit copies of its current ISO 9001
registration certificate or any other documents which detail the quality systems required by the
provisions of Chapter Two and associated appendices.
D.2.1.3 SOP(s) for On-site Inspections and Checklist(s)
A PTOB/PTPA shall develop SOP(s) for conducting consistent, effective, on-site inspections of PT
Providers. The SOP shall include policies which describe the circumstances for conducting any
additional inspections, and circumstances for determining whether on-site inspections shall be
announced or unannounced. A PTOB/PTPA shall develop standard, consistent checklist(s) to be used
during any and all inspections of PT Providers.
D.2.2 Initial Application Review and On-site Inspections
A PTOB/PTPA shall follow the procedures described in this section for the review of applications and
on-site inspections of any candidate PT Provider.
a)	A PTOB/PTPA shall review the initial application documents, described in D.2.1.2, for
compliance with the PT Provider qualifications described in Appendix A and other applicable
documents.
b)	A PTOB/PTPA shall review the sample designs used by the PT Provider for compliance with
Appendix B and other applicable documents.
c)	A PTOB/PTPA shall review the PT analyte and sample scoring procedures used by the PT
Provider for compliance with Appendix C and other applicable documents.
d)	Following the review of the Initial Application and associated documents, a PTOB/PTPA shall
conduct an on-site inspection of the PT Provider. The PT Provider shall be provided with
checklist(s) to be used during the inspection as part of the initial application process.
e)	Following the inspection, a PTOB/PTPA shall conduct an exit meeting with the PT Provider,
which shall include discussion of deficiencies and discrepancies found; however, a PTOB/PTPA
may further revise the findings after the closing of the exit meeting, if necessary.
The inspection shall include, at a minimum:
1)	Review of the quality system for adherence to the requirements of Appendices A, B and C;
2)	Review of staff qualifications and technical expertise necessary to produce acceptable
proficiency testing samples;
3)	Review of the sample manufacturing and verification procedures to ensure that the
requirements of Appendices A and B are met;
4)	Review of the procedures in place to ensure that all personnel are aware of and abide by
standards of conduct for PT Providers and confidentiality of sample values; and,
5)	Review of data reporting systems to ensure that the requirements of Appendix C are met
within the time periods specified in Chapter Two.
f)	A PTOB/PTPA shall send a draft report to the PT Provider after the completion date of the
inspection. A PTOB/PTPA shall allow the PT Provider to review and comment on the draft if the
PT Provider finds any discrepancies and determines that revisions are necessary. A
PTOB/PTPA shall then submit a final inspection report to the PT Provider after the completion
of the on-site inspection. The final report may only contain discrepancies and findings identified
during the on-site inspection or discussed during the exit briefing.
g)	A PTOB/PTPA shall allow the provider to submit their response to the report. In order for the
provider's response to be considered acceptable, a PTOB/PTPA shall require that it include a
description of corrective actions necessary to meet the criteria of Chapter Two, and Appendices
A, B, and C.
D.3 PTOB/PTPA RESPONSIBILITIES REGARDING APPROVAL OF PT PROVIDERS
A PTOB/PTPA shall utilize the appropriate final report and associated documents submitted by the
PT Provider to grant or deny approval to that provider.
D.4 PTOB/PTPA RESPONSIBILITIES FOR ONGOING OVERSIGHT OF PT PROVIDERS
A PTOB/PTPA shall conduct ongoing oversight of all approved PT Providers. The oversight shall
include at a minimum:
a)	the use of referee laboratories to verify the concentrations of analytes in randomly selected PT
Provider samples;
b)	the statistical monitoring of PT Provider's study data to detect occurrences which indicate
samples of unacceptable quality, i.e., failure rates that exceed expected norms, analyte standard
deviations that exceed expected intervals, and analyte mean recoveries which are significantly
above or below historical trends. The ongoing monitoring criteria to be used by a PTOB/PTPA
shall be developed by NELAC.
c)	biennial on-site inspections of the PT Provider, and review and monitoring of critical operational
parameters of the PT Provider, e.g., a change in senior management or sale of the company.
d)	on-site inspections of the PT Provider for cause.
Based upon the results of its ongoing oversight, the PTOB/PTPA may determine that the provider's
approval status be reevaluated.
D.5 DEVELOPMENT AND MAINTENANCE OF A COMPREHENSIVE PT DATABASE
A comprehensive PT database shall be developed and maintained by the PTOB(s)/PTPA(s) in
conjunction with NELAC.
D.6 COMPLAINTS AND CORRECTIVE ACTION
A PTOB/PTPA shall evaluate all complaints that it receives regarding either approved or candidate
PT Providers. If the PTOB/PTPA determines that a complaint warrants investigation, the PTOB/PTPA
shall notify the provider of the complaint. The PT Provider is required to resolve the complaint to the
satisfaction of the PTOB/PTPA. A PTOB/PTPA shall provide to the NELAC Standing Committee on
Proficiency Testing a summary of all PT Provider complaints received the previous year.
D.7 LIST OF APPROVED PT PROVIDERS
A PTOB/PTPA shall maintain a list of approved PT Providers. The list shall be maintained on a
continuing basis on an electronic bulletin board or similar means and shall be readily available to
laboratories seeking NELAP accreditation, State Accrediting Authorities and other interested parties.
PT Providers shall agree to abide by the provisions of NELAC regarding the advertising and
marketing use of the designation, "NELAP-designated PTOB/PTPA Approved Proficiency Test
Provider".
D.8 SPONSORSHIP OF ANNUAL NELAC PROFICIENCY TESTING CAUCUS
The PTOB(s)/PTPA(s) shall, in conjunction with NELAC, sponsor an annual NELAC Proficiency
Testing Caucus. The Caucus shall, if possible, be held in conjunction with the annual NELAC
meeting. The purpose of the Caucus is to provide a forum for PT Providers, Accrediting Authorities,
laboratories, federal agencies, and other interested parties to exchange information regarding the PT
study results of the previous year. The Caucus shall include technical presentations and open
discussions on means to improve the proficiency testing aspect of NELAC with a continuing goal of
improving the quality of environmental data generated by the NELAC accredited laboratories.
D.9 PTOB/PTPA ETHICS
This section describes the overall ethics and standards of conduct that shall be adhered to for a
PTOB/PTPA to implement and administer a successful PT Provider oversight program. A
PTOB/PTPA shall serve as an impartial body designed to objectively evaluate information about PT
Providers and use this information to make sound determinations regarding providers' approval
status. A PTOB/PTPA shall be able to certify to any interested party that it is free of any
organizational or financial conflict of interest that would prevent it from complying with the
requirements of Appendix D. A PTOB/PTPA shall remain unbiased in evaluating information gathered
and received, including inspection reports, referee sample results, complaints, and any other
information obtained regarding a PT Provider. The PTOB/PTPA shall evaluate all information
gathered and received about a provider related to providing NELAC PT samples, determine which
information is relevant to the provider's approval status, and provide that information to NELAP, the
Primary Accrediting Authorities, the laboratories, and the public as appropriate.

-------
NELAC
Proficiency Testing
Appendix D
Revision 16
May 25, 2001
Page 2D-5 of 5
D.10 CONFIDENTIALITY
A portion of the information provided to a PTOB/PTPA by the PT Provider in the course of its
inspection and oversight activities may be proprietary in nature. A PTOB/PTPA shall agree to
maintain the confidentiality of proprietary information provided to it by the PT Provider.

-------
PROFICIENCY TESTING
APPENDIX E
MICROBIOLOGY

-------
NELAC
Proficiency Testing
Appendix E
Revision 16
May 25, 2001
Page 2E-1 of 2
Appendix E - MICROBIOLOGY
E.0 PURPOSE
This appendix outlines the requirements for microbiological proficiency testing under the Safe Drinking
Water Act (SDWA) and the Clean Water Act (CWA). Microbiological testing for other USEPA
programs shall be added as required. Semi-annual proficiency testing is required per the schedule
contained in Section 2.4.
E.1 SAMPLES
E.1.1 SDWA Samples
PT Providers shall present samples either as full volume samples or preparations easily reconstituted
to full volume samples. For the SDWA, there shall be ten 100+ mL samples (as presented or after
reconstitution) for the qualitative determination (Presence/Absence) of total coliform and fecal coliform
(or E. coli). Sample sets which are provided to the laboratories shall contain bacteria that produce
the following:
Verification as total and fecal coliforms (E. coli).
Verification as total coliforms, but not as fecal coliforms.
Bacterial contaminants which shall not verify as total or fecal coliforms.
Furthermore, each set shall contain the following samples:
One to four samples containing an aerogenic strain of Escherichia coli for total and fecal coliform
positive results using all USEPA approved methods.
One to four samples containing Enterobacter sp. or other microorganisms ensuring a total
coliform positive and fecal coliform negative result using all USEPA approved methods.
One to four samples containing Pseudomonas sp. or other microorganisms ensuring a total and
fecal coliform negative result using all USEPA approved methods.
One to four blank samples.
Optionally, one sample for the quantitative determination of Heterotrophic Plate Count.
Sample sets for qualitative analysis shall be randomly composed of samples that are Total coliform
absent, Total coliform only present and Fecal coliform (E. coli) present.
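
For illustration only, the following Python sketch composes a ten-sample SDWA Presence/Absence set
that satisfies the one-to-four counts listed above (the optional Heterotrophic Plate Count sample is not
modeled). The category labels and the simple rejection-sampling approach are assumptions made for the
example.

    # Illustrative sketch of composing one SDWA P/A sample set; labels are assumptions.
    import random

    CATEGORIES = [
        "E. coli (total and fecal coliform positive)",
        "Enterobacter sp. (total coliform positive, fecal negative)",
        "Pseudomonas sp. (total and fecal coliform negative)",
        "blank",
    ]

    def compose_sample_set(set_size=10):
        """Draw 1-4 samples per category so the set totals set_size, then randomize the order."""
        while True:
            counts = [random.randint(1, 4) for _ in CATEGORIES]
            if sum(counts) == set_size:
                break
        samples = [label for label, n in zip(CATEGORIES, counts) for _ in range(n)]
        random.shuffle(samples)   # random composition of the shipped set
        return samples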
E.1.2 CWA Samples
For the CWA, one sample shall be provided for the quantitative determination of Total coliform or
Fecal coliform. Providers may require laboratories to analyze samples during a fixed time period after
sample shipment or at any time during the testing period which shall not exceed the time limit set in
Chapter Two.
E.2 SAMPLE PREPARATION AND QUALITY CONTROL
Proficiency test sample providers shall select bacterial strains and holding media that produce the
appropriate biochemical reactions for all approved analytical methods. This shall be documented by
analyses performed by the provider prior to sample shipment. The provider shall also demonstrate

-------
NELAC
Proficiency Testing
Appendix E
Revision 16
May 25, 2001
Page 2E-2 of 2
that the samples are stable by analysis of a randomly selected set either after the study closing date
or in the case of a study with a fixed testing period, on the last working day of the testing period.
E.3 SCORING
E.3.1 Qualitative Analyses, SDWA Samples
Participating laboratory results shall be considered Acceptable or Unacceptable when compared to
the known presence or absence of total coliform or fecal coliform (or E. coli) bacteria. Passing shall
be considered as nine out of ten samples having acceptable results, and no false negatives reported.
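
For illustration only, the scoring rule above can be expressed as a short Python check; the data
structure used here is an assumption made for the example.

    # Illustrative sketch of the SDWA qualitative pass rule (9 of 10 acceptable, no false negatives).
    def passes_sdwa_qualitative(results):
        """results: one record per sample with 'acceptable' and 'false_negative' booleans."""
        acceptable_count = sum(r["acceptable"] for r in results)
        any_false_negative = any(r["false_negative"] for r in results)
        return acceptable_count >= 9 and not any_false_negative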
E.3.2 Quantitative Analyses
Quantitative result data sets shall be evaluated by analytical method using standard statistical
analysis with outlier rejection. Most Probable Number data shall be transformed to logs prior to
statistical analysis. Acceptable results are those that are within the interval defined by the mean plus
or minus two standard deviations for SDWA analytes or within the 99% confidence limits as set by
the mean, standard deviation and set size (n) for their respective data set for all other analytes.
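
For illustration only, the following Python sketch computes an acceptance interval as described above.
Outlier rejection is assumed to have been applied before the call, and the multiplier used for the 99%
limits (which depends on the set size n) is supplied as an input rather than restated here.

    # Illustrative sketch; outlier rejection and the exact 99% multiplier are outside this example.
    import math
    from statistics import mean, stdev

    def acceptance_window(values, is_mpn=False, sdwa=True, k99=None):
        """Return (low, high) acceptance limits for a set of participant results."""
        data = [math.log10(v) for v in values] if is_mpn else list(values)  # MPN data evaluated in logs
        m, s = mean(data), stdev(data)
        k = 2.0 if sdwa else k99     # SDWA: mean +/- 2 SD; other analytes: 99% limits from mean, SD, n
        low, high = m - k * s, m + k * s
        if is_mpn:                   # report limits back in the original MPN units
            low, high = 10 ** low, 10 ** high
        return low, high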
E.3.2.1 Requirement for Quantitative Data Set Size
Each PT Provider's microbiological data set shall be comprised of at least 20 valid data points for
each method evaluated. Sample sets of less than 20 data points may be used only with the approval
of the PTOB/PTPA.

-------
PROFICIENCY TESTING
APPENDIX F
ENVIRONMENTAL TOXICOLOGY

-------
NELAC
Proficiency Testing
Appendix F
Revision 16
May 25, 2001
Page 2F-1 of 3
Appendix F - ENVIRONMENTAL TOXICOLOGY
F.0 PURPOSE, SCOPE, AND APPLICABILITY
This appendix defines the criteria for applying the proficiency testing (PT) program to the following
environmental toxicology programs: 1) whole effluent toxicity, 2) sediment toxicity, and 3) soils
toxicity.
F.1 RATIONALE
Accreditation for environmental toxicology testing laboratories shall be based on Proficiency
Testing and on-site audits, the latter including but not limited to an evaluation of personnel
qualifications, facility acceptability, quality system and standard operating procedures, status of
data/reports generated and routine standard toxicant testing. Proficiency Testing provides a
snapshot of the laboratory's capability; however, due to the number of variables inherent to
environmental toxicology testing, it cannot carry the same weight as PT samples for chemical
analytes. PT samples shall be comprised of unknown concentrations of EPA's historical
reference toxicant materials. Every effort shall be made by the PTOB/PTPA working together with
the providers to reduce the number of variables in each method (i.e., organism age, etc.) while
following the routine language of the EPA protocols.
F.2 LABORATORY ENROLLMENT IN PROFICIENCY TESTING PROGRAMS
F.2.1 Required Level of Participation
Laboratories seeking accreditation for environmental toxicology shall participate in at least one PT
study per year for each method code as designated (method code includes matrix, organism,
exposure system, and endpoint).
F.2.2 Requirements for Laboratory Testing of PT Study Samples
a)	Analyze within 30 calendar days of sample receipt; report results within 30 calendar days of
completion.
b)	Samples shall be analyzed in the same manner as routine samples within the limits of the
method code - as close to "real world" testing as possible.
F.3 PT CRITERIA FOR LABORATORY ACCREDITATION
F.3.1 Initial and Continuing Accreditation
Laboratories which seek to obtain or maintain accreditation for environmental toxicology shall
successfully complete at least one PT sample per year for a given field of accreditation (i.e., not
more than 12 months apart) and at least 30 calendar days apart (i.e., participation in a second
round or remedial study may not occur within 30 calendar days of the first or failed study). Failure
to meet the annual schedule shall be regarded as a failed study. Results other than
acceptable/not acceptable may apply.

-------
NELAC
Proficiency Testing
Appendix F
Revision 16
May 25, 2001
Page 2F-2 of 3
F.4 Fields of Accreditation
The environmental toxicology PT program shall be organized by fields of accreditation based on
method [including matrix, test organism, and exposure system and endpoint(s)]. Laboratories
may choose to participate in one or more PT fields of accreditation, or portions thereof.
F.4.1 Whole Effluent Toxicity (WET) Method Codes
Prior to NIST accreditation of PT Providers for Environmental Toxicology methods, laboratories
seeking WET accreditation shall be assessed through on-site audit and evaluation of EPA
Discharge Monitoring Report - Quality Assurance (DMR-QA) test results. During this interim
period, a failed DMR-QA endpoint shall require: 1) a formal response to the Accrediting Authority
(AA) with an explanation of probable cause for the endpoint failure and description of corrective
actions to be taken (where appropriate) and 2) a decision by the AA to accept the response or
require additional on-site audits. There shall be no loss of accreditation based solely on PT
results during this interim period.
If a laboratory fails a WET PT endpoint, the laboratory is required to successfully complete a
remedial study. A remedial study must be conducted, at least 30 calendar days from the previous
PT study, until two acceptable results are obtained. The AA may conduct additional onsite audits
as necessary. The default for the WET PT program is accreditation without PT samples.
Interim method codes shall reflect the EPA DMR-QA study codes for the current study year.
F.4.2 Test Conditions for Sediment Toxicity (Solid Phase)
The following table describes the test conditions to be followed for sediment toxicity testing:
Test Organism           Test Conditions                                   Method Code
Freshwater amphipod     10-d, static, renewal, synthetic MHW              TBS¹
Midge larvae            10-d, static, renewal, synthetic MHW              TBS
Saltwater amphipod      10-d, static, non-renewal, synthetic SW @ 20 ‰    TBS
Polychaete worm         10-d, static, non-renewal, synthetic SW @ 28 ‰    TBS
¹ TBS = To Be Specified
F.4.2.1 Sediment Toxicity PT Samples
Accreditation for whole sediment toxicity methods shall be based solely on the on-site audit until
further notice.

-------
NELAC
Proficiency Testing
Appendix F
Revision 16
May 25, 2001
Page 2F-3 of 3
F.4.3 Test Conditions for Soil Toxicity
The following table describes the test conditions to be followed for soil toxicity testing:
Test Organism                                    Test Conditions                       Method Code
Eisenia foetida survival test                    14-d, static, non-renewal, 24L:0D     TBS¹
Lettuce (Lactuca sativa) seed germination test   120-h, static, non-renewal, 16L:8D    TBS
Lettuce (Lactuca sativa) root elongation test    120-h, static, non-renewal, 0L:24D    TBS
¹ TBS = To Be Specified
F.4.3.1 Soil Toxicity PT Samples
Accreditation for soil toxicity methods shall be based solely on the on-site audit until further notice.

-------
PROFICIENCY TESTING
APPENDIX G
RADIOCHEMISTRY

-------
NELAC
Proficiency Testing
Appendix G
Revision 16
May 25, 2001
Page 2G-1 of 1
Appendix G - RADIOCHEMISTRY
G.0 PURPOSE
This appendix contains the NELAC requirements for radiochemical proficiency testing under the Safe
Drinking Water Act (SDWA). The appendix supplements the requirements of Chapter 2 and
Appendices A, B, and C with requirements specific for NELAC radiochemical proficiency testing
studies.
Radiochemical proficiency testing for other USEPA Programs shall be added as the necessary
resources, proficiency testing objectives, and supporting data become available.
Other pertinent information concerning the SDWA radiochemical proficiency testing samples is
available from the NELAC PT Committee Chair or the Executive Director of NELAP.
G.1 PROFICIENCY TESTING PROVIDER LICENSING
Possession, transfer and use of many radioactive materials are regulated by the Nuclear Regulatory
Commission (NRC) or State radiological departments. The PT Provider shall ensure that it is
licensed not only for the possession and use of radioactive materials in its facility but also for the
explicit distribution of these materials in commerce.
G.2 SDWA SAMPLE DESIGN
The PT Provider must ensure that the sample design used for the SDWA radiochemical PT samples
meets the applicable criteria contained in the USEPA's "National Standards for Water Proficiency
Testing Studies, Criteria Document".
G.2.1 ASSIGNED VALUES
Assigned values must be within the ranges established by the USEPA in the "National Standards for
Water Proficiency Testing Studies, Criteria Document", where they apply. Assigned values are
selected such that the concentration of each analyte will vary over time throughout the concentration
range. The PT Provider must also ensure that the method for selecting an assigned value meets the
applicable criteria contained in the EPA's "National Standards for Water Proficiency Testing Studies,
Criteria Document". The assigned value is determined based on the mass of standard added to the
volume of water as follows:
Assigned value (pCi/L) = (pCi activity added / volume of preserved water) × dilution factor
G.3 SCORING
Each result from a participating laboratory is classified as "Acceptable" or "Not Acceptable" following
the procedures contained in Chapter 2, Appendix C. The acceptance limits are equal to ±2 "single
determination" standard deviations (USEPA's "National Standards for Water Proficiency Testing
Studies, Criteria Document") and are centered on the assigned values.
G.4 STUDY TIMETABLES
Semi-annual proficiency testing is required per the schedule contained in Section 2.4. The samples
shall be analyzed and the results returned to the PT Provider within the applicable time frames
specified in the USEPA's "National Standards for Water Proficiency Testing Studies, Criteria
Document."

-------
PROFICIENCY TESTING
APPENDIX H
PERFORMANCE TESTING
REQUIREMENTS FOR FIELD AIR
MEASUREMENT

-------
NELAC
Proficiency Testing
Appendix H
Revision 16
May 25, 2001
Page 2H-1 of 4
Appendix H - PERFORMANCE TESTING REQUIREMENTS FOR FIELD AIR MEASUREMENT
H.0 INTRODUCTION: PURPOSE, SCOPE, AND APPLICABILITY
This Appendix defines the criteria to be used by any entity which seeks to participate as a
Proficiency Test Provider and score the results obtained from the analyses of samples in an air
measurement NELAC PT Study. This appendix specifically covers performance testing (PT)
requirements for Source and Ambient air field measurement conducted for regulatory compliance.
There are two categories of performance testing performed for compliance related air sample field
measurement: 1) calibration-based performance testing conducted for field instruments for which
delivery of a representative, quality controlled PT sample is not practical, and 2) performance testing
for field instruments for which delivery of a representative, quality controlled PT sample is possible.
For example, EPA Method 5 is used to collect (on a batch, time-integrated basis) particulate matter
from stationary emission sources. The equipment metering box and probe are calibrated per the
method prior to and then upon its return from the field after sampling is completed. During its use in
the field there is no practical means of introducing a controlled PT sample (category 1 example). In
contrast, continuous emission monitors (CEMs) for both ambient air and source emission monitoring
can be challenged with a PT gas in a cylinder to determine performance of that instrument during its
operation in the field (category 2 example).
In category 1 for field measurements in which the delivery of acceptable and appropriate PT samples
is not possible, calibration and maintenance requirements outlined in Chapter 5 Quality Systems or
Chapter 7 Field Activities will be used to assure the quality and representativeness of field
measurement data.
This standard is being developed only for the category 2 performance testing of field measurements
where delivery of a standard PT sample is possible. Calibration-based performance testing will be
a subset of either the NELAC Quality Systems or Field Activities Chapters, as appropriate.
For field measurements that fall under this standard, two distinct sets of scoring criteria are defined:
1) whether or not an individual analyte result is either "Acceptable" or "Not Acceptable" and 2) whether
or not a laboratory's initial PT performance for a group of interdependent analytes can be evaluated
as "Pass" or "Fail." The PT Providers will submit all field measurement performance rating(s) to the
Primary Accrediting Authority, as described in Chapter 2 of the NELAC standards, to be used as a
tool for determining a laboratory's accreditation status. PT acceptance limits and pass/fail criteria are
established on a field of proficiency testing basis.
H.1 Proficiency Testing for Field Air Measurement
Field air measurements refer to measurements taken in the field for regulatory compliance. Examples
include continuous emission monitors (CEM) used to obtain real-time measurements of emissions
from industrial point source discharges or from ambient air monitoring. Also included are real-time
monitors for gaseous organic emissions by gas chromatography (GC) and Fourier transform infrared
(FTIR) spectroscopy, used to monitor criteria pollutants at a Superfund site fence line.
NELAC intends to develop PT criteria for relevant field measurements. The criteria will be developed
to mirror PT criteria for laboratory sample analysis; however, for many field measurements, delivery
of representative, quality controlled PT samples will be problematic. The standard will be developed
to address those field measurements for which PT sample delivery is possible. For field
measurements in which delivery of acceptable PT samples is not possible, calibration and
maintenance requirements outlined in Ch. 5 Quality Systems will be used to assure the quality and
representativeness of field measurement data.

-------
NELAC
Proficiency Testing
Appendix H
Revision 16
May 25, 2001
Page 2H-2 of 4
H.2 ACCEPTANCE LIMITS
Acceptance limits are established for each analyte. Whether or not a laboratory has passed or failed
a group of interdependent analytes is based on the number of results that are determined to be
acceptable.
H.2.1 Analyte Acceptance Limit Categories
Acceptance limits are separated into two categories. Results for analytes with acceptance limits
determined as described in Sections H.2.1.1 and H.2.1.2 shall be used in the determination of a
laboratory's field of proficiency testing pass/fail evaluation. Results for analytes with acceptance limits
determined as described in Section H.2.1.3 shall not be used as part of the field of proficiency testing
pass/fail evaluation.
H.2.1.1 Analytes with USEPA Established Acceptance Limits (Prepared ± fixed percentage
or Mean ± 2 standard deviations)
PT Providers shall utilize the proficiency test acceptance limits that have been established by
USEPA in the National Standards for air proficiency testing studies where they apply. The
National Standards are incorporated into this Appendix by reference. EPA's established proficiency
test acceptance limits for chemical analytes are typically expressed in the following manner:
Prepared ± fixed percentage. Acceptance limits shall be set at plus and minus the published
fixed percentage of the analyte's verified prepared value.
Mean ± 2 standard deviations. The NELAC Standing Committee on Proficiency Testing has a
process for establishing linear regression equations relating a PT sample's prepared value to the mean
and to the standard deviation; acceptance limits shall be set using these equations and the
sample's verified prepared value. Linear regression equations may only be used for prepared values
that fall within the range of prepared values used to establish the equations. In the event that there
are no linear regression equations available for a given analyte, that analyte shall be treated as
described in Section H.2.1.3.
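
For illustration only, the two forms of acceptance limits described above can be sketched in Python as
follows. The regression coefficients are those approved by the NELAC Standing Committee on
Proficiency Testing; the linear (slope, intercept) representation and the function names are assumptions
made for the example, and the regressions are valid only within the range of prepared values used to
establish them.

    # Illustrative sketch of the two acceptance-limit forms in Section H.2.1.1.
    def limits_fixed_percentage(prepared_value, fixed_fraction):
        """Prepared +/- fixed percentage (e.g., fixed_fraction=0.15 for +/- 15%)."""
        return prepared_value * (1 - fixed_fraction), prepared_value * (1 + fixed_fraction)

    def limits_from_regressions(prepared_value, mean_eq, sd_eq):
        """Mean +/- 2 SD, with mean and SD predicted from the verified prepared value.

        mean_eq, sd_eq -- (slope, intercept) pairs for the approved linear regressions
        """
        predicted_mean = mean_eq[0] * prepared_value + mean_eq[1]
        predicted_sd = sd_eq[0] * prepared_value + sd_eq[1]
        return predicted_mean - 2 * predicted_sd, predicted_mean + 2 * predicted_sd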
H.2.1.2 Analytes with acceptance limits derived from regression equations established by
the NELAC Standing Committee on Proficiency Testing
When USEPA Program regulations for establishing acceptance criteria are not available, Proficiency
Test providers shall set acceptance limits using regression equations that predict the mean and
standard deviation for an analyte in a given range of concentrations. Regression equations shall be
derived by the NELAC Standing Committee on Proficiency Testing and shall be made available to
PTPA-approved PT Providers by the PT Committee Chair or the Executive Director of NELAP. Data
from sources such as the USEPA PE studies, interlaboratory results from professional organizations
such as ASTM, other proficiency testing providers, commercial and non-profit organizations, shall be
used to establish the equations. All regression equations shall be approved by the NELAC Standing
Committee on Proficiency Testing prior to use by a PTPA-approved PT Provider. For these analytes,
the PT Provider shall use the sample's verified prepared value and said equations to determine the
mean and standard deviation.
H.2.1.3 Experimental Data: Analytes without promulgated acceptance limits or established
regression equations
For those analytes not included in categories H.2.1.1 or H.2.1.2, e.g., newly regulated analytes, or
analytes in a matrix that have not been fully evaluated in interlaboratory studies, NELAC acceptance
limits shall be established only after interlaboratory data have been collected for a minimum of one year

-------
NELAC
Proficiency Testing
Appendix H
Revision 16
May 25, 2001
Page 2H-3 of 4
unless the NELAC Standing Committee on Proficiency Testing determines that sufficient data have
been collected in less time. The data obtained during the one-year period shall be referred to as
"experimental data". The NELAC Standing Committee on Proficiency Testing shall derive regression
equations to be used to establish acceptance limits for analytes in the experimental category after
sufficient data have been collected. The laboratory shall receive a copy of its own experimental data
from the PT Provider at the conclusion of the PT study.
H.3 ACCEPTABLE PT RESULTS FOR CHEMICAL ANALYTES IN FIELD AIR PT
MEASUREMENTS
Criteria for acceptable results will depend on the precision and accuracy of the accepted
field measurement method. A laboratory's PT analyte result is acceptable when it falls within the
regulatory promulgated acceptance limits (Section H.2.1.1). For Section H.2.1.2 analytes, PT
Providers shall use the PT sample's verified prepared value and said regression equations to
determine the mean and standard deviation. Acceptance limits shall be set at the mean ± two
standard deviations for ambient air or source sample analytes. A result is acceptable when it falls
within these derived acceptance limits.
H.4 NOT ACCEPTABLE PT RESULTS FOR SOURCE AND AMBIENT PT SAMPLES
Criteria for acceptable results will depend on the precision and accuracy of the accepted
field measurement method. A laboratory's result for any analyte is considered unacceptable if it meets
any of the following criteria (a classification sketch follows this list):
a)	The result falls outside the USEPA's promulgated acceptance limits (Section H.2.1.1) or
outside the prediction interval derived from established regression equations;
b)	The lab reports a result for an analyte not present in the PT sample (i.e., a false positive);
c)	The lab reports a result of "Not Detected", (or similar indication of no detection), for an analyte
present in the PT sample (i.e., a false negative);
NOTE: If a laboratory reports a result less than the lowest concentration contained in the
NELAC-approved PT concentration range for an analyte present in the PT sample at
a concentration within the NELAC-approved PT concentration range, the result shall
be classified as a false negative and scored as "not acceptable".
d)	The lab fails to submit its results to the PT Provider on or before the deadline for the PT study.
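
For illustration only, the classification sketch below applies criteria a) through d) to a single analyte
result; the parameter names are assumptions made for the example. The NOTE above is handled by the
caller treating a result below the NELAC-approved concentration range as "Not Detected".

    # Illustrative sketch of the Not Acceptable criteria in Section H.4.
    def classify_result(reported_value, present_in_sample, low, high, reported_on_time, detection_reported):
        """Return "Acceptable" or "Not Acceptable" for one analyte result.

        reported_value     -- numeric result reported by the laboratory (ignored if no detection)
        present_in_sample  -- True if the analyte is present in the PT sample
        low, high          -- acceptance limits for the analyte (Section H.2)
        reported_on_time   -- False if the laboratory missed the PT study deadline
        detection_reported -- False if the laboratory reported "Not Detected" or similar
        """
        if not reported_on_time:                     # criterion d)
            return "Not Acceptable"
        if not present_in_sample:                    # criterion b), false positive check
            return "Not Acceptable" if detection_reported else "Acceptable"
        if not detection_reported:                   # criterion c), false negative
            return "Not Acceptable"
        return "Acceptable" if low <= reported_value <= high else "Not Acceptable"   # criterion a)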
H.5 NELAC PT STUDY PASS/FAIL CRITERIA
NELAC PT samples are designed to meet the requirements of Chapter 2 and associated
appendices. Once data acceptability has been determined as described in Sections H.1 through
H.3 of this appendix, the laboratory's PT "Pass" or "Fail" evaluation is determined as described in this
Section. Pass/Fail criteria are used when groups of interdependent analytes are evaluated as a unit
for the laboratory's initial demonstration of proficiency.
H.5.1 Interdependent Analyte PT Samples
Interdependent analyte PT Samples are those that are analyzed using methods in which the ability
to correctly identify and quantitate a series of analytes is indicative of the laboratory's ability to
correctly determine the presence or absence of similar analytes.
An example of interdependent PT analytes is GC monitoring of a suite of VOC analytes using
EPA Method 18.

-------
NELAC
Proficiency Testing
Appendix H
Revision 16
May 25, 2001
Page 2H-4 of 4
H.5.2 Non-interdependent Analyte PT Samples
Non-interdependent PT Samples are those that are analyzed using methods in which the ability to
correctly identify and quantitate an analyte or a series of analytes in a sample is not indicative of the
laboratory's ability to correctly identify and quantitate similar analytes. Non-interdependent analyte
PT samples may contain a single analyte, or may contain multiple analytes. Currently, non-
interdependent analytes are not expected to apply to the air matrix.
H.5.3 Promulgated USEPA Pass/fail Criteria
In all cases, promulgated USEPA pass/fail criteria, e.g., drinking water volatiles as listed in 40 CFR
141.61(a), subsection (m)(1), shall be used as NELAC PT pass/fail criteria as applicable. The criteria
described in Section H.5.4 shall be used in the absence of promulgated USEPA pass/fail guidelines.
H.5.4 Pass/fail Criteria For Interdependent Analyte PT Samples
Proficiency Testing pass/fail evaluations for Interdependent Analyte PT samples shall be determined
as follows. To receive a score of "Pass", a laboratory must produce "Acceptable" results for XX% of
the analytes in an Interdependent Analyte PT Sample. Greater than 100-XX% "Not Acceptable"
results shall result in the laboratory receiving a score of "Fail" for that series of analytes. For example,
a laboratory must report all "Acceptable" results for an Interdependent Analyte PT Sample containing
1-4 analytes, may report no more than one "Not Acceptable" result for a Sample containing 5-9
analytes, and no more than two "Not Acceptable" results for a Sample containing 10-14 analytes. A
"Not Acceptable" result for the same analyte in two consecutive PT studies shall also result in the
laboratory receiving a score of "Fail" for that analyte.
H.5.5 Pass/fail Criteria For Non-Interdependent Analyte PT Samples
For non-interdependent analytes, a single unacceptable result constitutes a failing score for that analyte.
Currently, non-interdependent analytes are not expected to apply to the air matrix.

-------
ON-SITE
ASSESSMENT
Approved May 25, 2001
Effective July 1, 2003 unless otherwise noted

-------
Note that the NELAC standards now have two significant dates: 1) the
date the standards were approved at the annual meeting, and 2) the
date the standards are effective and must be implemented. This is
especially important as some portions of the standards have different
effective dates. The approval date is part of the document control
header on each page. The cover of each chapter shows both the
approval date and the effective date. Changes approved for
implementation at a time other than the effective date (on the chapter
cover) are noted in the chapter, showing the approved text and its
effective date.

-------
NELAC
On-site Assessment
Revision 16
May 25, 2001
Page i of ii
TABLE OF CONTENTS
3.0	ON-SITE ASSESSMENT	 1
3.1	INTRODUCTION 	 1
3.2	ON-SITE ASSESSMENT PERSONNEL 	 1
3.2.1	Basic Qualifications	 1
3.2.2	Assessor Qualification	 2
3.2.3	Training 	 2
3.2.3.1	Basic Training 	 2
3.2.3.2	Technical Training	 2
3.2.3.3	Refresher Training 	 4
3.3	FREQUENCY AND TYPES OF ON-SITE ASSESSMENTS 	 4
3.3.1	Frequency 		4
3.3.2	Follow-up On-site Assessments 		4
3.3.3	Changes in Laboratory Capabilities		5
3.3.4	Announced and Unannounced Visits		5
3.4	PRE-ASSESSMENT PROCEDURES 	 5
3.4.1	Assessment Planning 	 5
3.4.1.1	Assessment Team	 5
3.4.1.2	Technical Support Personnel	 6
3.4.2	Scope of the Assessment 	 6
3.4.2.1	Laboratory Assessments		6
3.4.2.2	Records Review 		6
3.4.3	Information Collection and Review 		6
3.4.4	Assessment Documents 		7
3.4.5	Confidential Business Information (CBI) Considerations		7
3.4.6	National Security Considerations		8
3.5	ASSESSMENT PROCEDURES	 9
3.5.1	Length of Assessment	 9
3.5.2	Opening Conference 	 9
3.5.3	On-site Laboratory Records Review and Collection 	 10
3.5.4	Staff Interviews 	 10
3.5.5	Closing Conference	 10
3.5.6	Reporting Procedures 	 10
3.5.7	Assessment Closure 	 11
3.6	STANDARDS FOR ASSESSMENT	 11
3.6.1	Areas of Assessment		11
3.6.2	Assessor's Role		11
3.6.3	Use of Checklists		12
3.6.4	Standards of Professional Conduct for Assessors 		12
3.7	DOCUMENTATION OF ON-SITE ASSESSMENT	 13
3.7.1	Checklists/Records 		13
3.7.2	Report Format		13
3.7.3	Distribution		14
3.7.4	Release of On-site Assessment Report 		14
3.7.5	Record Retention Time 		14

-------
NELAC
On-site Assessment
Revision 16
May 25, 2001
Page ii of ii
Appendix A - NELAC BASIC ASSESSOR TRAINING	A-1
A.1 INTRODUCTION	A-1
A.2 COURSE PURPOSE	A-1
A.3 COURSE LOGISTICS	A-1
A.3.1 Duration 	A-1
A.3.2 Providers, Instructors, and Participants 	A-1
A.3.3 Course Documentation Supplied to Participants, Final Examination, and Certificates
	A-2
A.3.4 Final Examination	A-2
A.3.5 Attendance or Completion Certificate	A-2
A.3.6 Appraisal of Course by Participants 	A-2
A.4 COURSE CONTENTS	A-3
A.4.1 Introduction	A-3
A.4.2 Historical Perspective on National Accreditation 	A-3
A.4.3 Fundamentals of NELAC and NELAP 	A-3
A.4.4 Qualifications and Training Requirements for Assessors	A-4
A.4.5 Accreditation of Laboratories 	A-4
A.4.6 Proficiency Testing 	A-4
A.4.7 Ethical Conduct Standards for Assessors 	A-4
A.4.8 Quality Systems 	A-5
A.4.9 NELAC Quality System Checklist	A-5
A.4.10 Interviewing Techniques for Assessors 	A-5
A.4.11 NELAC Laboratory Assessments	A-6
A.4.11.1 Pre-Assessment Activities	A-6
A.4.11.2 On-site Assessment Components	A-6
A.4.11.3 Post On-site Assessment Activities	A-7
A.4.12 Handling Assessment Challenges 	A-7
A.5 COURSE SUMMARY AND CONCLUSIONS 	A-7
A.6 FINAL EXAMINATION	A-7
A.7	REFERENCES 	A-7
Appendix B - TECHNICAL TRAINING COURSES FOR ASSESSORS 	B-1
B.1	INTRODUCTION	B-1
B.2 COURSE CONTENT	B-1
B.3 COURSE OBJECTIVES 	B-2

-------
NELAC
On-site Assessment
Revision 16
May 25, 2001
Page 1 of 14
3.0	ON-SITE ASSESSMENT
3.1	INTRODUCTION
The on-site assessment is an integral and requisite part of the NELAC laboratory accreditation
program and is one of the primary means of determining a laboratory's capabilities and qualifications.
During the on-site assessment, the assessment team¹ collects and evaluates information and makes
observations which are used to judge the laboratory's conformance with established accreditation
standards.
It is essential that the on-site assessments conducted by all accrediting authorities recognized by the
National Environmental Laboratory Accreditation Program be conducted in a uniform, consistent
manner.
This section describes the essential elements that must be included in any acceptable on-site
assessment and the qualifications and requirements for assessors.
The responsibility for promulgating and enforcing occupational safety and health standards rests with
the U.S. Department of Labor. While it is not within the scope of the assessment team to evaluate
all health and safety regulations, any obviously unsafe condition(s) observed should be described to
the appropriate laboratory official and reported to the accrediting authority. The accreditation on-site
assessment is not intended to certify that the laboratory is in compliance with any applicable health
and safety regulations.
3.2	ON-SITE ASSESSMENT PERSONNEL
3.2.1 Basic Qualifications
An assessor must be an experienced professional and hold at least a Bachelor's degree in a scientific
discipline or have equivalent experience in environmental laboratory assessment.
Each assessor must satisfactorily complete a training program approved by the accrediting authority
responsible for on-site assessments. Each accrediting authority shall be responsible for ensuring that
the training course used to train its assessors meets the NELAC standards. This program shall
include:
a)	Participation in the NELAC Basic Training Course (Section 3.2.3.1 and Appendix A), including
attainment of a passing score on the written examination for the course;
b)	Participation in at least four actual NELAC on-site assessments under the supervision of a
qualified assessor (Assessors employed by an accrediting authority [either directly or as a third
party] when the accrediting authority is granted NELAP recognition [See Section 6.7] are exempt
from the requirement to undergo training with a qualified assessor, provided they have previously
conducted four assessments and been judged proficient by the accrediting authority.) and,
c)	Completion of the applicable technical training requirements for at least one field of accreditation
(Section 3.2.3.2 and Appendix B).
¹ An assessment team is comprised of a lead assessor and one or more assessors or
technical specialists. In some cases a single lead assessor may conduct an on-site assessment.
In those instances the single assessor is considered the "team."

-------
NELAC
On-site Assessment
Revision 16
May 25, 2001
Page 2 of 14
Assessors must take annual refresher/update training as defined in Section 3.2.3.3. In addition, the
assessors must:
a)	Be familiar with the relevant legal regulations, accreditation procedures, and accreditation
requirements;
b)	Have a thorough knowledge of the relevant assessment methods and assessment documents;
c)	Be thoroughly familiar with the various forms of records described in Section 3.5.3 - Records
Review;
d)	Be thoroughly cognizant of data reporting, analysis, and reduction techniques and procedures;
e)	Have a working knowledge and be conversant with the specific tests or types of tests for which
the accreditation is sought and, where relevant, with the associated sampling and preservation
procedures; and,
f)	Be able to communicate effectively, both orally and in writing.
3.2.2	Assessor Qualification
Before an assessor can conduct on-site assessments, an accrediting authority must qualify the
individual. Each assessor must sign a statement before conducting an assessment certifying that no
conflict of interest exists and provide any supporting information as required by the accrediting
authority. Failure to provide this information makes the proposed assessor ineligible to participate
in the assessment program.
3.2.3	Training
The National Environmental Laboratory Accreditation Conference (NELAC) specifies the minimum
level of education and training for assessors, including refresher/update training. The NELAC also
develops standards for training requirements. The assessor training program is implemented by
either accrediting authorities, assessor bodies, or other entities. All assessor training programs must
meet the standards defined in this Chapter.
3.2.3.1	Basic Training
The purpose of the basic assessor training is to familiarize the assessor with the NELAC standards
and the skills and techniques associated with the laboratory assessment. The basic assessor training
course shall encompass all the material described in Appendix A.
The specific training associated with the NELAC standards is required and must be successfully
completed. All assessor candidates must pass the written examination.
3.2.3.2	Technical Training
In addition to the basic NELAC assessor training, each assessor must successfully complete training
in at least one technical discipline.
The technical training program is defined in Appendix B. The purpose of the technical training is to
ensure consistency of knowledge and techniques among the NELAC assessors. The technical
training assumes a level of basic knowledge of the subject and concentrates on the elements of the
technology or methods that are key to properly assure laboratory competency to deliver data of known
and documented quality. The technical training program consists of the following:

-------
NELAC
On-site Assessment
Revision 16
May 25, 2001
Page 3 of 14
NELAC Technical Training for Assessors
TECHNICAL DISCIPLINES
1.	Microbiology
	-	Bacteriology
	-	Viruses/Parasites
	-	Microscopic Particulate Analysis (MPA)
2.	Biological
	-	Aquatic Toxicity Testing
	-	Freshwater/Marine/Estuarine Fish
	-	Freshwater/Marine/Estuarine Macroinvertebrates
	-	Ichthyoplankton
	-	Macrophytes
	-	Periphyton
	-	Phytoplankton
	-	Zooplankton
	-	Biomass
	-	Chlorophyll a (Spectrophotometric and Fluorometric)
3.	Inorganic - Nonmetals/Misc.
	-	Spectrophotometric
	-	Titrimetric
	-	Potentiometric
	-	Colorimetric
	-	TOC/TOX
	-	Residue/Solids
	-	COD/BOD
	-	IR
	-	IC
4.	Inorganic - Metals
	-	FAA
	-	GFAA
	-	ICP
	-	ICP/MS
	-	Sample Preparation (Digestion/TCLP/etc.)
5.	Organics
	-	Sample Preparation
	-	HPLC
	-	GC
	-	GC/MS
	-	Instrument Software
6.	Asbestos
	-	Bulk
	-	Air
	-	Water/TEM

-------
NELAC
On-site Assessment
Revision 16
May 25, 2001
Page 4 of 14
7.	Radiochemistry
8.	Field Activities
	-	Source/Ambient Testing (CAA, RCRA, TSCA)
		-	e.g., Air Source Testing
		-	Basic Principles of Manual Methods
		-	Basic Principles of Instrumental Methods
	-	Soil/Groundwater (SARA, RCRA, TSCA, FIFRA)
	-	Surface Water (CWA, RCRA, TSCA, FIFRA)
	-	Drinking Water (SDWA)
	-	Multi-media (mix of above)
	-	Biological
3.2.3.3 Refresher Training
The purpose for requiring refresher/update training for all assessors is to ensure that the assessors
are aware of changes to the standards and/or approved analytical methodology as they occur and
to enhance and improve skills associated with assessment. Assessors are expected to maintain
proficiency on an on-going basis. Assessors must complete refresher/update training annually.
Initially, the refresher/update training is conceptualized as follows:
NELAC Refresher/Update Training for Assessors
-	Changes to the NELAC Standards and the Resulting Checklist Changes
-	New Interpretations of the NELAC Standards
-	Technical Changes Associated with Approved Methodology and the Resulting Checklist Changes
-	Assessment Skills and Techniques
-	Current Developments
3.3 FREQUENCY AND TYPES OF ON-SITE ASSESSMENTS
3.3.1	Frequency
The accrediting authority must conduct a comprehensive on-site assessment of each laboratory prior
to granting accreditation, except as allowed by interim accreditation (see Section 4.5.1). In addition,
an on-site assessment of each accredited laboratory must be completed at least every two years.
Assessments for cause are conducted more frequently, at the option of the accrediting authority.
3.3.2	Follow-up On-site Assessments
If directed by an accrediting authority, an assessment team must conduct follow-up assessments at
laboratories where a deficiency was identified by the previous assessment. These assessments may
be, but are not necessarily limited to, determining whether a laboratory has corrected its
deficiency(ies), or determining the merit of a formal appeal from the laboratory. When deficiencies
are of such severity as to possibly warrant the downgrading of a laboratory's accreditation status, any
follow-up assessment that is planned or conducted must be completed and reported within thirty (30)
calendar days after the receipt of the laboratory's plan of corrective action.

-------
NELAC
On-site Assessment
Revision 16
May 25, 2001
Page 5 of 14
Nothing in this section should be construed as requiring an accrediting authority to reassess a facility
prior to taking a regulatory or administrative action affecting the status of the facility's accreditation.
Nothing in this section should be construed as limiting in any way the accrediting authority's ability
to revoke or otherwise limit a laboratory's accreditation upon the identification of such deficiencies as
to warrant such action.
3.3.3	Changes in Laboratory Capabilities
When a change occurs in a laboratory's ownership, location, key personnel, or major instrumentation,
notification of the accrediting authority is required within 30 days (see Section 4.3.2). The accrediting
authority must evaluate the significance of a change that might alter or impair the laboratory's
capability and quality, and indicate to the laboratory the results of their evaluation in writing. The
accrediting authority must retain records to indicate that such an evaluation was conducted.
3.3.4	Announced and Unannounced Visits
The accrediting authority, at its discretion, conducts either unannounced or announced on-site
assessments. The accrediting authority is not required to provide advance notice of an assessment.
To the maximum extent practical, accrediting authorities shall, when necessary, work with Federal
departments/agencies/contractors to obtain government security clearances for their assessment
team as far in advance as possible. Federal departments/agencies/contractors shall facilitate
expeditious attainment of the necessary clearances.
3.4 PRE-ASSESSMENT PROCEDURES
3.4.1 Assessment Planning
A good assessment begins with planning, which starts before the assessment team visits the
laboratory. Planning is the means by which the lead assessor identifies all the required activities to
be completed during the assessment process. Planning includes conducting a thorough review of
NELAP and/or State records pertaining to the laboratory to be inspected. This saves time because
familiarity with the operation, history, and compliance status of the laboratory increases the efficiency
and focus of an on-site visit.
Pre-assessment activities include: determining the scope of the assessment; reviewing NELAP/State
information; providing advance notification of the assessment to the laboratory, when appropriate;
obtaining any security clearances and determining any special safety procedures which may be
necessary; coordinating the assessment team; and gathering assessment documents. Section 3.4.5
discusses Confidential Business Information (CBI) issues.
3.4.1.1 Assessment Team
The use of assessment teams directed by a lead assessor is encouraged. A single assessor
knowledgeable in the discipline, methods, and regulations applicable to the laboratories he or she
assesses can competently perform some on-site assessments.
The accrediting authority determines the number and expertise of the assessment team and support
personnel that are required to conduct the on-site assessment based on the type of assessment and
the scope of accreditation of the accredited or applicant laboratory.

-------
NELAC
On-site Assessment
Revision 16
May 25, 2001
Page 6 of 14
3.4.1.2 Technical Support Personnel
An assessment team may include technical support personnel approved by the primary accrediting
authority as capable of providing assistance to the assessors. These individuals need not be formally
qualified by the accrediting authority as assessors (see Section 3.2.2). If not so qualified, these
individuals must still meet the requirements of the standards concerning conflicts of interest and
professional conduct. Members of the assessment team who provide technical assistance but are
not qualified as assessors are not eligible to conduct interviews in the absence of the assessor nor
to cite deficiencies.
3.4.2	Scope of the Assessment
The first step in the assessment planning process is deciding the extent of the assessment. The
assessment must include both an appraisal of the laboratory's operations and a review of the
appropriate records. The assessment for a field of accreditation must cover the complete scope of
accreditation for which the laboratory seeks or maintains accreditation within the specific field of
accreditation as authorized by the accrediting authority.
3.4.2.1	Laboratory Assessments
A laboratory assessment must review the ability of the laboratory to conduct environmental testing.
The examination of the systems, processes and procedures of the laboratory should give a general
sense of its past and present capabilities to perform work of known and documented quality. During
a laboratory assessment, the assessment team must identify a number of samples or a recently
completed or on-going project and evaluate to what extent the tests are being conducted according
to the NELAC standards.
3.4.2.2	Records Review
The purpose of a records review is to determine whether the testing laboratory has maintained
necessary documentation of data, the quality system, and other information to technically substantiate
reports previously issued. During a records review, the assessment team conducts an overall
assessment of data and compares the data with submitted reports to determine whether the data
collected, generated, and reported follow the NELAC standards.
3.4.3	Information Collection and Review
Prior to initiating an on-site assessment, the assessment team shall make determinations as to which
laboratory records they wish to review prior to the actual site visit. These records, from the files of
the accrediting authority, the national laboratory accreditation database, or the laboratory itself,
include, but are not limited to:
a)	Copies of previous assessment reports and proficiency testing sample results;
b)	General laboratory information such as laboratory submitted self-assessment forms, SOPs and
Quality Manual(s);
c)	Official laboratory communications and associated records with appropriate accrediting authority
staff;
d)	Available documents from recipients of reports from the laboratory;
e) The laboratory's application for accreditation;

-------
NELAC
On-site Assessment
Revision 16
May 25, 2001
Page 7 of 14
f)	The existing program regulations (federal and state), and
g)	The most recently approved or in use laboratory methods for which the laboratory has requested
or maintains accreditation.
3.4.4	Assessment Documents
Documents necessary for the assessment must be provided to the laboratory management or staff
and assembled before the assessment, whenever possible. The lead assessor must obtain copies
of all forms required for the assessment, including the appropriate checklist(s). Other types of
documents include:
Assessment Confidentiality Notice;
Conflict of Interest Form;
Assessor Credentials;
Assessment Assignment(s);
Assessment Notification Letter;
Attendance Sheet(s) (opening and closing conference); and
Assessment Appraisal Form.
In addition, the lead assessor must provide information to the laboratory on how to obtain assessment
information from the accrediting authority.
3.4.5	Confidential Business Information (CBI) Considerations
During assessments, if the assessment team comes into possession of information claimed as
business confidential, the following procedures must be implemented. The EPA regulations for
handling confidential business information are detailed in Title 40, Code of Federal Regulations, Part
2, Subpart B, and must be followed in NELAP-related matters. Subpart B defines a business
confidentiality claim as "a claim or allegation that business information is entitled to confidential
treatment for reasons of business confidentiality or a request for a determination that such information
is entitled to such treatment."
Consistent with 40 CFR Part 2, NELAC standards must protect Confidential Business Information
(CBI) from disclosure. For this information to be adequately protected, NELAP requires certain
actions of assessors and the laboratory. The lead assessor must provide a NELAP assessment
confidentiality notice to the responsible laboratory official at the beginning of the assessment. This
notice informs laboratory officials of their right to claim any portion of the information requested during
the assessment as CBI. NELAP personnel, assessors and other users of said information must
have CBI training. The assessors should be familiar with the procedures for asserting a CBI claim
and for handling information claimed as CBI. The lead assessor must take
custody of all CBI information before leaving the laboratory, and must maintain it in custody, using
all proper procedures and safeguards, until it can be received by the accrediting authority, who
must also treat such information as CBI until an official determination has been made in accordance
with federal and State laws.
Certain actions are required of the responsible laboratory official when claiming information as
business confidential. The laboratory representative must place on (or attach to) the information at
the time it is submitted to the assessor, a cover sheet, stamped or typed legend, or other suitable form
of notice, employing language such as "trade secret", "proprietary" or "company confidential".
Allegedly confidential portions of otherwise non-confidential information should be clearly identified
by the business, and may be submitted separately to facilitate identification and handling by the
assessor. CBI may be purged of references to client identity by the responsible laboratory official at
the time of removal from the laboratory. However, sample identifiers may not be obscured from the

-------
NELAC
On-site Assessment
Revision 16
May 25, 2001
Page 8 of 14
information. If the information claimed as business confidential suggests the need for further action,
the information may be forwarded to the appropriate agency that may take further action outside the
scope of the accreditation process, to obtain the client's identity. If the information claimed as
business confidential suggests the need for further enforcement action, the accrediting authority is
responsible for ensuring that all CBI issues are handled in accordance with NELAC standards.
If a business confidentiality claim is received after the on-site assessment by the accrediting authority,
the authority should make such efforts as are administratively practical to associate the late claim with
copies of the previously submitted information in its files. However the accrediting authority cannot
assure that such efforts will be effective in light of the possibility of prior disclosure or dissemination
of the information.
It is not the responsibility of members of the on-site assessment team to make any determination with
respect to the validity of a confidential business information claim; this responsibility rests with the
accrediting authority. The assessor must maintain custody of CBI-claimed information collected
during the assessment until it is delivered to an authorized official of the accrediting authority.
CBI-claimed information may be the intellectual property of the laboratory. Therefore, all CBI-claimed
information must be held in a secure manner throughout the holding period of assessment records
and may not be reproduced or distributed in a manner inconsistent with 40 CFR Part 2. If the accrediting authority
questions the claim that certain information is CBI, the host laboratory must be contacted and given
twenty-one (21) calendar days to:
1)	provide justification of their claim to CBI,
2)	remove the claim of CBI,
3)	resolve the issue in a manner agreeable to both the laboratory and the accrediting authority,
4)	engage legal assistance,
5)	appeal the action to NELAP, or
6)	withdraw their NELAC accreditation application for the field of accreditation associated with the
CBI information.
In no instance shall the accrediting authority declassify CBI-claimed information without notification
of the laboratory. If the responsible laboratory official does not consent to declassification of the CBI-
claimed information, the laboratory has the option to pursue any or all of the above stated actions.
3.4.6 National Security Considerations
Assessment teams performing assessments at laboratories owned and/or operated by Federal
departments/agencies/contractors must review the need for security clearances, appropriate badging,
and/or a security briefing before proceeding with the on-site assessment. The laboratory must inform
the assessors in writing of any information, including data, that is controlled for national security
reasons and cannot be released to the public.
NELAP assessment teams performing an on-site assessment of a Federal agency may need security
clearances, appropriate badging, and/or a security briefing before proceeding with the on-site
assessment. Assessors shall be informed in writing of any information that is controlled for national
security reasons and cannot be released to the public.

-------
NELAC
On-site Assessment
Revision 16
May 25, 2001
Page 9 of 14
3.5 ASSESSMENT PROCEDURES
3.5.1	Length of Assessment
The length of an on-site assessment depends upon a number of factors such as the scope of
accreditation, the number of assessors available, the size of the laboratory, the number of problems
encountered during the assessment, and the cooperativeness of the laboratory staff. The accrediting
authority must assign an adequate number of assessors to complete the assessment within a
reasonable period of time. Assessors must strike a balance between thoroughness and practicality,
but in all cases must determine to what extent the laboratory's operations meet NELAC standards.
3.5.2	Opening Conference
Arrival at the facility for routine NELAC assessments occurs during established working hours unless
special arrangements are made with the laboratory.
A laboratory's refusal to admit the assessment team for assessment results in an automatic failure
of the laboratory to receive accreditation or loss of an existing accreditation by the laboratory, unless
there are extenuating circumstances that are accepted and documented by the accrediting authority.
The assessment team leader must notify the accrediting authority as soon as possible after refusal
of entry.
An opening conference must be conducted and shall address the following topics:
a)	the purpose of the assessment;
b)	the identification of the assessment team;
c)	the primary areas that will be examined;
d)	any pertinent records and operating procedures to be examined during the assessment and the
names of the individuals in the laboratory responsible for providing the assessment team with the
necessary documentation;
e)	the roles and responsibilities of key managers and staff in the laboratory;
f)	the procedures related to Confidential Business Information;
g)	any special safety procedures that the laboratory may think necessary for the protection of the
assessment team while in certain parts of the facility (under no circumstance is an assessment
team required or even allowed to sign any waiver of responsibility on the part of the laboratory
for injuries incurred by a member of the assessment team during an inspection to gain access
to the facility);
h)	the standards that will be used by the assessment team in judging the adequacy of the laboratory
operation;
i)	the confirmation of the tentative time for the exit conference;
j) the presentation of the assessment appraisal form to the responsible laboratory official for
submittal to the accrediting authority; and
k) the discussion of any questions the laboratory may have about the assessment process.

3.5.3	On-site Laboratory Records Review and Collection
Assessment team members must review laboratory records for accuracy, completeness and the use
of proper methodology. NELAC Chapter 5, Section 5.12 lists the records required for review during
the assessment. The assessors must document the required elements of the records review on the
NELAC assessment checklists.
The laboratory must mark all confidential information. The lead assessor must handle it as required
by appropriate laws and regulations. All other information for all aspects of application, assessment
and accreditation of laboratories is considered public information. If the laboratory requests that
information be treated as confidential, the information must be treated as confidential until a ruling
can be made by the accrediting authority.
3.5.4	Staff Interviews
As an element of the assessment process, the assessment team evaluates the analysis process by
requesting that the analyst(s) normally conducting the test(s) give a step-by-step description of exactly
what is done and what equipment and supplies are needed to complete the analysis. Any
deficiencies shall be noted and discussed with the analyst. The deficiencies must be discussed again
in the closing conference.
The assessment team members shall have the authority to conduct interviews with any/all staff.
Calculations, data transfers, calibration procedures, quality control/assurance practices, adherence
to SOPs and report preparation shall be assessed for the complete scope of accreditation with the
appropriate analyst(s).
3.5.5	Closing Conference
The assessment team must meet with representative(s) of the laboratory following the assessment
for an informal debriefing and discussion of findings. It should be noted that this debriefing in no way
limits the assessment team's ability to identify additional problem areas in the final report should it
become necessary. The members of the assessment team must describe all deficiencies identified to date
during the closing conference with the possible exception of any issues of improper and/or potentially
illegal activity, which may be the subject of further action.
In the event the laboratory disagrees with the findings of the assessor(s), and the team leader
adheres to the original findings, the deficiencies with which the laboratory takes exception shall be
documented by the team leader and included in the report to the accreditation authority for
consideration. The accrediting authority makes a determination as to the validity of the contested
elements.
The assessment team must inform the laboratory representative(s) that an assessment report
encompassing all relevant information concerning the ability of the applicant laboratory to comply with
the accreditation requirements is forthcoming.
3.5.6	Reporting Procedures
The accrediting authority or its authorized third party must present an assessment report to the
laboratory within thirty (30) calendar days of the assessment. The laboratory has thirty (30) calendar
days from the date of receipt of the report to provide a plan of corrective action to the accrediting
authority (see Section 4.1.3). An exception to these deadlines is in those circumstances where a
possible enforcement investigation or other action has been initiated.
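The two thirty-day windows above amount to simple date arithmetic. The sketch below is illustrative only; the function names and the use of Python's standard datetime module are assumptions, and the enforcement-investigation exception is not modeled.

```python
from datetime import date, timedelta

REPORT_DUE_DAYS = 30             # report due to the laboratory after the assessment
CORRECTIVE_PLAN_DUE_DAYS = 30    # plan due after the laboratory receives the report

def report_due_date(assessment_date: date) -> date:
    """Latest date the assessment report should reach the laboratory."""
    return assessment_date + timedelta(days=REPORT_DUE_DAYS)

def corrective_plan_due_date(report_received: date) -> date:
    """Latest date the corrective action plan is due to the accrediting authority."""
    return report_received + timedelta(days=CORRECTIVE_PLAN_DUE_DAYS)

# Example: an assessment completed 2001-05-25 requires the report by 2001-06-24;
# a report received 2001-06-20 requires the corrective action plan by 2001-07-20.
```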

3.5.7 Assessment Closure
After reviewing the assessment report and any completed corrective action(s) reported by the
laboratory, the accrediting authority makes the determination of the accreditation status for a
laboratory.
If the deficiencies listed in the initial assessment report are substantial or numerous, an additional on-site
assessment may be conducted before a final accreditation decision is made, following the procedures of
the accrediting authority.
3.6 STANDARDS FOR ASSESSMENT
3.6.1	Areas of Assessment
The areas to be evaluated during an on-site assessment to determine the competence of an
environmental laboratory shall include:
a)	Organization and Management
b)	Quality System - Establishment, Assessments, Essential Quality Controls and Data Verification
c)	Personnel
d)	Physical Facilities - Accommodation and Environment
e)	Equipment and Reference Materials
f)	Measurement Traceability and Calibration
g)	Test Methods and Standard Operating Procedures
h)	Sample Handling, Sample Acceptance Policy and Sample Receipt
i) Records
j) Laboratory Report Format and Contents
k) Subcontracting of Analytical Samples
l) Outside Support Services and Supplies
m) Complaints
These areas must be evaluated against the standards detailed in Chapter 5, Quality Systems,
Chapter 2, Proficiency Testing and Chapter 4, Accreditation Process of the NELAC Standards and
the appropriate method references. Sufficient detail is provided in Chapter Five (5) and/or the method
reference(s) cited to enable accrediting authorities to evaluate laboratories consistently and uniformly.
3.6.2	Assessor's Role
The on-site assessor uses a variety of tools in the assessment process. The experience of the
assessor, his/her observations, interviews with laboratory staff, and examination of SOPs, raw data,
and the laboratory's documentation all play important roles in the assessment. The accreditation of
a particular laboratory depends primarily upon the assessment team's findings. Much of the on-site
assessment depends upon the assessor's observations of existing conditions (i.e. observing
operations and processes). The recommendation not to accredit a laboratory, or to change a
laboratory's accreditation status, must be based on factual information and not upon subjective
evaluations. Therefore, it is crucial that the on-site assessor have a clear understanding of the
laboratory's procedures and policies and that the assessor document any deficiencies in the
assessment report of the on-site assessment. The assessment team must use specific documentation
in its reporting of deficiencies.
During the assessment, sufficient information may become available to suspect that a particular
person has violated an environmental law or regulation, such as knowingly making a false statement
on a report. This information must be carefully documented since further action may be necessary.
In the event that there is evidence that improper and/or potentially illegal activities have occurred or
may have occurred, the assessment team must present such information to the accrediting authority for
appropriate action(s). These issues, at the discretion of the accrediting authority, may or may not be
raised at the closing conference. However, the assessor must continue to gather the information
necessary to complete the accreditation assessment.
3.6.3	Use of Checklists
Standardized checklists must be used for the on-site assessment. The use of checklists does not
replace the need for assessor observations and staff interviews, but is another tool that assists in
conducting a thorough and efficient assessment. A checklist is not a substitute for assessor training
and experience.
3.6.4	Standards of Professional Conduct for Assessors
Professional standards apply to every NELAC assessor, whether a government employee or an
employee of a third party organization conducting assessments under an agreement with a NELAP
accrediting authority. Assessors that knowingly engage in unprofessional activity may be liable for
punitive actions as initiated by the affected accrediting authority.
The Standards for Professional Conduct, as outlined in this section, are based upon 5 CFR 2635,
"Standards of Ethical Conduct for Employees of the Executive Branch" and will be followed in NELAP
related matters. NELAC assessors shall:
a)	have no interest at play other than that of the accrediting authority and NELAC during the entire
accreditation process;
b)	act impartially and not give preferential treatment to any organization or individual;
c)	provide equal treatment to all persons and organizations regardless of race, color, religion, sex,
national origin, age, and/or disability;
d)	not use their position for private gain;
e)	not solicit or accept any gift or other item of monetary value from any laboratory, laboratory
representative, or any other affected individual or organization doing business with, or affected
by, the actions of the assessor's employer or accrediting authority;
f)	not hold financial interests that conflict with the conscientious performance of their duties;
g)	not engage in financial transactions using information gained through their positions as assessors
to further any private interest;

h)	not engage in employment activities (seeking or negotiating for employment) or attempt to
arrange contractual agreements with a laboratory that would conflict with their duties and
responsibilities as an assessor;
i)	not knowingly make unauthorized commitments or promises of any kind purporting to bind the
affected accrediting authority; and
j) attempt to avoid any actions that could create even the appearance that they are violating any
of the standards of professional conduct outlined in this section.
Assessors are reminded that it is their responsibility to report to the affected accrediting authority any
personal issues or activities that constitute a conflict of interest before an assessment occurs. It is up
to the affected accrediting authority to determine if the reported issues and activities regarding a
specific assessor constitute, or could be construed as, a conflict of interest. Appeals of decisions made by
accrediting authorities regarding such matters must be directed to the Executive Director of the
NELAC, who shall make the final decision as to the merit of such appeals.
3.7 DOCUMENTATION OF ON-SITE ASSESSMENT
3.7.1	Checklists/Records
The checklists used by the assessors during the assessment shall become a part of the permanent
file kept by the accrediting authority for each laboratory. The assessor shall specify the laboratory
records, documents, equipment, procedures, or staff evaluated and the observations that contributed
to the evaluation of "No" for each assessment checklist item. This information must be documented
in the comments section or referenced on the checklist. The assessment report must contain
sufficient evidence to support all assessment findings and the overall evaluation of the laboratory.
3.7.2	Report Format
The final assessment report shall be written to contain a description of the adequacy of the laboratory
as it relates to the assessment standards in Section 3.6.1. Assessment reports must be generated
in a narrative format. Documentation of existing conditions at the laboratory must be included in each
report to serve as a baseline for future contacts with the facility.
Assessment reports must contain:
a)	Identification of the organization assessed (name and address),
b)	Date of the assessment,
c)	Identification and affiliation of each assessment team member,
d)	Identification of participants in the assessment process,
e)	Statement of the objective of the assessment,
f)	Summary,
g)	Assessment observations, findings (deficiencies) and requirements, and,
h)	Comments and recommendations.

The Findings and Requirements section must be referenced to the NELAC standards so that both the
finding (deficiency) is understood and the specific requirement is outlined. The team leader shall
assure that the results within the final assessment report conform to established standards for the
evaluated parameters.
The Comments and Recommendations section can be used to convey recommendations aimed at
helping the laboratory improve.
3.7.3	Distribution
The accrediting authority shall be recognized as having the responsibility for the distribution of the
assessment reports. The assessment team leader shall compile, edit and submit the final report to
the accrediting authority.
3.7.4	Release of On-site Assessment Report
On-site assessment reports must be released initially by the accrediting authority only. The reports
will be released to the responsible laboratory official(s). The assessment report shall not be released
to the National Accreditation Database and the public until findings of the assessment and the
corrective actions have been finalized, all Confidential Business Information and information related
to national security has been stricken from the report in accordance with prescribed procedures, and
the report has been provided to the laboratory (see Section 4.1.3).
In accordance with the Freedom of Information requirements, any documentation adjudged to be
proprietary, financial and/or trade information, or relevant to an ongoing enforcement investigation,
must be considered exempt from release to the public.
3.7.5	Record Retention Time
Copies of all assessment reports, checklists, and laboratory responses must be retained by the
accrediting authority for a period of at least five (5) years, or longer if required by specific State or
Federal regulations (see Sections 4.3.3 & 5.12.2(b)).
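As a minimal illustration of the retention rule above, the sketch below computes the earliest date a record could be discarded, assuming the five-year minimum and treating any stricter State or Federal period as added years; the function name and this way of modeling the override are assumptions, not part of the standard.

```python
from datetime import date

MINIMUM_RETENTION_YEARS = 5   # Section 3.7.5 minimum; State/Federal rules may require longer

def earliest_disposal_date(record_date: date, additional_years: int = 0) -> date:
    """Earliest date a report, checklist, or laboratory response may be discarded."""
    years = MINIMUM_RETENTION_YEARS + additional_years
    try:
        return record_date.replace(year=record_date.year + years)
    except ValueError:  # record dated February 29 of a leap year
        return record_date.replace(year=record_date.year + years, day=28)
```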

-------
ON-SITE ASSESSMENT
APPENDIX A
NELAC BASIC ASSESSOR TRAINING
(EFFECTIVE JULY 1, 2001)

(Effective July 1, 2001)
Appendix A - NELAC BASIC ASSESSOR TRAINING
A.1 INTRODUCTION
Appendix A specifies the minimum standards for NELAC Basic Assessor Training Courses. This
appendix must be used to design basic training courses for laboratory assessors. Appendix A and
its technical counterpart, Appendix B, specify the principal elements of NELAC laboratory assessor
training courses.
A.2 COURSE PURPOSE
The purpose of the NELAC Basic Assessor Training Course is to fulfill the Basic Training requirement
for assessors specified in Section 3.1 of the NELAC Standards.
The Basic Assessor Training Course:
Instructs assessors on the basic elements of performing NELAC assessments by focusing on
evaluating laboratory quality systems and the competency of the laboratory to perform the test
methods on the scope of accreditation.
Provides an overview of the NELAC Standards and the NELAP laboratory accreditation process.
Promotes uniformity of laboratory assessments performed to obtain NELAP accreditation.
Facilitates information exchange among assessors.
A.3 COURSE LOGISTICS
The course subject matter and content must be organized in modules or discrete units. Although the
order of instructional modules or units is not strictly prescribed, courses must be organized
systematically and logically to allow the best assimilation and comprehension of their subject matter.
The course contents can be delivered in a traditional classroom, by teleconferencing, in computer on-
line sessions, or by a combination of any of these media. The format for instruction modules or units
must be appropriate to the subject matter and can include, but is not limited to, lectures, discussions,
demonstrations, critiques, group exercises, written assignments, simulations, fictitious reenactments,
or a combination of any of these. Regardless of the medium or format used for content delivery, all
courses must provide opportunity for ample interaction between instructors and participants and must
include exercises designed to be completed by teams of participants.
A.3.1 Duration
The duration of the course will depend upon the participants' experience and the course's mode of
delivery, but must be sufficient to allow fulfilling all the objectives contained in Section A.2 and to cover
the content specified in Section A.4.
A.3.2 Providers, Instructors, and Participants
Providers of NELAC Basic Assessor Training Courses shall ensure that the number of instructors
assigned to a course is commensurate with the number of participants attending and the delivery
mode of the course. Although other instructor-to-student ratios may be acceptable, a typical Basic
Assessor Training Course delivered in a traditional classroom setting assigns one instructor for every
15 participants.
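The guideline above is a simple ceiling calculation. The sketch below is illustrative only; the function name and its defaults are assumptions and do not replace the provider's judgment.

```python
import math

def suggested_instructor_count(participants: int, per_instructor: int = 15) -> int:
    """Instructors suggested by the one-per-fifteen classroom guideline."""
    return max(1, math.ceil(participants / per_instructor))

# suggested_instructor_count(15) -> 1; suggested_instructor_count(16) -> 2
```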

Instructors must maintain credentials and qualification statements and must make them available to
course participants or other interested parties.
Accrediting authorities shall approve training for their assessors. Providers of NELAC Basic Training
Courses shall not claim NELAP approval of their courses and are restricted from using the NELAC and
NELAP logos in any course or promotional materials.
This Appendix does not limit course participants to those employed by accrediting authorities. All
participants, regardless of the course delivery mode, must register prior to taking a course. Providers
must maintain records that identify participating students and their status (i.e. whether they have
attended the course or completed one by passing an examination); however, it is the responsibility
of accrediting authorities to qualify and approve their assessors.
Providers must update established courses and existing training materials to reflect any changes in
effect made to the NELAC standards.
A.3.3 Course Documentation Supplied to Participants, Final Examination, and Certificates
After receiving completed registration forms including fees (where charged), providers shall send
participants a course agenda. The course agenda should contain titles of the instructional modules
and units with a timetable, and should be sent to candidates in sufficient time to be read before the
course. Providers must also provide with the agenda a copy of the NELAC Standards and the Quality
System Checklist in effect at the time of the course.
A.3.4 Final Examination
Participants must be offered an opportunity to take a written examination that quantitatively measures
their knowledge of the NELAC standards and the course contents. Until such time as NELAP or a
designated body can maintain a controlled set of questions to be used in written examinations,
providers shall design their own questions and grading criteria. Participants who obtain 70% or more
correct answers on the final examination are classified as having successfully completed the course.
A.3.5 Attendance or Completion Certificate
Course providers shall issue certificates to those participants who attend all the offered modules or
instructional units and to those who successfully complete the course. A "Certificate of Attendance"
containing a brief description of the course shall be issued to participants who choose not to take the
final examination or who do not successfully complete the course, but who have attended all the
modules or instructional units.
Participants who attend all the instruction modules and who successfully complete the course shall
be issued a "Certificate of Completion".
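The grading rule in A.3.4 and the certificate rules in A.3.5 amount to a small decision table. The sketch below is a hypothetical reading of those rules; the function names and the result encoding are assumptions made for illustration.

```python
from typing import Optional

PASSING_FRACTION = 0.70   # A.3.4: 70% or more correct answers

def passed_examination(correct: int, total: int) -> bool:
    """True when the participant answered at least 70% of the questions correctly."""
    return total > 0 and correct / total >= PASSING_FRACTION

def certificate_awarded(attended_all_units: bool,
                        correct: Optional[int] = None,
                        total: Optional[int] = None) -> Optional[str]:
    """A.3.5: full attendance plus a passing examination earns a Certificate of
    Completion; full attendance without a passing examination earns a
    Certificate of Attendance; otherwise no certificate is issued."""
    if not attended_all_units:
        return None
    if correct is not None and total is not None and passed_examination(correct, total):
        return "Certificate of Completion"
    return "Certificate of Attendance"
```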
A.3.6 Appraisal of Course by Participants
Participants shall be offered an evaluation form at the end of the course to invite feedback to
providers about the course's quality and content. Such forms shall be available to accrediting
authorities and to NELAP upon request.
Providers are also encouraged to include in their courses an open session where participants
evaluate a course and offer direct feedback to instructors.

A.4 COURSE CONTENTS
The contents of the Basic Assessor Training Course must address the following items.
A.4.1 Introduction
The purpose of this module is to establish the intent and tone of the course. It should create an
atmosphere that will encourage participation, feedback, and questions, and should clarify participant
expectations about the intent and content of the course.
This module should provide an opportunity to:
1.	Welcome participants
2.	Introduce course content
3.	Describe method of assessment of participants
4.	Describe administrative and physical arrangements (e.g. lunches, telephone, timing)
5.	Have participants introduce themselves
A.4.2 Historical Perspective on National Accreditation
This course module will provide a background on laboratory accreditation and the history included in
Chapter 1 of the NELAC standard. The historical perspective and overview of the requirements of
assessors should enable participants to understand the benefits of national accreditation and how a
uniform national accreditation process will improve the quality of environmental data.
1.	The Need for National Accreditation
2.	Past Efforts toward National Consistency
3.	Genesis of the National Environmental Laboratory Accreditation Program (NELAP)
A.4.3 Fundamentals of NELAC and NELAP
The purpose of this module is to familiarize the course participants with the function and structure of
NELAC, NELAP, and the essential role that the accrediting authorities have in the laboratory
accreditation process. The module should establish for each participant a working knowledge of
NELAC and the mechanics of the program.
What is NELAC?
1.	Objectives of NELAC
2.	Structure and Operation of NELAC
a. NELAC Standards
3.	What is NELAP?
a. Current Status of NELAP
4.	Structure and Operation of NELAP
5.	Primary Accrediting Authorities
a.	Requirements and Functions of Primary Accrediting Authorities
b.	Process for Recognition of Accrediting Authorities
6.	Secondary Accrediting Authorities
a.	Requirements and Functions of Secondary Accrediting Authorities
b.	Reciprocal Accreditation
7.	National Accreditation Database
8.	Scope of Accreditation

A.4.4 Qualifications and Training Requirements for Assessors
The purpose of this module is to examine the requirements for becoming a qualified NELAC Assessor
as defined in Chapter 3. At the end of the session each participant should understand the process
and timing involved for becoming a NELAC assessor.
1.	Basic Qualifications
a.	Qualification by an Accrediting Authority
b.	Absence of Conflict of Interest Certification
2.	Purpose of Training Assessors
3.	Basic Assessor Training
4.	Technical Training
5.	Refresher Training
A.4.5 Accreditation of Laboratories
The purpose of this module is to define the NELAC laboratory accreditation process. Participants
should understand the requirements of laboratories seeking accreditation and the process through
which accreditation is granted.
1.	Accreditation Requirements
2.	Order of the Accreditation Process
3.	Role of the Laboratory Assessor in Accreditation of Laboratories
4.	Personnel Qualifications
A.4.6 Proficiency Testing
The purpose of this module is to provide a comprehensive view of the role that proficiency testing
(PT) plays in the accreditation process. Participants should understand the importance of proficiency
testing, the requirements for PT providers and laboratories, and the elements of the PT process that
should be assessed during the on-site assessment.
1.	Purpose of Proficiency Testing
2.	Definitions
3.	Mechanisms, Criteria, Current Programs, Follow-Up Actions
4.	Oversight and Delivery of Proficiency Testing Program
a.	Proficiency Testing Providers
b.	Proficiency Testing Oversight Body
c.	Primary Accrediting Authorities
5.	Laboratory Requirements
a.	Types of PT Samples Required to be Analyzed
i. PT Fields of Testing
b.	Frequency of PT Sample Analysis
c.	Requirements for Handling and Analyzing PT Samples
6.	Role of the Laboratory Assessor in Reviewing PT Sample Data
A.4.7 Ethical Conduct Standards for Assessors
This module will review the elements of ethical conduct of assessors, establishing an expectation that
assessor conduct be "above reproach," and the consequences of unethical conduct. In addition, the
module will examine circumstances when an assessor activity might constitute a potential conflict of
interest, and the need for disclosure. At the end of this session, participants should know the NELAC
expectations and requirements for assessor conduct.

1.	Professional Conduct of Assessors
2.	Defining, Determining, and Avoiding Conflicts of Interest for Assessors
A.4.8 Quality Systems
This module establishes the fundamental components of a quality system and trains assessors on
how to evaluate them. It requires a group exercise in which a laboratory's quality manual is evaluated
for conformance with the NELAC Standards. This case study can be used to emphasize the
importance of key quality system elements.
1.	Definition of a Quality System
a.	Quality Assurance
b.	Quality Control
c.	Elements of a Quality System
2.	Quality System Requirements for Laboratories
a.	Quality Manual
b.	Quality Assurance Policies and Procedures
c.	Standard Operating Procedures
d.	Corrective Actions
e.	Document and Records Control
f.	Data Review and Evaluation
3.	Monitoring and Effectiveness of the Quality System
a.	Internal Audits
b.	Management review
A.4.9 NELAC Quality System Checklist
This module will explore the proper use of the Quality Systems Checklist, including how and when
the checklist should be completed, and the techniques that a good assessor follows when using any
checklist. At the end of this module, participants should be familiar with the Quality Systems Checklist
and how it relates to NELAC Chapter 5. Participants will learn how to use the Quality Systems
Checklist as an assessment tool, rather than as the primary vehicle of the assessment.
1.	Purpose
2.	Mandatory Use
3.	Use of the Quality Systems Checklist Before, During, and After Laboratory Assessments
4.	Procedure for Documentation of Findings
A.4.10 Interviewing Techniques for Assessors
The purpose of this module is to instruct participants on good interviewing techniques and the
personal dynamics of an on-site assessment. Participants will learn communication skills, including
effective questioning techniques; methods for gathering information in an objective and professional
manner; and potential ethical concerns. Group exercises and simulations are particularly effective in
this sub-unit.
1.	Utility of Interviews During Laboratory Assessments
2.	Interview Structure
3.	Verbal and Non-Verbal Communication
4.	Modes of Gathering Information
5.	Ways of Asking Questions
6.	Dealing with Difficult Interviewees

A.4.11 NELAC Laboratory Assessments
This module of the course presents all phases of the assessment process: pre-assessment, on-site
assessment, and post-assessment activities. The session should instruct participants in the use of
assessment tools (e.g., observation, interviewing, documentation review, and tracking) to review the
quality system, documented test procedures, test method validation, and the technical competence
of a laboratory.
1.	Purpose of Assessments
2.	Frequency and Types of Assessments
3.	Phases of an Assessment
A.4.11.1 Pre-Assessment Activities
1.	Planning an Assessment
a.	Scope of an Assessment
b.	Appointment of Lead Assessor and other Team Members
c.	Roles of Assessment Team Members
2.	Document review
a.	PT Sample results
b.	Quality Manual
c.	Corrective Action Reports and Plans
3.	Previous Assessment Reports
4.	Preparation of Agenda and Schedule
5.	Notifications
A.4.11.2 On-site Assessment Components
A "mock" assessment exercise can be used during this sub-unit to instruct participants on the
components of on-site assessments.
A.4.11.2.1 Opening Conference
1.	Schedule and Agenda
2.	Assessment Appraisal Form
3.	Confidential Business Information (CBI)
A.4.11.2.2 Facility Walk-Through
A.4.11.2.3 On-site Assessment Proper
1.	Use of the Quality Systems Checklist
2.	Detailed Tour and Observation of Operations
3.	Staff Interviews
4.	Calibration and Traceability of measurements
5.	Data and Document review
6.	Records retention and Reporting
A.4.11.2.4 Assessment Team Meetings
A.4.11.2.5 Closing Conference
1. Reporting Non-Conformances

A.4.11.3 Post On-site Assessment Activities
During this sub-unit participants should be instructed on how to correctly cite instances of non-
conformance in assessment reports as well as effective ways of formatting them. Critiques of
fictitious reports, or a writing assignment in which participants write a report of a "mock" assessment,
are particularly effective in this sub-unit.
1.	On-site Assessment Report
2.	Report Format
3.	Report Release
4.	Corrective Action Reports in Response to On-site Assessment
5.	Surveillance and Re-Assessment
6.	Retention of Assessment Documents
A.4.12 Handling Assessment Challenges
The purpose of this sub-unit is to identify effective methods of handling potential problems during an
assessment. Participants should gain useful conflict resolution tools during this session. Group
exercises and simulations can be used effectively in this sub-unit.
1.	Dealing with Improper Practices and Potentially Illegal Activities
2.	Dealing with Unexpected Circumstances
3.	Technical Disagreements
4.	Absence of Key Laboratory Personnel
5.	Hostile Reception
6.	Conduct of Assessors During On-site Assessments
A.5 COURSE SUMMARY AND CONCLUSIONS
This module should conclude the instructional components of the course. It should present a course
review that gives a global perspective of the purpose of NELAC and the laboratory assessment
process. Participants should be given an opportunity to ask final questions about specific aspects of
the assessment and accreditation process at this time.
A.6 FINAL EXAMINATION
The last module of the course is the final examination. The examination determines whether a
participant has sufficient knowledge of the NELAC Standards and effective assessment procedures
to be a NELAC assessor.
A.7 REFERENCES
1. ILAC-G3:1994, "Guidelines for Training Courses for Assessors Used by Laboratory Accreditation
Schemes"

-------
ON-SITE ASSESSMENT
APPENDIX B
TECHNICAL TRAINING COURSES FOR
ASSESSORS
(EFFECTIVE JULY 1, 2001)

(Effective July 1, 2001)
Appendix B - TECHNICAL TRAINING COURSES FOR ASSESSORS
B.1 INTRODUCTION
The purpose of the technical training courses is to ensure consistency of technical knowledge among
the NELAC assessors. Prerequisites for an assessor taking the training course are:
1.	Basic knowledge of the technology, i.e. familiarity with the principles and application of the
technology used by the laboratory.
2.	An understanding of Quality Systems.
The technical courses must concentrate on the elements and details of the technology and/or methods
that are critical to assuring that the laboratory is implementing them properly.
Technical training courses provided to meet the requirements defined in Section 3.2.3 of the NELAC
Standard must address the elements listed below. Assessor technical training courses must also focus
on how to review these elements during the on-site assessment. The skills obtained during these
training courses must also enable assessors to evaluate quality systems components present in the
laboratory, as they relate to technical disciplines, to ensure compliance with the NELAC Standard.
B.2 COURSE CONTENT
Technical training courses must provide, identify, or review:
Basic theoretical and operating principles of the analytical technology and associated
instrumentation and software.
Critical steps and processes of the analytical technology or technique that must be executed to
ensure quality data, including critical quality control (QC) measures and QC criteria based on the
technology.
Major sources of error, and how to control them, for the analytical technology or technique.
Inappropriate procedures or practices for the analytical technology or technique.
Key information required to document completely the reported results.
Essential elements for assessing data generated.
Ways to detect improper practices.
Exercises in the evaluation of raw data to reported results.
The training course must also include an examination covering the material presented to ensure an
understanding of the above elements. Results of the examination will be submitted to the accrediting
authority for action. All attendees will receive a course certificate.

B.3 COURSE OBJECTIVES
The assessors successfully completing the course shall have acquired the following:
1.	Knowledge sufficient to assess the implementation of the technology by the laboratory.
2.	An understanding as to how the technology is used in the various methods.
3.	An understanding of which key elements of data packages and raw data to review and check
effectively.

-------
ACCREDITATION
PROCESS
Approved May 25, 2001
Effective July 1, 2003 unless otherwise noted


-------
TABLE OF CONTENTS
ACCREDITATION PROCESS
4.0	ACCREDITATION PROCESS 	 1
4.1	COMPONENTS OF ACCREDITATION	 1
4.1.1	Personnel Qualifications 		1
4.1.1.1	Definition, Technical Director(s) 		2
4.1.1.2	Personnel Qualification Clarifications and Exceptions 		3
4.1.2	On-site Assessments		3
4.1.3	Corrective Action Reports In Response to On-Site Assessment		4
4.1.4	Proficiency Testing Samples		5
4.1.5	Accountability for Analytical Standards 		5
4.1.6	Fee Process for National Accreditation		5
4.1.7	Application		6
4.1.7.1	Primary Application Package 		6
4.1.7.2	Secondary Accreditation Package 		6
4.1.8	Change of Ownership and/or Location of Laboratory 		7
4.1.9	"Certification of Compliance" Statement 		7
4.2	PERIOD OF ACCREDITATION 	 8
4.3	MAINTAINING ACCREDITATION 	 8
4.3.1	Quality Systems	 8
4.3.2	Notification and Reporting Requirements 	 9
4.3.3	Record Keeping and Retention 	 9
4.4	DENIAL, SUSPENSION, AND REVOCATION OF ACCREDITATION 	 9
4.4.1	Denial	 9
4.4.2	Suspension 	 10
4.4.3	Revocation	 10
4.4.4	Voluntary Withdrawal	 11
4.5	INTERIM ACCREDITATION 	 11
4.5.1	Interim Accreditation 	 11
4.5.2	Revocation of Interim Accreditation 	 12
4.6	AWARDING OF ACCREDITATION	 12
4.6.1	Use of NELAC Accreditation by Accredited Laboratories 	 12
4.6.2	Changes in Fields of Accreditation 	 12
4.7	DUE PROCESS 	 12
4.8	ENFORCEMENT 	 13

4.0	ACCREDITATION PROCESS
(NB. MANY OF THE STANDARDS AND ELEMENTS LISTED IN THIS CHAPTER ARE
REFLECTIVE OF STANDARDS SET FORTH IN CHAPTERS DEALING WITH DETAILED
EXPLANATIONS OF THESE ELEMENTS. THEREFORE, IT IS ANTICIPATED THAT SOME OF
THE DETAILS MAY CHANGE AS THE DISCUSSIONS AND CONCLUSIONS IN THESE
CHAPTERS CHANGE.)
Laboratories applying for accreditation may be fixed-base or mobile.
a)	An individual fixed-base laboratory requires a separate accreditation. The primary accrediting
authority shall determine what constitutes an individual fixed-base laboratory when noncontiguous
laboratory facilities operate under the same ownership, technical directorship, and quality system
as the parent laboratory.
b)	The primary accrediting authority shall determine if a separate accreditation is required for a
mobile laboratory that is owned by an accredited fixed-base laboratory, operates under the same
quality system as the fixed-base laboratory, performs a subset of the analyses for which the
fixed-base laboratory is accredited, and analyzes samples exclusively from within the state in
which the parent fixed-base laboratory is located.
c)	Separate accreditation by the primary accrediting authority is required for a mobile laboratory that
is owned by an accredited fixed-base laboratory, operates under the same quality system as
the fixed-base laboratory, performs a subset of the analyses for which the fixed-base laboratory
is accredited, and analyzes samples from outside of the state in which the parent fixed-base
laboratory is located.
d)	Separate accreditation by the primary accrediting authority is required for a mobile laboratory that
is owned by a fixed-base laboratory but operates under a different quality system or performs
analyses for which the parent fixed-base laboratory is not accredited.
e)	Separate accreditation by the primary accrediting authority is required for a mobile laboratory that
is not owned and operated by a fixed-base laboratory.
4.1	COMPONENTS OF ACCREDITATION
The components of accreditation include review of personnel qualifications, on-site assessment,
proficiency testing and quality assurance/quality control standards. These criteria must be fulfilled
for accreditation. The components and criteria are herein described. Details of some of the
requirements described below will be found in other sections of these Standards.
4.1.1 Personnel Qualifications
Persons who do not meet the education credential requirements but possess the requisite experience
of Section 4.1.1.1 of the NELAC standards shall qualify as technical director(s) subject to the
following conditions.
a)	The person must be a technical director of the laboratory on the date the laboratory applies for
NELAP accreditation and/or becomes subject to NELAP accreditation, and must have been a
technical director in that laboratory continuously for the previous 12 months or more.
b)	The person will be approved as a technical director for only those fields of accreditation for which
he/she has been technical director in that laboratory for the previous 12 months or more.
c)	A person who is admitted as a technical director under these conditions, and leaves the
laboratory, will be admitted as technical director for the same fields of accreditation in another
NELAP laboratory.

d) A person may initially be admitted as a technical director under the provisions of this section
during the first twelve months that the primary accrediting authority offers the NELAP fields of
accreditation for which the person seeks to be technical director or during the first twelve months
that the program is required by the state in which the laboratory is located.
4.1.1.1 Definition, Technical Director(s)
The technical director(s) means a full-time member of the staff of an environmental laboratory who
exercises actual day-to-day supervision of laboratory operations for the appropriate fields of
accreditation and reporting of results. The title of such person may include but is not limited to
laboratory director, technical director, laboratory supervisor or laboratory manager. A laboratory may
appoint one or more technical directors for the appropriate fields of accreditation for which it is
seeking accreditation. The name(s) of the technical director(s) must appear in the national database. This person's duties shall
include, but not be limited to, monitoring standards of performance in quality control and quality
assurance; monitoring the validity of the analyses performed and data generated in the laboratory to
assure reliable data. An individual shall not be the technical director(s) of more than one accredited
environmental laboratory without authorization from the primary Accrediting Authority. Circumstances
to be considered in the decision to grant such authorization shall include, but not be limited to, the
extent to which operating hours of the laboratories to be directed overlap, adequacy of supervision
in each laboratory, and the availability of environmental laboratory services in the area served. The
technical director(s) who is absent for a period of time exceeding 15 consecutive calendar days shall
designate another full-time staff member meeting the qualifications of the technical director(s) to
temporarily perform this function. If this absence exceeds 65 consecutive calendar days, the primary
accrediting authority shall be notified in writing.
Qualifications of the technical director(s).
a)	Any technical director of an accredited environmental laboratory engaged in chemical analysis
shall be a person with a bachelor's degree in the chemical, environmental, biological sciences,
physical sciences or engineering, with at least 24 college semester credit hours in chemistry and
at least two years of experience in the environmental analysis of representative inorganic and
organic analytes for which the laboratory seeks or maintains accreditation. A master's or doctoral
degree in one of the above disciplines may be substituted for one year of experience.
b)	Any technical director of an accredited environmental laboratory limited to inorganic chemical
analysis, other than metals analysis, shall be a person with at least an earned associate's degree
in the chemical, physical or environmental sciences, or two years of equivalent and successful
college education, with a minimum of 16 college semester credit hours in chemistry. In addition,
such a person shall have at least two years of experience performing such analysis.
c)	Any technical director of an accredited environmental laboratory engaged in microbiological or
biological analysis shall be a person with a bachelor's degree in microbiology, biology, chemistry,
environmental sciences, physical sciences or engineering with a minimum of 16 college semester
credit hours in general microbiology and biology and at least two years of experience in the
environmental analysis of representative analytes for which the laboratory seeks or maintains
accreditation. A master's or doctoral degree in one of the above disciplines may be substituted
for one year of experience.
A person with an associate's degree in an appropriate field of the sciences or applied sciences,
with a minimum of four college semester credit hours in general microbiology may be the
technical director(s) of a laboratory engaged in microbiological analysis limited to fecal coliform,
total coliform and standard plate count. Two years of equivalent and successful college
education, including the microbiology requirement, may be substituted for the associate's degree.
In addition, each person shall have one year of experience in environmental analysis.
d)	Any technical director of an accredited environmental laboratory engaged in radiological analysis
shall be a person with a bachelor's degree in chemistry, physics or engineering with 24 college
semester credit hours of chemistry with two or more years of experience in the radiological
analysis of environmental samples. A master's or doctoral degree in one of the above disciplines
may be substituted for one year of experience.
e)	The technical director(s) of an accredited environmental laboratory engaged in microscopic
examination of asbestos and/or airborne fibers shall meet the following requirements:
i)	For procedures requiring the use of a transmission electron microscope, a bachelor's
degree, successful completion of courses in the use of the instrument, and one year of
experience, under supervision, in the use of the instrument. Such experience shall
include the identification of minerals.
ii)	For procedures requiring the use of a polarized light microscope, an associate's degree
or two years of college study, successful completion of formal coursework in polarized
light microscopy, and one year of experience, under supervision, in the use of the
instrument. Such experience shall include the identification of minerals.
iii)	For procedures requiring the use of a phase contrast microscope, as in the determination
of airborne fibers, an associate's degree or two years of college study, documentation
of successful completion of formal coursework in phase contrast microscopy, and one
year of experience, under supervision, in the use of the instrument.
f)	Any technical director of an accredited environmental laboratory engaged in the examination of
radon in air shall have at least an associate's degree or two years of college and one year of
experience in radiation measurements, including at least one year of experience in the
measurement of radon and/or radon progeny.
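To illustrate how one of these qualification rules can be checked mechanically, the sketch below encodes only the chemical analysis category from the list above: a qualifying bachelor's degree, at least 24 semester hours of chemistry, and two years of relevant experience, with a master's or doctoral degree substituting for one year. The field names and simplified degree categories are assumptions made for this example, not part of the standard.

```python
from dataclasses import dataclass

QUALIFYING_FIELDS = {"chemical", "environmental", "biological", "physical", "engineering"}

@dataclass
class Candidate:
    degree_level: str               # "bachelor", "master", or "doctorate"
    degree_field: str               # e.g. "chemical"
    chemistry_semester_hours: int
    years_relevant_experience: float

def qualifies_as_chemistry_director(c: Candidate) -> bool:
    """Bachelor's (or higher) degree in a qualifying discipline, at least 24 semester
    hours of chemistry, and two years of relevant experience; a master's or doctoral
    degree may substitute for one of the two years."""
    if c.degree_level not in ("bachelor", "master", "doctorate"):
        return False
    if c.degree_field not in QUALIFYING_FIELDS:
        return False
    if c.chemistry_semester_hours < 24:
        return False
    required_years = 2.0
    if c.degree_level in ("master", "doctorate"):
        required_years -= 1.0
    return c.years_relevant_experience >= required_years
```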
4.1.1.2 Personnel Qualification Clarifications and Exceptions
a)	Notwithstanding any other provision of this section, a full-time employee of a drinking water or
sewage treatment facility who holds a valid treatment plant operator's certificate appropriate to
the nature and size of such facility shall be deemed to meet the educational and experience
requirements for serving as the director of the accredited laboratory devoted exclusively to the
examination of environmental samples taken within such facility system. Such accreditation for
a water treatment facility and/or a sewage treatment facility shall be limited to the scope of that
facility's regulatory permit, and when the facility's laboratory is analyzing water treatment/sewage
treatment samples collected within the state where the laboratory is situated, the scope of
accreditation shall be determined by the accrediting authority.
b)	A full-time employee of an industrial waste treatment facility with a minimum of one year of
experience under supervision in environmental analysis shall be deemed to meet the
requirements for serving as the director of an accredited laboratory devoted exclusively to the
examination of environmental samples taken within such facility for the scope of that facility's
regulatory permit. Such accreditation for an industrial waste treatment facility shall be limited to
laboratories analyzing industrial waste treatment samples collected within the state where the
laboratory is situated, and the scope of accreditation shall be determined by the state accrediting
authority.
4.1.2 On-site Assessments
On-site assessments are a requirement of the Accreditation Process, and a summary of the process
requirements is provided here. Refer to On-site Assessment (Chapter 3) for additional information
regarding frequency, procedures, criteria, scheduling and documentation of on-site assessments. On-
site assessments shall be of two types: announced and unannounced. The on-site assessment of
each accredited laboratory must be performed a minimum of one time per two years. On-site
assessments may be conducted more frequently for cause or at the option of the primary accrediting
authority. Situations which might trigger more frequent on-site assessments include review of a
previously deficient on-site assessment, poor performance on a proficiency testing (PT) sample,
change in other accreditation elements, or other information concerning the capabilities or practices
of the accredited laboratory. The on-site assessment ensures that the environmental laboratory is
in compliance with NELAC standards.
The primary accrediting authority has the responsibility for conducting on-site assessments for
national accreditation based on the following factors:
a)	The assessment may consist of all of the fields of accreditation and/or methods for which the
laboratory wants to obtain accreditation.
b)	The number of assessors conducting the on-site assessment should be appropriate for the
laboratory's scope and testing.
c)	The on-site assessment should be conducted during normal working hours.
Laboratories shall be furnished with a report documenting any deficiencies found by the assessor.
This report shall be known as an assessment report.
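The assessment-frequency rule described at the start of this section can be sketched as a simple scheduling check. The sketch below is illustrative only; the approximation of "two years" as 730 days, the function name, and the single "for cause" flag are assumptions rather than requirements of the standard.

```python
from datetime import date, timedelta

ASSESSMENT_CYCLE = timedelta(days=730)   # working approximation of "one time per two years"

def assessment_due(last_assessment: date, today: date, for_cause: bool = False) -> bool:
    """Due when two years have elapsed since the last on-site assessment, or earlier
    when the primary accrediting authority has cause (deficient prior assessment,
    poor PT performance, changes to accreditation elements, and so on)."""
    return for_cause or (today - last_assessment) >= ASSESSMENT_CYCLE
```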
4.1.3 Corrective Action Reports In Response to On-Site Assessment
A corrective action report must be submitted by the laboratory to the primary accrediting authority in
response to any assessment report received by the laboratory after an on-site assessment. The
corrective action report shall include the action that the laboratory shall implement to correct each
deficiency and the time period required to accomplish the corrective action.
a)	The primary accrediting authority shall present an assessment report to the laboratory within 30
calendar days of the on-site assessment.
b)	After being notified of deficiencies, the laboratory shall have 30 calendar days from the date of
receipt of the assessment report to provide a corrective action report.
c)	The primary accrediting authority shall respond to the action noted in the corrective action report
within 30 calendar days of receipt.
d)	If the corrective action report (or a portion) is deemed unacceptable to remediate a deficiency,
the laboratory shall have an additional 30 calendar days to submit a revised corrective action
report.
e)	If the corrective action report is not acceptable to the primary accrediting authority after the
second submittal, the laboratory shall have accreditation revoked pursuant to Section 4.4.3 for
all or any portion of its scope of accreditation, whether a field of accreditation, a method, or an
analyte within a field of accreditation.
f)	All information included and documented in an assessment report and the corrective action report
is considered to be public information and is to be released pursuant to Chapter 3, Section
3.7.4.
g)	If the laboratory fails to implement the corrective actions as stated in their corrective action report,
accreditation for fields of accreditation, specific methods, or analytes within those fields of
accreditation shall be revoked.
h) Proprietary data, Confidential Business Information and classified national security information
will be excluded from all public records.
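Steps d) and e) above describe an exchange with fixed 30-day windows and at most one resubmittal. The sketch below traces that exchange as a simple decision function; the outcome labels and the idea of passing acceptability judgments as booleans are assumptions made for illustration only.

```python
from typing import Optional

def corrective_action_outcome(first_report_acceptable: bool,
                              revised_report_acceptable: Optional[bool] = None) -> str:
    """Trace steps d) and e): one revised submittal is allowed after an unacceptable
    corrective action report; a second unacceptable report leads to revocation."""
    if first_report_acceptable:
        return "corrective action accepted"
    if revised_report_acceptable is None:
        return "revised corrective action report due within 30 calendar days"
    if revised_report_acceptable:
        return "corrective action accepted on resubmittal"
    return "accreditation revoked pursuant to Section 4.4.3 for the affected scope"
```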

4.1.4	Proficiency Testing Samples
A critical component of laboratory assessments is the analysis of PT samples. Refer to Proficiency
Testing (Chapter 2) for additional information. PT samples are used and evaluated in the
accreditation process as follows:
a)	Each laboratory seeking accreditation must receive and analyze initial PT samples from a
NELAP approved PT study provider for each field of accreditation (program-matrix-analyte) in
which it is requesting accreditation.
b)	Unless otherwise specified by the proficiency testing standard, each laboratory seeking or
maintaining accreditation shall be required to perform analysis of one PT sample twice per year
in each field of accreditation (program-matrix-analyte) for which it has applied for accreditation
or for which it is currently accredited.
c)	The laboratory shall be informed of its score on the PT samples by the primary accrediting
authority or the NELAP approved PT provider within 21 calendar days from the closing date of
submission. The results of all of the PT sample tests, whether acceptable or not acceptable,
shall be part of the public record. PT sample results shall apply to all accredited methods for an
analyte in a particular matrix.
d)	When a laboratory initially requests accreditation, it must successfully analyze two sets of PT
samples, the analyses to be performed 30 calendar days apart. Each set shall contain one
sample for each requested field of accreditation (program-matrix-analyte). When a laboratory has
been granted accreditation status, it must maintain a history of at least two passing results out
of the most recent three for each field of accreditation (program-matrix-analyte).
e)	The results of the PT sample analyses shall be considered by the primary accrediting authority,
in determining whether accreditation should be granted, denied, revoked, or suspended pursuant
to this Chapter, for a field of accreditation (program-matrix-analyte) or an analyte within a field of
accreditation (program-method-analyte).
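The maintenance rule in item d) above, at least two passing results out of the most recent three for each field of accreditation, can be checked as shown below. The result encoding and function name are assumptions; laboratories with fewer than three results fall under the initial-accreditation provisions and are not modeled.

```python
from typing import Sequence

def maintains_pt_history(results: Sequence[bool]) -> bool:
    """True when at least two of the three most recent PT results are acceptable.
    `results` is ordered oldest to newest; True marks an acceptable score."""
    recent = list(results)[-3:]
    return len(recent) == 3 and sum(recent) >= 2

# maintains_pt_history([True, False, True])  -> True
# maintains_pt_history([False, True, False]) -> False
```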
4.1.5	Accountability for Analytical Standards
Elements in NELAP that shall ensure consistency and promote the use of quality assurance/quality
control procedures to generate quality data for regulatory purposes are:
a)	In accordance with Chapter 5, each laboratory seeking or maintaining NELAP accreditation shall
have a named quality assurance officer or a person designated as accountable for data quality.
b)	NELAC requires that each laboratory seeking or maintaining NELAP accreditation have a
developed and maintained Quality Assurance Manual on-site, as required in Chapter 5.
c)	The primary accrediting authority shall consider that the accountability for negligence and the
falsification of data shall rest upon the analyst, the laboratory management and the company.
4.1.6	Fee Process for National Accreditation
Refer to Policy and Structure, Chapter 1, for specific information on funding of this program (Section
1.5.2.3.3).
Where required, and if applicable, the level and timing of fee payments shall be established by the
primary accrediting authority(ies) to which the laboratory is applying for accreditation. Additional fees
on the laboratory may be levied by other secondary accrediting authorities with which the laboratory
chooses to seek accreditation.

4.1.7 Application
The NELAP encompasses a standardized set of elements in each application for accreditation that
shall be reported to and recorded in the national database. The application package includes any
specific State regulatory requirements that are essential for accreditation within an individual State.
4.1.7.1	Primary Application Package
A laboratory seeking accreditation shall complete and submit an application package to the primary
accrediting authority(ies). An accrediting authority participating in NELAP shall include in its
application form the following:
a)	Legal name of laboratory,
b)	Laboratory mailing address,
c)	Billing address (if different from b),
d)	Name of owner,
e)	Address of owner,
f)	Location (full address) of laboratory,
g)	Name and phone number of technical director(s), however named, and the lead technical director
(if applicable),
h)	Name and phone number of Quality Assurance Officer,
i)	Name and phone number of laboratory contact person,
j) Laboratory hours of operation,
k) Primary Accrediting Authority,
l) Fields of accreditation for which the laboratory is requesting accreditation,
m) Methods employed including analytes,
n) Description of laboratory type (for example),
Commercial
Federal
Hospital or health care
State
Academic Institutes
Public water system
Public wastewater system
Industrial (an industry with discharge permits)
Mobile
Other (Describe)	
o) Certification of compliance by laboratory management
(see Section 4.1.9 below),
p) Fee enclosed (if applicable),
q) Description of geographical location,
r) FAX number,
s) Lab identification number,
t) Unique vehicle identification number, such as manufacturer's Vehicle Identification Number
(VIN#), serial number, or license number (if a mobile laboratory), and
u) Quality Manual enclosed (if required with application)
A laboratory seeking renewal of accreditation shall follow the process outlined by the accrediting
authority by which it is currently accredited.
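The required application elements a) through u) above map naturally onto a simple record structure. The sketch below captures a subset of those elements as a Python dataclass; the field names, types, and the choice of which optional elements to include are assumptions for illustration, not a prescribed schema.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class PrimaryApplication:
    """Illustrative subset of the application elements listed in Section 4.1.7.1."""
    legal_name: str                         # a) legal name of laboratory
    mailing_address: str                    # b) laboratory mailing address
    owner_name: str                         # d) name of owner
    laboratory_location: str                # f) location (full address) of laboratory
    technical_directors: List[str]          # g) technical director name(s)
    qa_officer: str                         # h) Quality Assurance Officer
    primary_accrediting_authority: str      # k)
    fields_of_accreditation: List[str]      # l) requested fields of accreditation
    methods_and_analytes: List[str]         # m) methods employed, including analytes
    laboratory_type: str                    # n) e.g. "Commercial", "Mobile"
    compliance_certification_signed: bool   # o) see Section 4.1.9
    vehicle_identification: Optional[str] = None   # t) mobile laboratories only
```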
4.1.7.2	Secondary Accreditation Package
A laboratory seeking accreditation from a secondary accrediting authority(ies) shall complete and
submit a secondary application package as required by the secondary accrediting authority. Refer
to Section 4.2 for the assessment of fees (if applicable) and Section 4.4.1 (1) and (2) for the reasons
to deny a secondary application package.

4.1.8	Change of Ownership and/or Location of Laboratory
Accreditation may be transferred when the legal status or ownership of an accredited laboratory
changes without affecting its staff, equipment, and organization. The primary accrediting authority
may charge a transfer fee and may conduct an on-site assessment to verify the effects of such changes
on laboratory performance.
The following conditions apply to the change in ownership and/or the change in location of a
laboratory that has national accreditation.
a)	Any change in ownership and/or location of an accredited laboratory must be reported in writing
to the primary accrediting authority within 30 calendar days and entered into the national
database by the primary accrediting authority. Required notification for change in location shall
apply only to fixed-base laboratories.
b)	Such a change in ownership and/or location shall not necessarily require reaccreditation or
reapplication in any or all of the categories in which the laboratory is currently accredited.
c)	Change in ownership and/or location may require an on-site assessment with the elements of the
assessment being determined by the primary accrediting authority.
d)	Any change in ownership must assure historical traceability of the laboratory accreditation
number(s).
e)	When there is a change in ownership, all records and analyses performed pertaining to
accreditation must be kept for a minimum of five years and are subject to inspection by the
accrediting authorities during this period without prior notification to the laboratory. This
stipulation is applicable regardless of change in ownership, accountability or liability.
4.1.9	"Certification of Compliance" Statement
The following "Certification of Compliance" statement must accompany the application for laboratory
accreditation. It must be signed and dated by both the laboratory management and the quality
assurance officer, or other designated person, for that laboratory.
CERTIFICATION BY APPLICANT
The applicant understands and acknowledges that the laboratory is required to be continually in
compliance with the (insert the name of the primary accrediting authority) standards and is subject
to the enforcement and penalty provisions of that accrediting authority.

-------
NELAC
Accreditation Process
Revision 14
May 25, 2001
Page 8 of 13
I hereby certify that I am authorized to sign this application on behalf of the applicant/owner and that
there are no misrepresentations in my answer to the questions on this application.
Signature, Quality Assurance Officer	Name of Quality Assurance Officer
(or other designated individual)
Print Name of Applicant Laboratory (Legal Name)	Date
Authorized Agent (Title)
Signature, Technical Director(s)	Name, Technical Director(s)
4.2	PERIOD OF ACCREDITATION
For a laboratory in good standing, the period of accreditation within fields of accreditation for
methods or analytes shall be 12 months and shall be considered ongoing once a laboratory has
been accredited for that method or analyte within a field of accreditation. To
maintain accreditation, the laboratory shall meet the requirements of Section 4.3, Maintaining
Accreditation. Failure to meet the requirements delineated in Section 4.3 shall constitute grounds for
suspension or revocation of accreditation as specified in Section 4.4. Additionally, failure to pay the
required fees to the primary accrediting authority(ies) within the stipulated deadlines or by the
stipulated dates shall result in revocation of accreditation by all the accrediting authorities (primary
and secondary) with which the laboratory maintains accreditation. Failure to pay required fees to a
secondary accrediting authority shall result in revocation of accreditation by that secondary
accrediting authority. This information may be entered into the national database in a timely and
effective manner. The NELAP recognizes that different accrediting authorities operate the yearly
accreditation period with different start dates. The individual laboratory being accredited is responsible
for tracking each accrediting authority's period of accreditation and for paying the necessary fees (if
applicable) to those accrediting authorities to maintain accreditation.
4.3	MAINTAINING ACCREDITATION
Accreditation remains in effect until revoked by the accrediting authority, withdrawn at the written
request of the accredited laboratory, or until expiration of the accreditation period. To maintain
accreditation, the accredited laboratory shall complete or comply with the elements of Sections 4.3.1 through 4.3.3.
Failure to complete or comply with these elements shall be cause for suspending or revoking
accreditation as specified in Section 4.4 of this Chapter.
4.3.1 Quality Systems
Laboratories seeking accreditation under NELAP must assure consistency and promote the use of
quality assurance/quality control procedures. Chapter 5, Quality Systems provides the details
concerning quality assurance and quality control requirements for the evaluation of laboratories. The
quality assurance policies, which establish essential quality control procedures, are applicable to all
environmental laboratories regardless of size, volume of business and fields of accreditation. Failure
to maintain, revise, or replace any of these key components may be cause for suspending or revoking
a laboratory's accreditation status, as specified in Section 4.4 of this Chapter.

-------
NELAC
Accreditation Process
Revision 14
May 25, 2001
Page 9 of 13
4.3.2	Notification and Reporting Requirements
The accredited laboratory shall notify the accrediting authority of any changes in key accreditation
criteria within 30 calendar days of the change. This written notification includes but is not limited to
changes in the laboratory ownership, location, key personnel, and major instrumentation. All such
updates are public record, and any or all of the information contained therein may be placed in the
national database.
4.3.3	Record Keeping and Retention
All laboratory records associated with accreditation parameters shall meet the requirements of
Chapter 5, Section 5.12 and shall be maintained for a minimum of five years unless a longer
retention period is required by another regulation or authority. In the case of data used in litigation,
the laboratory is required to store such records for a longer period upon written notification from the
accrediting authority.
4.4 DENIAL, SUSPENSION, AND REVOCATION OF ACCREDITATION
4.4.1 Denial
Denial - shall mean to refuse to accredit, in total or in part, a laboratory applying for initial
accreditation or resubmitting an initial application.
a) Reasons to deny an initial application shall include:
1)	Failure to submit a completed application;
2)	Failure to pay required fees;
3)	Failure of laboratory staff to meet the personnel qualifications of education, training, and
experience as required by the NELAC standards;
4)	Failure to successfully analyze and report proficiency testing samples as required by the
NELAC standards, Chapter 2;
5)	Failure to respond to an assessment report from the on-site assessment with a corrective
action report within the required 30 calendar days after receipt of the assessment report;
6)	Failure to implement the corrective actions detailed in the corrective action report within the
time frame as approved by the primary accrediting authority;
7)	Failure to implement a quality system as defined in Chapter 5;
8)	Failure to pass required on-site assessment(s) as specified in the NELAC standards, Chapter
3.
9)	Misrepresentation of any fact pertinent to receiving or maintaining accreditation;
10)	Denial of entry during normal business hours for an on-site assessment as required by the
NELAC standards, Chapter 3.
b)	If the laboratory is not successful in correcting the deficiencies as required by the NELAC
standards, the laboratory must wait six months before reapplying for accreditation.
c)	Upon reapplication, the laboratory may again be responsible for all or part of the applicable fees
incurred as part of the initial application for accreditation.

-------
NELAC
Accreditation Process
Revision 14
May 25, 2001
Page 10 of 13
d) No laboratory's accreditation shall be denied without the right to due process.
4.4.2	Suspension
Suspension - shall mean the temporary removal of a laboratory's accreditation for a defined period
of time which shall not exceed six months. The purpose of suspension is to allow a laboratory time
to correct deficiencies or an area of non-compliance with the NELAC standards.
a)	A laboratory's accreditation shall be suspended in total or in part. The laboratory shall retain
accreditation for the fields of accreditation, methods and analytes where it continues to meet the
requirements of the NELAC standards.
b)	Reasons for suspension shall include:
1)	If the primary accrediting authority finds during the on-site assessment that the public interest,
safety or welfare imperatively requires emergency action;
2)	Failure to complete proficiency testing studies and maintain a history of at least two
successful proficiency testing studies for each affected accredited field of accreditation out
of the three most recent proficiency testing studies as defined in NELAC, Chapter 2; or,
3)	Failure to notify the primary accrediting authority of any changes in key accreditation criteria,
as set forth in Section 4.3.2 of this Chapter.
4)	Failure to maintain a Quality System as defined in Chapter 5.
5)	Failure of the laboratory to employ staff that meet the personnel qualifications for education,
training and experience as required by the NELAC standards.
c)	A suspended laboratory cannot continue to analyze samples for the affected fields of
accreditation for which it holds accreditation.
d)	The laboratory's suspended accreditation status will change to accredited when the laboratory
demonstrates to the primary accrediting authority that the laboratory complies with the NELAC
standards.
e)	A suspended laboratory does not have to reapply for accreditation if the cause(s) for
suspension are corrected within six months.
f)	If the laboratory fails to correct the causes of suspension within six months after the effective date
of the suspension, the primary accrediting authority shall revoke in total or part the laboratory's
accreditation.
g)	No laboratory's accreditation shall be suspended without the right to due process as set forth by
the primary accrediting authority.
4.4.3	Revocation
Revocation - shall mean the partial or total withdrawal of a laboratory's accreditation by the
accrediting authority. After correcting the reason/cause for revocation and satisfying any legal
remedies, the laboratory may reapply for accreditation.
a) The accrediting authority shall revoke a laboratory's accreditation, in part or in total for failure to
correct the deficiencies as set forth in Section 4.1.3 (e) of this Chapter and for failure to correct
the reasons for being suspended. The laboratory shall retain accreditation for the fields of
accreditation, methods and analytes where it continues to meet the requirements of the NELAC
standards.

-------
NELAC
Accreditation Process
Revision 14
May 25, 2001
Page 11 of 13
b)	Reasons for revocation in part or in total include a laboratory's:
1)	Failure to submit an acceptable corrective action report in response to an assessment report,
and failure to implement corrective action(s) related to any deficiencies found during a
laboratory assessment. The laboratory may submit two corrective action reports within the
time limits specified in Section 4.1.3.
2)	After being suspended due to failure of proficiency testing samples, if the laboratory's
analysis of the next proficiency testing study results in three consecutively failed proficiency
testing studies, the laboratory's accreditation shall be revoked for each affected accredited field of
accreditation as defined in NELAC Chapter 2.
c)	Reasons for total revocation include a laboratory's:
1)	Failure to respond with a corrective action report within the required 30 calendar days;
2)	Failure to participate in the proficiency testing program as required by the NELAC standards,
Chapter 2;
3)	Submittal of proficiency test sample results generated by another laboratory as its own;
4)	Misrepresentation of any material fact pertinent to receiving and maintaining accreditation;
5)	Denial of entry during normal business hours for an on-site assessment as required by the
NELAC standards, Chapter 3;
6)	Conviction of charges relating to the falsification of any report relating to a laboratory
analysis; or,
7)	Failure to remit the accreditation fees, if applicable, within the time limit as established by the
accrediting authority.
d)	No laboratory's accreditation shall be revoked without the right to due process.
4.4.4 Voluntary Withdrawal
If an environmental laboratory wishes to withdraw from NELAP, in total or in part, it must notify the
primary accrediting authority in writing no later than 30 calendar days before the end of the
accreditation year.
4.5 INTERIM ACCREDITATION
4.5.1 Interim Accreditation
If a laboratory completes all of the requirements for accreditation except that of an on-site assessment
because the accrediting authority is unable to schedule the assessment, the accrediting authority may
issue an interim accreditation. Interim accreditation shall allow a laboratory to perform analyses and
report results with the same status as an accredited laboratory until the on-site assessment
requirements have been completed. Interim accreditation status shall not exceed twelve months.
The interim accreditation status is a matter of public record and shall be entered into the national
database.

-------
NELAC
Accreditation Process
Revision 14
May 25, 2001
Page 12 of 13
4.5.2 Revocation of Interim Accreditation
Revocation of interim accreditation may be initiated for due cause as described in Section 4.4.3 by
order of the primary accrediting authority.
4.6	AWARDING OF ACCREDITATION
When a participating laboratory has met the requirements specified for receiving accreditation, the
laboratory shall receive a certificate awarded on behalf of the accrediting authority. The certificate
shall be signed by a member of the accrediting authority and shall be considered an official document.
It will be transmitted as a sealed and dated (effective date and expiration date) document containing
the NELAP insignia. The certificate shall include:
a)	name of laboratory,
b)	address of the laboratory,
c)	fields of accreditation (program, method, analyte), and,
d)	addenda or attachments (these shall be considered to be official documents).
The laboratory must have a certificate for each State or federal department/agency for which it is
accredited. The certificate shall explain that continued accredited status depends on successful
ongoing participation in the program. The certificate shall urge a customer to verify the laboratory's
current accreditation standing within a particular State. The certificate must be returned to the
accrediting authority upon loss of accreditation. However, this does not require the return of a
certificate which has simply expired (reached the expiration date). If an accredited laboratory
changes its scope of accreditation, a new certificate shall be issued which details the laboratory's
accreditation(s).
4.6.1	Use of NELAC Accreditation by Accredited Laboratories
An accredited laboratory shall not misrepresent its NELAP accredited fields of accreditation, methods,
analytes, or its NELAP accreditation status on any document. This includes laboratory reports,
catalogs, advertising, business solicitations, proposals, quotations or other materials (pursuant to
NELAC Chapter 6, Section 8).
4.6.2	Changes in Fields of Accreditation
An accrediting authority may approve a laboratory's application to add an analyte or method to its
scope of accreditation by performing a data review, without an on-site assessment. An addition to
the scope of accreditation via a data review of proficiency testing performance (if available), quality
control performance, and written standard operating procedure is at the discretion of the accrediting
authority. An addition of a new technology or test method requiring specific equipment may require
an on-site assessment.
4.7	DUE PROCESS
Regardless of the language in this chapter concerning actions such as denial, suspension and
revocation of accreditation, a laboratory is always entitled to the right of due process. Due process
rights are delineated in the appropriate state laws and regulations of the accrediting authorities. Since
these laws and regulations may vary from state to state, laboratories seeking accreditation are
encouraged to become familiar with the specific laws and regulations governing due process for each
of the accrediting authorities of interest.

-------
NELAC
Accreditation Process
Revision 14
May 25, 2001
Page 13 of 13
4.8 ENFORCEMENT
Since NELAC is a standard-setting body, it cannot enforce civil or criminal penalties; all
enforcement actions are taken independently by the accrediting authorities.
The enforcement component of the accrediting authorities should be based on explicit values, or
principles, with which all participants concur. The proposed basic principles are:
a)	The program should be equitable to all participants.
b)	The rules should be well publicized.
c)	The program needs of the participating agencies must be upheld.
d) The due process rights of participating laboratories must be protected.

-------
National Environmental Laboratory Accreditation Conference
QUALITY SYSTEMS
Approved May 25, 2001
Effective July 1, 2003 unless otherwise noted
-------
Note that the NELAC standards now have two significant dates: 1) the
date the standards were approved at the annual meeting, and 2) the
date the standards are effective and must be implemented. This is
especially important as some portions of the standards have different
effective dates. The approval date is part of the document control
header on each page. The cover of each chapter shows both the
approval date and the effective date. Changes approved for
implementation at a time other than the effective date (on the chapter
cover) are noted in the chapter, showing the approved text and its
effective date.

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page i of iii
TABLE OF CONTENTS
QUALITY SYSTEMS
5.0	QUALITY SYSTEMS 		1
5.1	SCOPE		1
5.2	REFERENCES		1
5.3	DEFINITIONS		1
5.4	ORGANIZATION AND MANAGEMENT 	 2
5.4.1	Legal Definition of Laboratory	 2
5.4.2	Organization 	 2
5.5	QUALITY SYSTEM - ESTABLISHMENT, AUDITS, ESSENTIAL QUALITY CONTROLS
AND DATA VERIFICATION	 3
5.5.1	Establishment 	 3
5.5.2	Quality Manual	 4
5.5.3	Audits, Reviews and Corrective Actions 	 5
5.5.3.1	Internal Audits		5
5.5.3.2	Managerial Review 		5
5.5.3.3	Audit Review		5
5.5.3.4	Performance Audits		6
5.5.3.5	Corrective Actions		6
5.5.4	Essential Quality Control Procedures 		6
5.6	PERSONNEL	 7
5.6.1	General Requirements for Laboratory Staff 	 7
5.6.2	Laboratory Management Responsibilities	 7
5.6.3	Records	 9
5.7	PHYSICAL FACILITIES-ACCOMMODATION AND ENVIRONMENT	 9
5.7.1	Environment 	 9
5.7.2	Work Areas	 9
5.8	EQUIPMENT AND REFERENCE MATERIALS	 10
5.9	MEASUREMENT TRACEABILITY AND CALIBRATION	 10
5.9.1	General Requirements 		10
5.9.2	Traceability of Calibration 		11
5.9.3	Reference Standards 		11
5.9.4	Calibration		11
5.9.4.1	Support Equipment	 11
5.9.4.2	Instrument Calibration	 12
5.10	TEST METHODS AND STANDARD OPERATING PROCEDURES	 14
5.10.1	Methods Documentation	 14
5.10.1.1	Standard Operating Procedures (SOPs)		14
5.10.1.2	Laboratory Method Manual(s)		15
5.10.2	Test Methods 		15
5.10.2.1 Demonstration of Capability 		15
5.10.3	Sample Aliquots 		16

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page ii of iii
5.10.4	Data Verification 	16
5.10.5	Documentation and Labeling of Standards and Reagents 	17
5.10.6	Computers and Electronic Data Related Requirements	17
5.11	SAMPLE HANDLING, SAMPLE ACCEPTANCE POLICY AND SAMPLE RECEIPT	17
5.11.1	Sample Tracking	18
5.11.2	Sample Acceptance Policy 	18
5.11.3	Sample Receipt Protocols 	18
5.11.4	Storage Conditions 	20
5.11.5	Sample Disposal	20
5.12	RECORDS 	20
5.12.1	Record Keeping System and Design	21
5.12.2	Records Management and Storage	21
5.12.3	Laboratory Sample Tracking	22
5.12.3.1	Sample Handling	22
5.12.3.2	Laboratory Support Activities 	22
5.12.3.3	Analytical Records 	23
5.12.3.4	Administrative Records 	23
5.13	LABORATORY REPORT FORMAT AND CONTENTS	23
5.14	SUBCONTRACTING ANALYTICAL SAMPLES	26
5.15	OUTSIDE SUPPORT SERVICES AND SUPPLIES	26
5.16	COMPLAINTS	26
Appendix A - REFERENCES	A-1
Appendix B - (Reserved)	A-3
Appendix C - DEMONSTRATION OF CAPABILITY	C-1
C.1 PROCEDURE FOR DEMONSTRATION OF CAPABILITY 	C-1
C.2	CERTIFICATION STATEMENT	C-2
Appendix D - ESSENTIAL QUALITY CONTROL REQUIREMENTS	 D-1
D.1	CHEMICAL TESTING		D-1
D.1.1 Positive and Negative Controls 		D-1
D.1.2 Detection Limits 		D-5
D.1.3 Data Reduction		D-6
D.1.4 Quality of Standards and Reagents		D-6
D.1.5 Selectivity		D-6
D.1.6 Constant and Consistent Test Conditions 		D-7
D.2 TOXICITY TESTING	 D-7
D.2.1 Positive and Negative Controls 	 D-7
D.2.2 Variability and/or Reproducibility	 D-9
D.2.3 Accuracy 	 D-9
D.2.4 Test Sensitivity	 D-10
D.2.5 Selection of Appropriate Statistical Analysis Methods	 D-10

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page iii of iii
D.2.6 Selection and Use of Reagents and Standards	 D-10
D.2.7 Selectivity	 D-10
D.2.8 Constant and Consistent Test Conditions 	 D-10
D.3 MICROBIOLOGY TESTING 		D-13
D.3.1 Sterility Checks and Blanks, Positive and Negative Controls		D-13
D.3.2 Test Variability/Reproducibility 		D-14
D.3.3 Method Evaluation		D-14
D.3.4 Test Performance 		D-15
D.3.5 Data Reduction		D-15
D.3.6 Quality of Standards, Reagents and Media 		D-15
D.3.7 Selectivity		D-15
D.3.8 Constant and Consistent Test Conditions 		D-16
D.4 RADIOCHEMICAL TESTING		D-18
D.4.1 Negative and Positive Controls 		D-18
D.4.2 Analytical Variability/Reproducibility 		D-20
D.4.3 Method Evaluation		D-20
D.4.4 Radiation Measurement System Calibration		D-20
D.4.5 Detection Limits 		D-22
D.4.6 Data Reduction		D-22
D.4.7 Quality of Standards and Reagents		D-22
D.4.8 Constant and Consistent Test Conditions 		D-22
D.5 AIR TESTING 		D-23
D.5.1 Negative and Positive Controls 		D-23
D.5.2 Analytical Variability/Reproducibility 		D-23
D.5.3 Method Evaluation		D-23
D.5.4 Detection Limits 		D-24
D.5.5 Data Reduction		D-24
D.5.6 Quality of Standards and Reagents		D-24
D.5.7 Selectivity		D-25
D.5.8 Constant and Consistent Test Conditions 		D-25
Appendix E - ADDITIONAL SOURCES OF INFORMATION	 E-1

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 1 of 26
5.0	QUALITY SYSTEMS
INTRODUCTION
Quality Systems include all quality assurance (QA) policies and quality control (QC) procedures,
which shall be delineated in a Quality Manual and followed to ensure and document the quality of the
analytical data. Laboratories seeking accreditation under NELAP must assure implementation of all
QA policies and the essential applicable QC procedures specified in this Chapter. The QA policies,
which establish essential QC procedures, are applicable to environmental laboratories regardless of
size and complexity.
The intent of this Chapter is to provide sufficient detail concerning quality system requirements so
that all accrediting authorities evaluate laboratories consistently and uniformly.
NELAC is committed to the use of Performance-based Measurement Systems (PBMS) in
environmental testing and provides the foundation for PBMS implementation in these standards.
While this standard may not currently satisfy all the anticipated needs of PBMS, NELAC will address
future needs within the context of State statutory and regulatory requirements and the finalized EPA
implementation plans for PBMS.
Chapter 5 is organized according to the structure of ISO/IEC Guide 25, 1990. Where deemed
necessary, specific areas within this Chapter may contain more information than specified by ISO/IEC
Guide 25.
All items identified in this Chapter shall be available for on-site inspection or data audit.
5.1	SCOPE
a)	This Standard sets out the general requirements that a laboratory must successfully
demonstrate it meets in order to be recognized as competent to carry out specific environmental tests.
b)	This Standard includes additional requirements and information for assessing competence or for
determining compliance by the organization or accrediting authority granting the recognition (or
approval).
If more stringent standards or requirements are included in a mandated test method or by
regulation, the laboratory shall demonstrate that such requirements are met. If it is not clear
which requirements are more stringent, the standard from the method or regulation is to be
followed. (See the supplemental accreditation requirements in Section 1.8.2.)
c)	This Standard is for use by environmental testing laboratories in the development and
implementation of their quality systems. It shall be used by accrediting authorities in assessing
the competence of environmental laboratories.
5.2	REFERENCES
See Appendix A.
5.3	DEFINITIONS
The relevant definitions from ISO/IEC Guide 2, ISO 8402, ANSI/ASQC E-4, 1994, the EPA "Glossary
of Quality Assurance Terms and Acronyms", and the International vocabulary of basic and general
terms in metrology (VIM) are applicable, the most relevant being quoted in Appendix A, Glossary, of
Chapter 1 together with further definitions applicable for the purposes of this Standard.

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 2 of 26
See Appendix A, Glossary, of Chapter 1.
5.4 ORGANIZATION AND MANAGEMENT
5.4.1	Legal Definition of Laboratory
The laboratory shall be legally identifiable. It shall be organized and shall operate in such a way that
its permanent, temporary and mobile facilities meet the requirements of this Standard.
5.4.2	Organization
The laboratory shall:
a)	have managerial staff with the authority and resources needed to discharge their duties;
b)	have processes to ensure that its personnel are free from any commercial, financial and other
undue pressures which adversely affect the quality of their work;
c)	be organized in such a way that confidence in its independence of judgment and integrity is
maintained at all times;
d)	specify and document the responsibility, authority, and interrelationship of all personnel who
manage, perform or verify work affecting the quality of calibrations and tests;
Such documentation shall include:
1)	a clear description of the lines of responsibility in the laboratory, proportioned such that
adequate supervision is ensured, and
2)	job descriptions for all positions.
e)	provide supervision by persons familiar with the calibration or test methods and procedures, the
objective of the calibration or test and the assessment of the results;
The ratio of supervisory to non-supervisory personnel shall be such as to ensure adequate
supervision to ensure adherence to laboratory procedures and accepted techniques.
f)	have a technical director(s) (however named) who has overall responsibility for the technical
operation of the environmental testing laboratory;
The technical director(s) shall certify that personnel with appropriate educational and/or technical
background perform all tests for which the laboratory is accredited. Such certification shall be
documented.
The technical director(s) shall meet the requirements specified in the Accreditation Process (see
4.1.1.1).
g) have a quality assurance officer (however named) who has responsibility for the quality system
and its implementation;
The quality assurance officer shall have direct access to the highest level of management at
which decisions are taken on laboratory policy or resources, and to the technical director. Where

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 3 of 26
staffing is limited, the quality assurance officer may also be the technical director or deputy
technical director;
The quality assurance officer (and/or his/her designees) shall:
1)	serve as the focal point for QA/QC and be responsible for the oversight and/or review of
quality control data;
2)	have functions independent from laboratory operations for which they have quality assurance
oversight;
3)	be able to evaluate data objectively and perform assessments without outside (e.g.,
managerial) influence;
4)	have documented training and/or experience in QA/QC procedures and be knowledgeable
in the quality system as defined under NELAC;
5)	have a general knowledge of the analytical test methods for which data review is performed;
6)	arrange for or conduct internal audits as per 5.5.3 annually; and,
7)	notify laboratory management of deficiencies in the quality system and monitor corrective
action.
h)	nominate deputies in case of absence of the technical director(s) and/or quality assurance officer;
i)	have documented policy and procedures to ensure the protection of clients' confidential
information and proprietary rights (this may not apply to in-house laboratories);
j) for purposes of qualifying for and maintaining accreditation, each laboratory shall participate in
a proficiency test program as outlined in Chapter 2.
5.5 QUALITY SYSTEM - ESTABLISHMENT, AUDITS, ESSENTIAL QUALITY CONTROLS AND
DATA VERIFICATION
5.5.1 Establishment
The laboratory shall establish and maintain a quality system based on the required elements
contained in this chapter and appropriate to the type, range and volume of environmental testing
activities it undertakes.
a)	The elements of this quality system shall be documented in the organization's quality manual.
b)	The quality documentation shall be available for use by the laboratory personnel.
c)	The laboratory shall define and document its policies and objectives for, and its commitment to
accepted laboratory practices and quality of testing services.
d)	The laboratory management shall ensure that these policies and objectives are documented in
a quality manual and communicated to, understood and implemented by, all laboratory personnel
concerned.
e) The quality manual shall be maintained current under the responsibility of the quality assurance
officer.

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 4 of 26
5.5.2 Quality Manual
The quality manual, and related quality documentation, shall state the laboratory's policies and
operational procedures established in order to meet the requirements of this Standard.
The Quality Manual shall list on the title page: a document title; the laboratory's full name and
address; the name, address (if different from above), and telephone number of individual(s)
responsible for the laboratory; the name of the quality assurance officer (however named); the
identification of all major organizational units which are to be covered by this quality manual and the
effective date of the version.
The quality manual and related quality documentation shall also contain:
a)	a quality policy statement, including objectives and commitments, by top management;
b)	the organization and management structure of the laboratory, its place in any parent organization
and relevant organizational charts;
c)	the relationship between management, technical operations, support services and the quality
system;
d)	procedures to ensure that all records required under this Chapter are retained, as well as
procedures for control and maintenance of documentation through a document control system
which ensures that all standard operating procedures, manuals, or documents clearly indicate the
time period during which the procedure or document was in force;
e)	job descriptions of key staff and reference to the job descriptions of other staff;
f)	identification of the laboratory's approved signatories; at a minimum, the title page of the Quality
Manual must have the signed and dated concurrence (with appropriate titles) of all responsible
parties including the QA officer(s), technical director(s), and the agent who is in charge of all
laboratory activities such as the laboratory director or laboratory manager;
g)	the laboratory's procedures for achieving traceability of measurements;
h)	a list of all test methods under which the laboratory performs its accredited testing;
i)	mechanisms for ensuring that the laboratory reviews all new work to ensure that it has the
appropriate facilities and resources before commencing such work;
j) reference to the calibration and/or verification test procedures used;
k) procedures for handling submitted samples;
l)	reference to the major equipment and reference measurement standards used as well as the
facilities and services used by the laboratory in conducting tests;
m) reference to procedures for calibration, verification and maintenance of equipment;
n) reference to verification practices which may include interlaboratory comparisons, proficiency
testing programs, use of reference materials and internal quality control schemes;
o)	procedures to be followed for feedback and corrective action whenever testing discrepancies are
detected, or departures from documented policies and procedures occur;

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 5 of 26
p) the laboratory management arrangements for exceptionally permitting departures from
documented policies and procedures or from standard specifications;
q) procedures for dealing with complaints;
r) procedures for protecting confidentiality (including national security concerns), and proprietary
rights;
s) procedures for audits and data review;
t) processes/procedures for establishing that personnel are adequately experienced in the duties
they are expected to carry out and are receiving any needed training;
u) ethics policy statement developed by the laboratory and processes/procedures for educating and
training personnel in their ethical and legal responsibilities including the potential punishments
and penalties for improper, unethical or illegal actions;
v) reference to procedures for reporting analytical results; and,
w) a table of contents, and applicable lists of references and glossaries, and appendices.
5.5.3 Audits, Reviews and Corrective Actions
5.5.3.1	Internal Audits
The laboratory shall arrange for annual internal audits to verify that its operations continue to comply
with the requirements of the laboratory's quality system. It is the responsibility of the quality
assurance officer to plan and organize audits as required by a predetermined schedule and requested
by management. Such audits shall be carried out by trained and qualified personnel who are,
wherever resources permit, independent of the activity to be audited. Personnel shall not audit their
own activities except when it can be demonstrated that an effective audit will be carried out. Where
the audit findings cast doubt on the correctness or validity of the laboratory's calibrations or test
results, the laboratory shall take immediate corrective action and shall immediately notify, in writing,
any client whose work was involved.
5.5.3.2	Managerial Review
The laboratory management shall conduct a review, at least annually, of its quality system and its
testing and calibration activities to ensure its continuing suitability and effectiveness and to introduce
any necessary changes or improvements in the quality system and laboratory operations. The review
shall take account of reports from managerial and supervisory personnel, the outcome of recent
internal audits, assessments by external bodies, the results of interlaboratory comparisons or
proficiency tests, any changes in the volume and type of work undertaken, feedback from clients,
corrective actions and other relevant factors. The laboratory shall have a procedure for review by
management and maintain records of review findings and actions.
5.5.3.3	Audit Review
All audit and review findings and any corrective actions that arise from them shall be documented.
The laboratory management shall ensure that these actions are discharged within the agreed time
frame as indicated in the quality manual and/or SOPs.

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 6 of 26
5.5.3.4	Performance Audits
In addition to periodic audits, the laboratory shall ensure the quality of results provided to clients by
implementing checks to monitor the quality of the laboratory's analytical activities. Examples of such
checks are:
a)	internal quality control procedures using statistical techniques (see 5.5.4 below);
b)	participation in proficiency testing or other interlaboratory comparisons (See Chapter 2);
c)	use of certified reference materials and/or in-house quality control using secondary reference
materials as specified in Section 5.5.4;
d)	replicate testing using the same or different test methods;
e)	re-testing of retained samples;
f)	correlation of results for different but related analyses of a sample (for example, total phosphorus
should be greater than or equal to orthophosphate).
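For illustration only, the following minimal Python sketch shows the kind of cross-check described in
item f) above. The analyte pair follows the example given in the text; the results and the tolerance are
hypothetical.

def related_results_consistent(total_phosphorus: float, orthophosphate: float,
                               tolerance: float = 0.0) -> bool:
    """Return True when orthophosphate does not exceed total phosphorus.

    Orthophosphate is a component of total phosphorus, so an orthophosphate
    result greater than total phosphorus (beyond an allowed measurement
    tolerance) signals a potential analytical or reporting problem.
    """
    return orthophosphate <= total_phosphorus + tolerance

# Hypothetical results in mg/L: total P = 0.42, ortho-P = 0.55 -> the check fails
if not related_results_consistent(0.42, 0.55):
    print("Correlation check failed: orthophosphate exceeds total phosphorus")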
5.5.3.5	Corrective Actions
a)	In addition to providing acceptance criteria and specific protocols for corrective actions in the
Method Standard Operating Procedures (see 5.10.1.1), the laboratory shall implement general
procedures to be followed to determine when departures from documented policies, procedures
and quality control have occurred. These procedures shall include but are not limited to the
following:
1)	identify the individual(s) responsible for assessing each QC data type;
2)	identify the individual(s) responsible for initiating and/or recommending corrective actions;
3)	define how the analyst shall treat a data set if the associated QC measurements are
unacceptable;
4)	specify how out-of-control situations and subsequent corrective actions are to be
documented; and,
5)	specify procedures for management (including the QA officer) to review corrective action
reports.
b)	To the extent possible, samples shall be reported only if all quality control measures are
acceptable. If a quality control measure is found to be out of control and the data are to be
reported, all samples associated with the failed quality control measure shall be reported with the
appropriate data qualifier(s).
5.5.4 Essential Quality Control Procedures
These general quality control principles shall apply, where applicable, to all testing laboratories. The
manner in which they are implemented is dependent on the types of tests performed by the laboratory
(i.e., chemical, whole effluent toxicity, microbiological, radiological, air) and is further described in
Appendix D. The standards for any given test type shall assure that the applicable principles are
addressed:

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 7 of 26
a)	All laboratories shall have detailed written protocols in place to monitor the following quality
controls:
1)	Positive and negative controls to monitor tests such as blanks, spikes, reference toxicants;
2)	Tests to define the variability and/or repeatability of the laboratory results such as replicates;
3)	Measures to assure the accuracy of the test method including calibration and/or continuing
calibrations, use of certified reference materials, proficiency test samples, or other measures;
4)	Measures to evaluate test method capability, such as detection limits and quantitation limits
or range of applicability such as linearity;
5)	Selection of appropriate formulae to reduce raw data to final results such as regression
analysis, comparison to internal/external standard calculations, and statistical analyses;
6)	Selection and use of reagents and standards of appropriate quality;
7)	Measures to assure the selectivity of the test for its intended purpose; and
8)	Measures to assure constant and consistent test conditions (both instrumental and
environmental) where required by the test method such as temperature, humidity, light, or
specific instrument conditions.
b)	All quality control measures shall be assessed and evaluated on an on-going basis, and quality
control acceptance criteria shall be used to determine the usability of the data. (See Appendix D.)
c)	The laboratory shall have procedures for the development of acceptance/rejection criteria where
no method or regulatory criteria exist. (See 5.11.2, Sample Acceptance Policy.)
d)	The quality control protocols specified by the laboratory's method manual (5.10.1.2) shall be
followed. The laboratory shall ensure that the essential standards outlined in Appendix D, or
mandated methods or regulations (whichever are more stringent) are incorporated into their
method manuals. When it is not apparent which is more stringent, the QC in the mandated method
or regulation is to be followed.
The essential quality control measures for testing are found in Appendix D of this Chapter.
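For illustration only, the following minimal Python sketch shows how two of the controls listed in
5.5.4(a) might be evaluated: a positive control expressed as percent recovery of a matrix spike, and
variability expressed as the relative percent difference between duplicate analyses. The
concentrations and acceptance limits shown are hypothetical; actual limits come from the mandated
method, the laboratory's method manual, or Appendix D.

def percent_recovery(spiked_result: float, unspiked_result: float, spike_added: float) -> float:
    """Percent recovery of a matrix spike (a positive control)."""
    return 100.0 * (spiked_result - unspiked_result) / spike_added

def relative_percent_difference(result_1: float, result_2: float) -> float:
    """Relative percent difference between duplicate analyses (a variability check)."""
    return 100.0 * abs(result_1 - result_2) / ((result_1 + result_2) / 2.0)

# Hypothetical data: sample 1.8 mg/L, duplicate 1.7 mg/L, 5.0 mg/L spike recovered as 6.5 mg/L
recovery = percent_recovery(6.5, 1.8, 5.0)      # 94 percent
rpd = relative_percent_difference(1.8, 1.7)     # about 5.7 percent

# Hypothetical acceptance limits of 80-120 percent recovery and 20 percent RPD
if not 80.0 <= recovery <= 120.0:
    print("Matrix spike recovery outside acceptance limits; evaluate corrective action")
if rpd > 20.0:
    print("Duplicate RPD outside acceptance limits; evaluate corrective action")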
5.6 PERSONNEL
5.6.1	General Requirements for Laboratory Staff
The laboratory shall have sufficient personnel with the necessary education, training, technical
knowledge and experience for their assigned functions.
All personnel shall be responsible for complying with all quality assurance/quality control requirements
that pertain to their organizational/technical function. Each technical staff member must have a
combination of experience and education to adequately demonstrate a specific knowledge of their
particular function and a general knowledge of laboratory operations, test methods, quality
assurance/quality control procedures and records management.
5.6.2	Laboratory Management Responsibilities
In addition to 5.4.2.d, the laboratory management shall be responsible for:

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 8 of 26
a)	Defining the minimal level of qualification, experience and skills necessary for all positions in the
laboratory. In addition to education and/or experience, basic laboratory skills such as using a
balance, colony counting, aseptic or quantitative techniques shall be considered;
b)	Ensuring that all technical laboratory staff have demonstrated capability in the activities for which
they are responsible. Such demonstration shall be documented. (See Appendix C);
Note: In laboratories with specialized "work cells" (a well defined group of analysts that together
perform the method analysis), the group as a unit must meet the above criteria and this demonstration
must be fully documented.
c)	Ensuring that the training of each member of the technical staff is kept up-to-date (on-going) by
the following:
1)	Evidence must be on file that demonstrates that each employee has read, understood, and
is using the latest version of the laboratory's in-house quality documentation, which relates
to his/her job responsibilities.
2)	Training courses or workshops on specific equipment, analytical techniques or laboratory
procedures shall all be documented.
3)	Training courses in ethical and legal responsibilities including the potential punishments and
penalties for improper, unethical or illegal actions. Evidence must also be on file which
demonstrates that each employee has read, acknowledged and understood their personal
ethical and legal responsibilities including the potential punishments and penalties for
improper, unethical or illegal actions.
4)	Analyst training shall be considered up to date if an employee training file contains a
certification that technical personnel have read, understood and agreed to perform the most
recent version of the test method (the approved method or standard operating procedure as
defined by the laboratory document control system, 5.5.2.d) and documentation of continued
proficiency by at least one of the following once per year:
i.	Acceptable performance of a blind sample (single blind to the analyst);
ii.	Another demonstration of capability;
iii.	Successful analysis of a blind performance sample on a similar test method using the
same technology (e.g., GC/MS volatiles by purge and trap for Methods 524.2, 624 or
5035/8260) would only require documentation for one of the test methods;
iv.	At least four consecutive laboratory control samples with acceptable levels of precision
and accuracy;
v.	If i-iv cannot be performed, analysis of authentic samples with results statistically
indistinguishable from those obtained by another trained analyst.
d)	Documenting all analytical and operational activities of the laboratory;
e)	Supervising all personnel employed by the laboratory;
f)	Ensuring that all sample acceptance criteria (Section 5.11) are verified and that samples are
logged into the sample tracking system and properly labeled and stored;

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 9 of 26
g)	Documenting the quality of all data reported by the laboratory; and
h)	Developing a proactive program for prevention and detection of improper, unethical or illegal
actions. Components of this program could include: internal proficiency testing (single and
double blind); post-analysis, electronic data and magnetic tape audits; effective reward program
to improve employee vigilance and co-monitoring; and separate SOPs identifying appropriate and
inappropriate laboratory and instrument manipulation practices.
5.6.3 Records
Records on the relevant qualifications, training, skills and experience of the technical personnel shall
be maintained by the laboratory [see 5.6.2.c], including records on demonstrated proficiency for each
laboratory test method, such as the criteria outlined in 5.10.2.1 for chemical testing.
5.7 PHYSICAL FACILITIES - ACCOMMODATION AND ENVIRONMENT
5.7.1	Environment
a)	Laboratory accommodation, test areas, energy sources, lighting, heating and ventilation shall be
such as to facilitate proper performance of tests.
b)	The environment in which these activities are undertaken shall not invalidate the results or
adversely affect the required accuracy of measurement. Particular care shall be taken when such
activities are undertaken at sites other than the permanent laboratory premises.
c)	The laboratory shall provide for the effective monitoring, control and recording of environmental
conditions as appropriate. Such environmental conditions may include biological sterility, dust,
electromagnetic interference, humidity, mains voltage, temperature, and sound and vibration
levels.
d)	In instances where monitoring or control of any of the above mentioned items are specified in a
test method or by regulation, the laboratory shall meet and document adherence to the laboratory
facility requirements.
NOTE: It is the laboratory's responsibility to comply with the relevant health and safety requirements.
This aspect, however, is outside the scope of this Standard.
5.7.2	Work Areas
a)	There shall be effective separation between neighboring areas when the activities therein are
incompatible, such as between culture handling or incubation areas and volatile organic
chemicals handling areas.
b)	Access to and use of all areas affecting the quality of these activities shall be defined and
controlled.
c)	Adequate measures shall be taken to ensure good housekeeping in the laboratory and to ensure
that any contamination does not adversely affect data quality.
d)	Work spaces must be available to ensure an unencumbered work area. Work areas include:
1)	access and entryways to the laboratory;
2)	sample receipt area(s);

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 10 of 26
3)	sample storage area(s);
4)	chemical and waste storage area(s); and,
5)	data handling and storage area(s).
5.8	EQUIPMENT AND REFERENCE MATERIALS
a)	The laboratory shall be furnished with all items of equipment (including reference materials)
required for the correct performance of tests for which accreditation is sought. In those cases
where the laboratory needs to use equipment outside its permanent control it shall ensure that
the relevant requirements of this Standard are met.
b)	All equipment shall be properly maintained, inspected and cleaned. Maintenance procedures
shall be documented.
c)	Any item of equipment which has been subjected to overloading or mishandling, or which
gives suspect results, or has been shown by verification or otherwise to be defective, shall be
taken out of service, clearly identified and wherever possible stored at a specified place until it
has been repaired and shown by calibration, verification or test to perform satisfactorily. The
laboratory shall examine the effect of this defect on previous calibrations or tests.
d)	Each item of equipment including reference materials shall be labeled, marked or otherwise
identified to indicate its calibration status.
e)	Records shall be maintained of each major item of equipment and all reference materials
significant to the tests performed. These records shall include documentation on all routine and
non-routine maintenance activities and reference material verifications.
The records shall include:
1)	the name of the item of equipment;
2)	the manufacturer's name, type identification, and serial number or other unique identification;
3)	date received and date placed in service (if available);
4)	current location, where appropriate;
5)	if available, condition when received (e.g. new, used, reconditioned);
6)	copy of the manufacturer's instructions, where available;
7)	dates and results of calibrations and/or verifications and date of the next calibration and/or
verification;
8)	details of maintenance carried out to date and planned for the future; and,
9)	history of any damage, malfunction, modification or repair.
5.9	MEASUREMENT TRACEABILITY AND CALIBRATION
5.9.1 General Requirements
All measuring operations and testing equipment having an effect on the accuracy or validity of tests
shall be calibrated and/or verified before being put into service and on a continuing basis. The
laboratory shall have an established program for the calibration and verification of its measuring and
test equipment. This includes balances, thermometers and control standards.

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 11 of 26
5.9.2	Traceability of Calibration
a)	The overall program of calibration and/or verification and validation of equipment shall be
designed and operated so as to ensure that measurements made by the laboratory are traceable
to national standards of measurement.
b)	Calibration certificates shall indicate the traceability to national standards of measurement and
shall provide the measurement results and associated uncertainty of measurement and/or a
statement of compliance with an identified metrological specification. The laboratory shall
maintain records of all such certifications.
c)	Where traceability to national standards of measurement is not applicable, the laboratory shall
provide satisfactory evidence of correlation of results, for example by participation in a suitable
program of interlaboratory comparisons, proficiency testing, or independent analysis.
5.9.3	Reference Standards
a)	Reference standards of measurement held by the laboratory (such as Class S or equivalent
weights or traceable thermometers) shall be used for calibration only and for no other purpose,
unless it can be demonstrated that their performance as reference standards has not been
invalidated. Reference standards of measurement shall be calibrated by a body that can provide
traceability. Where possible, this traceability shall be to a national standard of measurement.
b)	There shall be a program of calibration and verification for reference standards.
c)	Where relevant, reference standards and measuring and testing equipment shall be subjected
to in-service checks between calibrations and verifications. Reference materials shall be
traceable. Where possible, traceability shall be to national or international standards of
measurement, or to national or international standard reference materials.
5.9.4	Calibration
Calibration requirements are divided into two parts: (1) requirements for analytical support equipment,
and (2) requirements for instrument calibration. In addition, the requirements for instrument calibration
are divided into initial instrument calibration and continuing instrument calibration verification.
5.9.4.1 Support Equipment
These standards apply to all devices that may not be the actual test instrument, but are necessary
to support laboratory operations. These include but are not limited to: balances, ovens, refrigerators,
freezers, incubators, water baths, temperature measuring devices (including thermometers and
thermistors), thermal/pressure sample preparation devices and volumetric dispensing devices (such
as Eppendorf® or automatic dilutor/dispensing devices) if quantitative results are dependent on their
accuracy, as in standard preparation and dispensing or dilution into a specified volume.
a)	All support equipment shall be maintained in proper working order. Records of all repair and
maintenance activities, including service calls, shall be kept.
b)	All support equipment shall be calibrated or verified at least annually, using NIST traceable
references when available, over the entire range of use. The results of such calibration shall be
within the specifications required of the application for which this equipment is used or:
1) The equipment shall be removed from service until repaired; or

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 12 of 26
2) The laboratory shall maintain records of established correction factors to correct all
measurements.
c)	Raw data records shall be retained to document equipment performance.
d)	Prior to use on each working day, balances, ovens, refrigerators, freezers, and water baths shall
be checked in the expected use range, with NIST traceable references where available. The
acceptability for use or continued use shall be according to the needs of the analysis or
application for which the equipment is being used.
e)	Mechanical volumetric dispensing devices including burettes (except Class A glassware) shall
be checked for accuracy on at least a quarterly use basis. Glass microliter syringes are to be
considered in the same manner as Class A glassware, but must come with a certificate attesting
to established accuracy or the accuracy must be initially demonstrated and documented by the
laboratory.
f)	For chemical tests, the temperature, cycle time, and pressure of each run of autoclaves must be
documented by the use of appropriate chemical indicators or temperature recorders and pressure
gauges.
g) For biological tests that employ autoclave sterilization see section D.3.8.
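For illustration only, the following minimal Python sketch shows one hypothetical way a quarterly
gravimetric accuracy check of a mechanical volumetric dispensing device (item e above) might be
recorded and evaluated. The target volume, replicate weights, water density, and tolerance are
illustrative assumptions, not requirements of this standard.

from typing import List

WATER_DENSITY_G_PER_ML = 0.9982  # approximate density of water near 20 degrees C

def mean_delivered_volume_ml(weights_g: List[float]) -> float:
    """Convert replicate weighings of dispensed water (grams) to a mean delivered volume (mL)."""
    return sum(w / WATER_DENSITY_G_PER_ML for w in weights_g) / len(weights_g)

# Hypothetical check of a device set to deliver 1.000 mL, using three weighings
nominal_ml = 1.000
weights_g = [0.996, 0.999, 0.994]
mean_ml = mean_delivered_volume_ml(weights_g)
percent_error = 100.0 * (mean_ml - nominal_ml) / nominal_ml

# Hypothetical tolerance; the actual limit depends on the application for which the device is used
if abs(percent_error) > 2.0:
    print("Device outside tolerance: remove from service or apply a documented correction factor")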
5.9.4.2 Instrument Calibration:
This standard specifies the essential elements that shall define the procedures and documentation
for initial instrument calibration and continuing instrument calibration verification to ensure that the
data are of known quality and appropriate for a given regulation or decision. This standard
does not specify detailed procedural steps ("how to") for calibration, but establishes the essential
elements for selection of the appropriate technique(s). This approach allows flexibility and permits
the employment of a wide variety of analytical procedures and statistical approaches currently
applicable for calibration. If more stringent standards or requirements are included in a mandated test
method or by regulation, the laboratory shall demonstrate that such requirements are met. If it is not
apparent which standard is more stringent, then the requirements of the regulation or mandated test
method are to be followed.
Note: In the following sections, initial instrument calibration is directly used for quantitation
and continuing instrument calibration verification is used to confirm the continued validity
of the initial calibration.
5.9.4.2.1 Initial Instrument Calibration:
The following items are essential elements of initial instrument calibration:
a)	The details of the initial instrument calibration procedures including calculations, integrations,
acceptance criteria and associated statistics must be included or referenced in the test method
SOP. When initial instrument calibration procedures are referenced in the test method, then the
referenced material must be retained by the laboratory and be available for review.
b)	Sufficient raw data records must be retained to permit reconstruction of the initial instrument
calibration, e.g., calibration date, test method, instrument, analysis date, each analyte name,
analyst's initials or signature; concentration and response, calibration curve or response factor;
or unique equation or coefficient used to reduce instrument responses to concentration.

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 13 of 26
c)	Sample results must be quantitated from the initial instrument calibration and may not be
quantitated from any continuing instrument calibration verification.
d)	All initial instrument calibrations must be verified with a standard obtained from a second
manufacturer or lot if the lot can be demonstrated from the manufacturer as prepared
independently from other lots. Traceability shall be to a national standard, when available.
e)	Criteria for the acceptance of an initial instrument calibration must be established, e.g., correlation
coefficient or relative percent difference. The criteria used must be appropriate to the calibration
technique employed.
f)	Results of samples not bracketed by initial instrument calibration standards (within calibration
range) must be reported as having less certainty, e.g., defined qualifiers or flags or explained in
the case narrative. The lowest calibration standard must be above the detection limit.
g)	If the initial instrument calibration results are outside established acceptance criteria, corrective
actions must be performed. Data associated with an unacceptable initial instrument calibration
shall not be reported.
h)	Calibration standards must include concentrations at or below the regulatory limit/decision level,
if these limits/levels are known by the laboratory, unless these concentrations are below the
laboratory's demonstrated detection limits (see D.1.4, Detection Limits).
i)	If a reference or mandated method does not specify the number of calibration standards, the
minimum number is two, not including blanks or a zero standard. The laboratory must have a
standard operating procedure for determining the number of points for establishing the initial
instrument calibration.
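The calibration elements above can be illustrated with a minimal sketch (not part of the standard): a least-squares line is fit to the calibration standards, the correlation coefficient is compared against a laboratory-chosen acceptance criterion, and sample responses are converted to concentrations and flagged when they are not bracketed by the calibration range. The 0.995 criterion and the function names are illustrative assumptions; the mandated test method governs the actual acceptance criteria.

    import statistics

    def fit_initial_calibration(concs, responses, min_r=0.995):
        # Least-squares fit of response vs. concentration; returns slope,
        # intercept, correlation coefficient and an acceptability flag.
        if len(concs) < 2:  # item i: at least two calibration standards
            raise ValueError("initial calibration requires at least two standards")
        mean_x, mean_y = statistics.mean(concs), statistics.mean(responses)
        sxx = sum((x - mean_x) ** 2 for x in concs)
        sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(concs, responses))
        syy = sum((y - mean_y) ** 2 for y in responses)
        slope = sxy / sxx
        intercept = mean_y - slope * mean_x
        r = sxy / (sxx * syy) ** 0.5
        return slope, intercept, r, abs(r) >= min_r

    def quantitate(response, slope, intercept, low_std, high_std):
        # Convert an instrument response to concentration; flag results not
        # bracketed by the calibration standards (item f).
        conc = (response - intercept) / slope
        flag = None if low_std <= conc <= high_std else "outside calibration range"
        return conc, flag

Consistent with item c, sample results in this sketch are always quantitated from the initial calibration, never from a continuing calibration verification.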
5.9.4.2.2 Continuing Instrument Calibration Verification
When an initial instrument calibration is not performed on the day of analysis, the validity of the initial
calibration shall be verified prior to sample analyses by a continuing instrument calibration verification
with each analytical batch. The following items are essential elements of continuing instrument
calibration verification:
a)	The details of the continuing instrument calibration procedure, calculations and associated
statistics must be included or referenced in the test method SOP.
b)	A continuing instrument calibration verification must be repeated at the beginning and end of each
analytical batch. The concentrations of the calibration verification shall be varied within the
established calibration range. If an internal standard is used, only one continuing instrument
calibration verification must be analyzed per analytical batch.
c)	Sufficient raw data records must be retained to permit reconstruction of the continuing instrument
calibration verification, e.g., test method, instrument, analysis date, each analyte name,
concentration and response, calibration curve or response factor, or unique equations or
coefficients used to convert instrument responses into concentrations. Continuing calibration
verification records must explicitly connect the continuing verification data to the initial instrument
calibration.
d)	Criteria for the acceptance of a continuing instrument calibration verification must be established,
e.g., relative percent difference.

e) If the continuing instrument calibration verification results obtained are outside established
acceptance criteria, corrective actions must be performed. If routine corrective action procedures
fail to produce a second consecutive (immediate) calibration verification within acceptance
criteria, then either the laboratory has to demonstrate performance after corrective action with two
consecutive successful calibration verifications, or a new initial instrument calibration must be
performed. If the laboratory has not demonstrated acceptable performance, sample analyses
shall not occur until a new initial calibration curve is established and verified. However, sample
data associated with an unacceptable calibration verification may be reported as qualified data
under the following special conditions:
i.	When the acceptance criteria for the continuing calibration verification are exceeded high,
i.e., high bias, and there are associated samples that are non-detects, then those non-detects
may be reported. Otherwise the samples affected by the unacceptable calibration verification
shall be reanalyzed after a new calibration curve has been established, evaluated and
accepted.
ii.	When the acceptance criteria for the continuing calibration verification are exceeded low, i.e.,
low bias, those sample results may be reported if they exceed a maximum regulatory
limit/decision level. Otherwise the samples affected by the unacceptable verification shall be
reanalyzed after a new calibration curve has been established, evaluated and accepted.
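A minimal sketch of the verification logic above, assuming a ±20% percent-difference acceptance window (the actual criteria come from the mandated test method or the laboratory SOP); the reporting helper applies the two special conditions in item e.

    def evaluate_ccv(measured, true_value, limit_pct=20.0):
        # Percent difference of the verification standard from its true value.
        pct_diff = 100.0 * (measured - true_value) / true_value
        if abs(pct_diff) <= limit_pct:
            return pct_diff, "acceptable"
        return pct_diff, "high bias" if pct_diff > 0 else "low bias"

    def may_report(sample_result, detection_limit, regulatory_limit, ccv_status):
        # Special reporting conditions of item e: non-detects under a high bias,
        # results above the regulatory limit/decision level under a low bias.
        if ccv_status == "acceptable":
            return True
        if ccv_status == "high bias":
            return sample_result < detection_limit
        if ccv_status == "low bias":
            return sample_result > regulatory_limit
        return False

Any sample not covered by these conditions is reanalyzed after a new calibration curve has been established, evaluated and accepted.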
5.10 TEST METHODS AND STANDARD OPERATING PROCEDURES
5.10.1 Methods Documentation
a)	The laboratory shall have documented instructions on the use and operation of all relevant
equipment, on the handling and preparation of samples and for calibration and/or testing, where
the absence of such instructions could jeopardize the calibrations or tests.
b)	All instructions, standards, manuals and reference data relevant to the work of the laboratory shall
be maintained up-to-date and be readily available to the staff.
5.10.1.1 Standard Operating Procedures (SOPs)
Laboratories shall maintain standard operating procedures that accurately reflect all phases of current
laboratory activities such as assessing data integrity, corrective actions, handling customer
complaints, and all test methods.
a)	These documents, for example, may be equipment manuals provided by the manufacturer, or
internally written documents.
b)	The test methods may be copies of published methods as long as any changes or selected
options in the methods are documented and included in the methods manual (see 5.10.1.2).
c)	Copies of all SOPs shall be accessible to all personnel.
d)	The SOPs shall be organized.
e)	Each SOP shall clearly indicate the effective date of the document, the revision number and the
signature(s) of the approving authority.

5.10.1.2 Laboratory Method Manual(s)
a)	The laboratory shall have and maintain an in-house methods manual(s) for each accredited
analyte or test method.
b)	This manual may consist of copies of published or referenced test methods or standard operating
procedures that have been written by the laboratory. In cases where modifications to the
published method have been made by the laboratory or where the referenced test method is
ambiguous or provides insufficient detail, these changes or clarifications shall be clearly
described. Each test method shall include or reference where applicable:
1)	identification of the test method;
2)	applicable matrix or matrices;
3)	detection limit;
4)	scope and application, including components to be analyzed;
5)	summary of the test method;
6)	definitions;
7)	interferences;
8)	safety;
9)	equipment and supplies;
10)	reagents and standards;
11)	sample collection, preservation, shipment and storage;
12)	quality control;
13)	calibration and standardization;
14)	procedure;
15)	calculations;
16)	method performance;
17)	pollution prevention;
18)	data assessment and acceptance criteria for quality control measures;
19)	corrective actions for out-of-control data;
20)	contingencies for handling out-of-control or unacceptable data;
21)	waste management;
22)	references; and,
23)	any tables, diagrams, flowcharts and validation data.
5.10.2 Test Methods
The laboratory shall use appropriate test methods and procedures for all tests and related activities
within its responsibility (including sample collection, sample handling, transport and storage, sample
preparation and sample analysis). The method and procedures shall be consistent with the accuracy
required, and with any standard specifications relevant to the calibrations or tests concerned.
a)	When the use of reference test methods for a sample analysis are mandated or requested, only
those methods shall be used.
b)	Where test methods are employed that are not required, as in the Performance Based
Measurement System approach, the methods shall be fully documented and validated (see
5.10.2.1 and Appendix C), and be available to the client and other recipients of the relevant
reports.
5.10.2.1 Demonstration of Capability
a) Prior to acceptance and institution of any test method, satisfactory demonstration of method
capability is required. (See Appendix C and 5.6.2.b.) In general, this demonstration does not test

the performance of the method in real world samples, but in the applicable and available clean
matrix (a sample of a matrix in which no target analytes or interferences are present at
concentrations that impact the results of a specific test method), e.g., water, solids, biological
tissue and air. In addition, for analytes which do not lend themselves to spiking, the
demonstration of capability may be performed using quality control samples.
b)	Thereafter, continuing demonstration of method performance, as per the quality control
requirements in Appendix D (such as laboratory control samples) is required.
c)	In cases where a laboratory analyzes samples using a test method that has been in use by the
laboratory before July 1999, and there have been no significant changes in instrument type,
personnel or test method, the continuing demonstration of method performance and the analyst's
documentation of continued proficiency shall be acceptable. The laboratory shall have records
on file to demonstrate that a demonstration of capability is not required.
d)	In all cases, the appropriate forms such as the Certification Statement (Appendix C) must be
completed and retained by the laboratory to be made available upon request. All associated
supporting data necessary to reproduce the analytical results summarized in the Certification
Statement must be retained by the laboratory. (See Appendix C for Certification Statement.)
e)	A demonstration of capability must be completed each time there is a change in instrument type,
personnel, or test method.
f)	In laboratories with a specialized "work cell(s)" (a group consisting of analysts with specifically
defined tasks that together perform the test method), the group as a unit must meet the above
criteria and this demonstration of capability must be fully documented.
g)	When a work cell(s) is employed, and the members of the cell change, the new employee(s) must
work with experienced analyst(s) in that area of the work cell where they are employed. This new
work cell must demonstrate acceptable performance through acceptable continuing performance
checks (appropriate sections of Appendix D, such as laboratory control samples). Such
performance must be documented and the four preparation batches following the change in
personnel must not result in the failure of any batch acceptance criteria, e.g., method blank and
laboratory control sample, or the demonstration of capability must be repeated. In addition, if the
entire work cell is changed/replaced, the work cell must perform the demonstration of capability
(Appendix C).
h)	When a work cell(s) is employed the performance of the group must be linked to the training
record of the individual members of the work cell (see section 5.6.2).
5.10.3	Sample Aliquots
Where sampling (as in obtaining sample aliquots from a submitted sample) is carried out as part of
the test method, the laboratory shall use documented procedures and appropriate techniques to
obtain representative subsamples.
5.10.4	Data Verification
Calculations and data transfers shall be subject to appropriate checks.
a) The laboratory shall establish Standard Operating Procedures to ensure that the reported data are
free from transcription and calculation errors.

b)	The laboratory shall establish Standard Operating Procedures to ensure that all quality control
measures are reviewed and evaluated before data are reported.
c)	The laboratory shall establish Standard Operating Procedures addressing manual calculations
including manual integrations.
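One hedged illustration of such a check (not prescribed by the standard): recompute a reported concentration from the stored raw response and the calibration factors, and flag any disagreement with the transcribed value. The record fields and the tolerance are illustrative assumptions.

    def verify_reported_result(record, tolerance=0.001):
        # record: dict with 'response', 'slope', 'intercept', 'dilution', 'reported'.
        # Returns the recomputed value and whether it matches the reported one.
        recomputed = ((record["response"] - record["intercept"])
                      / record["slope"]) * record["dilution"]
        ok = abs(recomputed - record["reported"]) <= tolerance * abs(recomputed)
        return recomputed, ok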
5.10.5	Documentation and Labeling of Standards and Reagents
Documented procedures shall exist for the purchase, reception and storage of consumable materials
used for the technical operations of the laboratory.
a)	The laboratory shall retain records for all standards, reagents and media including the
manufacturer/vendor, the manufacturer's Certificate of Analysis or purity (if supplied), the date
of receipt, recommended storage conditions, and an expiration date after which the material shall
not be used unless its reliability is verified by the laboratory.
b)	Original containers (such as provided by the manufacturer or vendor) shall be labeled with an
expiration date.
c)	Records shall be maintained on reagent and standard preparation. These records shall indicate
traceability to purchased stocks or neat compounds, reference to the method of preparation, date
of preparation, expiration date and preparer's initials.
d)	All containers of prepared reagents and standards must bear a unique identifier and expiration
date and be linked to the documentation requirements in 5.10.5.c above.
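A sketch of a record structure that captures the fields required above; the class and field names are illustrative assumptions, not prescribed by the standard.

    from dataclasses import dataclass
    from datetime import date
    from typing import List, Optional

    @dataclass
    class PreparedStandardRecord:
        unique_id: str                  # identifier on the container (item d)
        source_stock_ids: List[str]     # traceability to purchased stocks (item c)
        preparation_method_ref: str     # reference to the method of preparation
        prepared_on: date
        expires_on: date
        preparer_initials: str
        vendor: Optional[str] = None
        certificate_of_analysis: Optional[str] = None  # if supplied (item a)

        def usable_on(self, when: date) -> bool:
            # Expired material shall not be used unless its reliability
            # has been verified by the laboratory (item a).
            return when <= self.expires_on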
5.10.6	Computers and Electronic Data Related Requirements
Where computers, automated equipment, or microprocessors, are used for the capture, processing,
manipulation, recording, storage or retrieval of test data, the laboratory shall ensure that:
a)	all requirements of this Standard (i.e. Chapter 5) are met;
b)	computer software is tested and documented to be adequate for use, e.g., internal audits,
personnel training, focus point of QA and QC;
c)	procedures are established and implemented for protecting the integrity of data; such procedures
shall include, but not be limited to, integrity of data entry or capture, data storage, data
transmission and data processing;
d)	computer and automated equipment are maintained to ensure proper functioning and provided
with the environmental and operating conditions necessary to maintain the integrity of calibration
and test data; and,
e)	it establishes and implements appropriate procedures for the maintenance of security of data
including the prevention of unauthorized access to, and the unauthorized amendment of,
computer records.
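A minimal illustration of items c and e, assuming a digest-based integrity check: a SHA-256 digest recorded when a data file is archived can later reveal unauthorized amendment of the stored record.

    import hashlib

    def file_digest(path):
        # SHA-256 hex digest of a stored data file.
        h = hashlib.sha256()
        with open(path, "rb") as fh:
            for chunk in iter(lambda: fh.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    def archive_unchanged(path, recorded_digest):
        # True if the archived file still matches the digest recorded at storage time.
        return file_digest(path) == recorded_digest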
5.11 SAMPLE HANDLING, SAMPLE ACCEPTANCE POLICY AND SAMPLE RECEIPT
While the laboratory may not have control of field sampling activities, the following are essential to
ensure the validity of the laboratory's data.

5.11.1	Sample Tracking
a)	The laboratory shall have a documented system for uniquely identifying the items to be tested,
to ensure that there can be no confusion regarding the identity of such items at any time. This
system shall include identification for all samples, subsamples and subsequent extracts and/or
digestates. The laboratory shall assign a unique identification (ID) code to each sample container
received in the laboratory. The use of container shape, size or other physical characteristic, such
as amber glass or purple top, is not an acceptable means of identifying the sample.
b)	This laboratory code shall maintain an unequivocal link with the unique field ID code assigned
each container.
c)	The laboratory ID code shall be placed on the sample container as a durable label.
d)	The laboratory ID code shall be entered into the laboratory records (see 5.11.3.d) and shall be
the link that associates the sample with related laboratory activities such as sample preparation
or calibration.
e)	In cases where the sample collector and analyst are the same individual or the laboratory
preassigns numbers to sample containers, the laboratory ID code may be the same as the field
ID code.
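A minimal sketch of such a log-in scheme; the ID format and class name are illustrative assumptions. Each container receives a unique laboratory ID code that remains unequivocally linked to its field ID code.

    import itertools

    class SampleLog:
        def __init__(self, prefix="LAB"):
            self._prefix = prefix
            self._counter = itertools.count(1)
            self._field_by_lab = {}

        def log_in(self, field_id):
            # Assign the next unique laboratory ID code (item a) and link it
            # to the field ID code on the container (item b).
            lab_id = "{}-{:06d}".format(self._prefix, next(self._counter))
            self._field_by_lab[lab_id] = field_id
            return lab_id

        def field_id_for(self, lab_id):
            return self._field_by_lab[lab_id]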
5.11.2	Sample Acceptance Policy
The laboratory must have a written sample acceptance policy that clearly outlines the circumstances
under which samples shall be accepted or rejected. Data from any samples which do not meet the
following criteria must be flagged in an unambiguous manner clearly defining the nature and
substance of the variation. This sample acceptance policy shall be made available to sample
collection personnel and shall include, but is not limited to, the following areas of concern:
a)	Proper, full, and complete documentation, which shall include sample identification, the location,
date and time of collection, collector's name, preservation type, sample type and any special
remarks concerning the sample;
b)	Proper sample labeling to include unique identification and a labeling system for the samples with
requirements concerning the durability of the labels (water resistant) and the use of indelible ink;
c)	Use of appropriate sample containers;
d)	Adherence to specified holding times;
e)	Adequate sample volume. Sufficient sample volume must be available to perform the necessary
tests; and
f)	Procedures to be used when samples show signs of damage, contamination or inadequate
preservation.
5.11.3	Sample Receipt Protocols
a) Upon receipt, the condition of the sample, including any abnormalities or departures from
standard condition as prescribed in the relevant test method, shall be recorded. All items
specified in 5.11.2 above shall be checked.

1)	All samples which require thermal preservation shall be considered acceptable if the arrival
temperature is either within 2°C of the required temperature or within the method-specified
range. For samples with a specified temperature of 4°C, samples with a temperature ranging
from just above the freezing temperature of water to 6°C shall be acceptable. Samples that
are hand delivered to the laboratory immediately after collection may not meet these criteria.
In these cases, the samples shall be considered acceptable if there is evidence that the
chilling process has begun, such as arrival on ice. (A sketch of this check follows item 2 below.)
2)	The laboratory shall implement procedures for checking chemical preservation using readily
available techniques, such as pH or free chlorine, prior to or during sample preparation or
analysis.
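A minimal sketch of the thermal-preservation check in item 1 above; treating "just above freezing" as greater than 0°C is an assumption made for illustration, and the function name is not part of the standard.

    def thermal_preservation_ok(arrival_temp_c, required_temp_c,
                                evidence_of_chilling=False):
        # Acceptable if within 2 C of the required temperature; for a specified
        # 4 C, anything above freezing up to 6 C is acceptable.
        if required_temp_c == 4.0:
            if 0.0 < arrival_temp_c <= 6.0:
                return True
        elif abs(arrival_temp_c - required_temp_c) <= 2.0:
            return True
        # Hand-delivered samples may be accepted if chilling has begun (e.g., on ice).
        return evidence_of_chilling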
b)	The results of all checks shall be recorded.
c)	Where there is any doubt as to the item's suitability for testing, where the sample does not
conform to the description provided, or where the test required is not fully specified, the laboratory
shall attempt to consult the client for further instruction before proceeding. The laboratory shall
establish whether the sample has received all necessary preparation, or whether the client
requires preparation to be undertaken or arranged by the laboratory. If the sample does not meet
the sample receipt acceptance criteria listed in this standard, the laboratory shall either:
1)	Retain correspondence and/or records of conversations concerning the final disposition of
rejected samples; or
2)	Fully document any decision to proceed with the analysis of samples not meeting acceptance
criteria.
i.	The condition of these samples shall, at a minimum, be noted on the chain of custody or
transmittal form and laboratory receipt documents.
ii.	The analysis data shall be appropriately "qualified" on the final report.
d)	The laboratory shall utilize a permanent chronological record such as a log book or electronic
database to document receipt of all sample containers.
1)	This sample receipt log shall record the following:
i.	Client/Project Name,
ii.	Date and time of laboratory receipt,
iii.	Unique laboratory ID code (see 5.11.1), and,
iv.	Signature or initials of the person making the entries.
2)	During the log-in process, the following information must be unequivocally linked to the log
record or included as a part of the log. If such information is recorded/documented
elsewhere, the records shall be part of the laboratory's permanent records, easily retrievable
upon request and readily available to individuals who will process the sample. Note: the
placement of the laboratory ID number on the sample container is not considered a
permanent record.
i. The field ID code which identifies each container must be linked to the laboratory ID code
in the sample receipt log.

ii.	The date and time of sample collection must be linked to the sample container and to the
date and time of receipt in the laboratory.
iii.	The requested analyses (including applicable approved test method numbers) must be
linked to the laboratory ID code.
iv.	Any comments resulting from inspection for sample rejection shall be linked to the
laboratory ID code.
e) All documentation, such as memos or transmittal forms, that is transmitted to the laboratory by
the sample transmitter shall be retained.
f)	A complete chain of custody record form (Section 5.12.3 and Appendix E), if utilized, shall be
maintained.
5.11.4	Storage Conditions
The laboratory shall have documented procedures and appropriate facilities to avoid deterioration,
contamination, or damage to the sample during storage, handling, preparation, and testing; any
relevant instructions provided with the item shall be followed. Where items have to be stored or
conditioned under specific environmental conditions, these conditions shall be maintained, monitored
and recorded.
a)	Samples shall be stored according to the conditions specified by preservation protocols:
1)	Samples which require thermal preservation shall be stored under refrigeration within ±2°C
of the specified preservation temperature unless method specific criteria exist. For samples
with a specified storage temperature of 4°C, storage at a temperature above the freezing
point of water to 6°C shall be acceptable.
2)	Samples shall be stored away from all standards, reagents, food and other potentially
contaminating sources. Samples shall be stored in such a manner to prevent cross
contamination.
b)	Sample fractions, extracts, leachates and other sample preparation products shall be stored
according to 5.11.4.a above or according to specifications in the test method.
c)	Where a sample or portion of the sample is to be held secure (for example, for reasons of record,
safety or value, or to enable check calibrations or tests to be performed later), the laboratory shall
have storage and security arrangements that protect the condition and integrity of the secured
items or portions concerned.
5.11.5	Sample Disposal
The laboratory shall have standard operating procedures for the disposal of samples, digestates,
leachates and extracts or other sample preparation products.
5.12 RECORDS
The laboratory shall maintain a record system to suit its particular circumstances and comply with any
applicable regulations. The system shall produce unequivocal, accurate records which document all
laboratory activities. The laboratory shall retain all original observations, calculations and derived
data, calibration records and a copy of the test report for a minimum of five years.

There are two levels of sample handling: 1) sample tracking and 2) legal chain of custody protocols,
which are used for evidentiary or legal purposes. All essential requirements for sample tracking (e.g.,
chain of custody form) are outlined in Sections 5.12.1, 5.12.2 and 5.12.3. If a client specifies that a
sample will be used for evidentiary purposes, then a laboratory shall have a written SOP for how that
laboratory will carry out legal chain of custody (for example, ASTM D 4840-95 and the Manual for the
Certification of Laboratories Analyzing Drinking Water, March 1997, Appendix A).
5.12.1	Record Keeping System and Design
The record keeping system must allow historical reconstruction of all laboratory activities that
produced the analytical data. The history of the sample must be readily understood through the
documentation. This shall include interlaboratory transfers of samples and/or extracts.
a)	The records shall include the identity of personnel involved in sampling, sample receipt,
preparation, calibration or testing.
b)	All information relating to the laboratory facilities, equipment, analytical test methods, and related
laboratory activities, such as sample receipt, sample preparation, or data verification shall be
documented.
c)	The record keeping system shall facilitate the retrieval of all working files and archived records
for inspection and verification purposes, e.g., a set format for naming electronic files.
d)	All changes to records shall be signed or initialed by responsible staff. The reason for the
signature or initials shall be clearly indicated in the records such as "sampled by," "prepared by,"
or "reviewed by."
e)	All generated data except those that are generated by automated data collection systems, shall
be recorded directly, promptly and legibly in permanent ink.
f)	Entries in records shall not be obliterated by methods such as erasures, overwritten files or
markings. All corrections to record-keeping errors shall be made by one line marked through the
error. The individual making the correction shall sign (or initial) and date the correction. These
criteria also shall apply to electronically maintained records.
g) Refer to 5.10.6 for Computer and Electronic Data.
5.12.2	Records Management and Storage
a)	All records (including those pertaining to calibration and test equipment), certificates and reports
shall be safely stored, held secure and in confidence to the client. NELAP-related records shall
be available to the accrediting authority.
b)	All records, including those specified in 5.12.3 shall be retained for a minimum of five years from
generation of the last entry in the records. All information necessary for the historical
reconstruction of data must be maintained by the laboratory. Records which are stored only on
electronic media must be supported by the hardware and software necessary for their retrieval.
c)	Records that are stored or generated by computers or personal computers shall have hard copy
or write-protected backup copies.
d)	The laboratory shall establish a record management system for control of laboratory notebooks,
instrument logbooks, standards logbooks, and records for data reduction, validation storage and
reporting.

e) Access to archived information shall be documented with an access log. These records shall be
protected against fire, theft, loss, environmental deterioration, vermin and, in the case of
electronic records, electronic or magnetic sources.
f)	The laboratory shall have a plan to ensure that the records are maintained or transferred
according to the clients' instructions (see 4.1.8.e) in the event that a laboratory transfers
ownership or goes out of business. In addition, in cases of bankruptcy, appropriate regulatory
and state legal requirements concerning laboratory records must be followed.
5.12.3 Laboratory Sample Tracking
5.12.3.1	Sample Handling
A record of all procedures to which a sample is subjected while in the possession of the laboratory
shall be maintained. These shall include but are not limited to all records pertaining to:
a)	Sample preservation including appropriateness of sample container and compliance with holding
time requirement;
b)	Sample identification, receipt, acceptance or rejection and log-in;
c)	Sample storage and tracking including shipping receipts, sample transmittal forms, (chain of
custody form); and
d)	The laboratory shall have documented procedures for the receipt and retention of test items,
including all provisions necessary to protect the integrity of samples.
5.12.3.2	Laboratory Support Activities
In addition to documenting all the above-mentioned activities, the following shall be retained:
a)	All original raw data, whether hard copy or electronic, for calibrations, samples and quality control
measures, including analysts' work sheets and data output records (chromatograms, strip charts,
and other instrument response readout records);
b)	A written description or reference to the specific test method used which includes a description
of the specific computational steps used to translate parametric observations into a reportable
analytical value;
c)	Copies of final reports;
d)	Archived standard operating procedures;
e)	Correspondence relating to laboratory activities for a specific project;
f)	All corrective action reports, audits and audit responses;
g)	Proficiency test results and raw data; and,
h) Results of data review, verification, and cross-checking procedures.

5.12.3.3	Analytical Records
The essential information to be associated with analysis, such as strip charts, tabular printouts,
computer data files, analytical notebooks, and run logs, shall include:
a)	Laboratory sample ID code;
b)	Date of analysis, and time of analysis if the holding time is 72 hours or less or when
time-critical steps are included in the analysis, e.g., extractions and incubations;
c)	Instrumentation identification and instrument operating conditions/parameters (or reference to
such data);
d)	Analysis type;
e)	All manual calculations, e.g., manual integrations; and,
f)	Analyst's or operator's initials/signature;
g)	Sample preparation including cleanup, separation protocols, incubation periods or subculture, ID
codes, volumes, weights, instrument printouts, meter readings, calculations, reagents;
h)	Sample analysis;
i)	Standard and reagent origin, receipt, preparation, and use;
j) Calibration criteria, frequency and acceptance criteria;
k) Data and statistical calculations, review, confirmation, interpretation, assessment and reporting
conventions;
l)	Quality control protocols and assessment;
m) Electronic data security, software documentation and verification, software and hardware audits,
backups, and records of any changes to automated data entries;
n) Method performance criteria including expected quality control requirements.
5.12.3.4	Administrative Records
The following shall be maintained:
a)	Personnel qualifications, experience and training records;
b)	Records of demonstration of capability for each analyst; and
c)	A log of names, initials and signatures for all individuals who are responsible for signing or
initialing any laboratory record.
5.13 LABORATORY REPORT FORMAT AND CONTENTS
The results of each test, or series of tests carried out by the laboratory shall be reported accurately,
clearly, unambiguously and objectively. The results shall normally be reported in a test report and
shall include all the information necessary for the interpretation of the test results and all information

required by the method used. Some regulatory reporting requirements or formats such as monthly
operating reports may not require all items listed below; however, the laboratory shall provide all the
required information to their client for use in preparing such regulatory reports.
a) Except as discussed in 5.13.b, each report to an outside client shall include at least the following
information (those prefaced with "where relevant" are not mandatory):
1)	a title, e.g., "Test Report", or "Test Certificate", "Certificate of Results" or "Laboratory
Results";
2)	name and address of laboratory, and location where the test was carried out if different from
the address of the laboratory and phone number with name of contact person for questions;
3)	unique identification of the certificate or report (such as serial number) and of each page, and
the total number of pages;
This requirement may be presented in several ways:
i.	The total number of pages may be listed on the first page of the report as long as the
subsequent pages are identified by the unique report identification and consecutive
numbers, or
ii.	Each page is identified with the unique report identification, and the pages are identified as
a number of the total report pages (example: 3 of 10, or 1 of 20).
Other methods of identifying the pages in the report may be acceptable as long as it is clear
to the reader that discrete pages are associated with a specific report, and that the report
contains a specified number of pages.
4)	name and address of client, where appropriate and project name if applicable;
5)	description and unambiguous identification of the tested sample including the client
identification code;
6)	identification of test results derived from any sample that did not meet NELAC sample
acceptance requirements such as improper container, holding time, or temperature;
7)	date of receipt of sample, date and time of sample collection, date(s) of performance of test,
and time of sample preparation and/or analysis if the required holding time for either activity
is less than or equal to 72 hours;
8)	identification of the test method used, or unambiguous description of any non-standard
method used;
9)	if the laboratory collected the sample, reference to sampling procedure;
10)	any deviations from (such as failed quality control), additions to or exclusions from the test
method (such as environmental conditions), and any non-standard conditions that may have
affected the quality of results, and including the use and definitions of data qualifiers;
11)	measurements, examinations and derived results, supported by tables, graphs, sketches and
photographs as appropriate, and any failures identified; identify whether data are calculated
on a dry weight or wet weight basis; identify the reporting units such as µg/L or mg/kg; and
for Whole Effluent Toxicity, identify the statistical package used to provide data;

12)	when required, a statement of the estimated uncertainty of the test result;
13)	a signature and title, or an equivalent electronic identification of the person(s) accepting
responsibility for the content of the certificate or report (however produced), and date of
issue;
14)	at the laboratory's discretion, a statement to the effect that the results relate only to the items
tested or to the sample as received by the laboratory;
15)	at the laboratory's discretion, a statement that the certificate or report shall not be reproduced
except in full, without the written approval of the laboratory;
16)	clear identification of all test data provided by outside sources, such as subcontracted
laboratories, clients, etc; and,
17)	clear identification of numerical results with values outside of quantitation limits.
b)	Laboratories that are operated by a facility and whose sole function is to provide data to the
facility management for compliance purposes (in-house or captive laboratories) shall have all
applicable information specified in 1 through 17 above readily available for review by the
accrediting authority. However, formal reports detailing the information are not required if:
1)	The in-house laboratory is itself responsible for preparing the regulatory reports; or
2)	The laboratory provides information to another individual within the organization for
preparation of regulatory reports. The facility management must ensure that the appropriate
report items are in the report to the regulatory authority if such information is required.
c)	Where the certificate or report contains results of tests performed by subcontractors, these results
shall be clearly identified by subcontractor name or applicable accreditation number.
d)	After issuance of the report, the laboratory report shall remain unchanged. Material amendments
to a calibration certificate, test report or test certificate after issue shall be made only in the form
of a further document, or data transfer including the statement "Supplement to Test Report or
Test Certificate, serial number ... [or as otherwise identified]", or equivalent form of wording.
Such amendments shall meet all the relevant requirements of this Standard.
e)	The laboratory shall notify clients promptly, in writing, of any event such as the identification of
defective measuring or test equipment that casts doubt on the validity of results given in any
calibration certificate, test report or test certificate or amendment to a report or certificate.
f)	The laboratory shall, where clients require transmission of test results by telephone, telex,
facsimile or other electronic or electromagnetic means, follow documented procedures that
ensure that the requirements of this Standard are met and that all reasonable steps are taken to
preserve confidentiality.
g)	Laboratories accredited to be in compliance with these standards shall certify that the test results
meet all requirements of NELAC or provide reasons and/or justification if they do not.

5.14	SUBCONTRACTING ANALYTICAL SAMPLES
a)	The laboratory shall advise the client in writing of its intention to subcontract any portion of the
testing to another party.
b)	Where a laboratory subcontracts any part of the testing covered under NELAP, this work shall
be placed with a laboratory accredited under NELAP for the tests to be performed or with a
laboratory that meets applicable statutory and regulatory requirements for performing the tests
and submitting the results of tests performed. The laboratory performing the subcontracted work
shall be indicated in the final report and non-NELAP accredited work shall be clearly identified.
c)	The laboratory shall retain records demonstrating that the above requirements have been met.
5.15	OUTSIDE SUPPORT SERVICES AND SUPPLIES
a)	Where the laboratory procures outside services and supplies, other than those referred to in this
Standard, in support of tests, the laboratory shall use only those outside support services and
supplies that are of adequate quality to sustain confidence in the laboratory's tests.
b)	Where no independent assurance of the quality of outside support services or supplies is
available, the laboratory shall have procedures to ensure that purchased equipment, materials
and services comply with specified requirements. The laboratory shall ensure that purchased
equipment and consumable materials are not used until they have been inspected, calibrated or
otherwise verified as complying with any standard specifications relevant to the calibrations or
tests concerned.
c)	The laboratory shall maintain records of all suppliers from whom it obtains support services or
supplies required for tests.
5.16	COMPLAINTS
The laboratory shall have documented policy and procedures for the resolution of complaints received
from clients or other parties about the laboratory's activities. Where a complaint, or any other
circumstance, raises doubt concerning the laboratory's compliance with the laboratory's policies or
procedures, or with the requirements of this Standard or otherwise concerning the quality of the
laboratory's calibrations or tests, the laboratory shall ensure that those areas of activity and
responsibility involved are promptly audited in accordance with Section 5.5.3.1. Records of the
complaint and subsequent actions shall be maintained.

-------
QUALITY SYSTEMS
APPENDIX A
REFERENCES

Appendix A - REFERENCES
40 CFR Part 136, Appendix A, paragraphs 8.1.1 and 8.2
American Association for Laboratory Accreditation April 1996. General Requirements for
Accreditation
"American National Standards Specification and Guidelines for Quality Systems for Environmental
Data Collection and Environmental Technology Programs (ANSI/ASQC E-4)," 1994
Catalog of Bacteria, American Type Culture Collection, Rockville, MD
EPA 2185 - Good Automated Laboratory Practices, 1995 available at
www.epa.gov/docs/etsdwe1/irm_galp/
"Glossary of Quality Assurance Terms and Acronyms", Quality Assurance Division, Office of
Research and Development, USEPA
"Guidance on the Evaluation of Safe Drinking Water Act Compliance Monitoring Results from
Performance Based Methods", September 30, 1994, Second draft.
International vocabulary of basic and general terms in metrology (VIM): 1984. Issued by BIPM,
IEC, ISO and OIML
ISO Guide 3534-1: "Statistics, vocabulary and symbols - Part 1: Probability and general statistical
terms"
ISO Guide 7218: Microbiology - General Guidance for Microbiological Examinations
ISO Guide 8402: 1986. Quality - Vocabulary
ISO Guide 9000: 1994 Quality management and quality assurance standards - Guidelines for
selection and use
ISO Guide 9001: 1994 Quality Systems - Model for quality assurance in design/development,
production, installation and servicing
ISO Guide 9002: 1994 Quality systems - Model for quality assurance in production and
installation
ISO/IEC Guide 2: 1986. General terms and their definitions concerning standardization and
related activities
ISO/IEC Guide 25: 1990. General requirements for the competence of calibration and testing
laboratories
"Laboratory Biosafety Manual", World Health Organization, Geneva, 1983
Manual for the Certification of Laboratories Analyzing Drinking Water Revision 4, EPA 815-B-97-
001
Manual of Methods for General Bacteriology, Philipp Gerhardt et al., American Society for
Microbiology, Washington, 1981

Performance Based Measurement System, EPA EMMC Method Panel, PBMS Workgroup, 1996
EPA/600/4-90/027F Methods for Measuring the Acute Toxicity of Effluents and Receiving Waters
to Freshwater and Marine Organisms, 4th Ed., Office of Research and Development, Washington,
DC, 1993.
EPA/600/4-91/002 Short-term Methods for Estimating the Chronic Toxicity of Effluents and
Receiving Waters to Freshwater Organisms, 3rd Ed., Office of Research and Development,
Washington, DC, 1994.
EPA/600/4-91/003 Short-term Methods for Estimating the Chronic Toxicity of Effluents and
Receiving Water to Marine and Estuarine Organisms, 2nd Ed., Office of Research and
Development, Washington, DC, 1994.
EPA/600/4-90/031 Manual for Evaluation of Laboratories Performing Aquatic Toxicity Tests.,
Office of Research and Development, Washington, DC, 1991.
EPA/600/R-94/025 Methods for Assessing the Toxicity of Sediment-associated Contaminants with
Estuarine and Marine Amphipods, Office of Research and Development, Washington, DC, 1994.
EPA/600/R-94/024 Methods for Measuring the Toxicity and Bioaccumulation of Sediment-
associated Contaminants with Freshwater Invertebrates, Office of Research and Development,
Washington, DC, 1994.
EPA/823/B-98/004 Evaluation of Dredged Material Proposed for Discharge in Waters of the U.S. -
Inland Testing Manual. Office of Water, Washington, DC, 1994.
EPA/503/8-91/001 Evaluation of Dredged Material Proposed for Ocean Disposal - Testing
Manual. Office of Water, Washington, DC, 1991.
EPA/600/3-88/029 Protocol for Short-term Toxicity Screening of Hazardous Wastes, Office of
Research and Development, Washington, DC, 1991.
EPA/600/3-89/013 Ecological Assessment of Hazardous Waste Sites, Office of Research and
Development, Washington, DC, 1991.
ASTM E1598-94 Conducting Early Seedling Growth Tests, American Society for Testing and
Materials, West Conshohocken, PA, 1999.
ASTM E1676-97 Conducting a Laboratory Soil Toxicity Test with the Lumbricid Earthworm Eisenia
foetida, American Society for Testing and Materials, West Conshohocken, PA, 1999.

-------
APPENDIX B
(Reserved)

-------
QUALITY SYSTEMS
APPENDIX C
DEMONSTRATION OF CAPABILITY

Appendix C - DEMONSTRATION OF CAPABILITY
C.1 PROCEDURE FOR DEMONSTRATION OF CAPABILITY
A demonstration of capability (DOC) must be made prior to using any test method, and at any time
there is a change in instrument type, personnel or test method (see 5.10.2.1).
Note: In laboratories with specialized "work cells" (a well defined group of analysts that together
perform the method analysis), the group as a unit must meet the above criteria and this demonstration
must be fully documented.
In general, this demonstration does not test the performance of the method in real world samples, but
in the applicable and available clean matrix (a sample of a matrix in which no target analytes or
interferences are present at concentrations that impact the results of a specific test method), e.g.,
water, solids, biological tissue and air. However, before any results are reported using this method,
actual sample spike results may be used to meet this standard, i.e., at least four consecutive matrix
spikes within the last twelve months. In addition, for analytes which do not lend themselves to
spiking, e.g., TSS, the demonstration of capability may be performed using quality control samples.
All demonstrations shall be documented through the use of the form in this appendix.
The following steps, which are adapted from the EPA test methods published in 40 CFR Part 136,
Appendix A, shall be performed if required by mandatory test method or regulation. Note: For
analytes for which spiking is not an option and for which quality control samples are not readily
available, the 40 CFR approach is one way to perform this demonstration. It is the responsibility of
the laboratory to document that other approaches to DOC are adequate; this shall be documented
in the laboratory's Quality Manual, e.g., for Whole Effluent Toxicity Testing see section D.2.1.a.1.
a)	A quality control sample shall be obtained from an outside source. If not available, the QC
sample may be prepared by the laboratory using stock standards that are prepared independently
from those used in instrument calibration.
b)	The analyte(s) shall be diluted in a volume of clean matrix sufficient to prepare four aliquots at
the concentration specified, or if unspecified, to a concentration approximately 10 times the
method-stated or laboratory-calculated method detection limit.
c)	At least four aliquots shall be prepared and analyzed according to the test method either
concurrently or over a period of days.
d)	Using all of the results, calculate the mean recovery (x̄) in the appropriate reporting units (such
as µg/L) and the standard deviation of the population sample (n-1) (in the same units) for each
parameter of interest. When it is not possible to determine mean and standard deviations, such
as for presence/absence and logarithmic values, the laboratory must assess performance
against established and documented criteria.
e)	Compare the information from (d) above to the corresponding acceptance criteria for precision
and accuracy in the test method (if applicable) or in laboratory-generated acceptance criteria (if
there are not established mandatory criteria). If all parameters meet the acceptance criteria, the
analysis of actual samples may begin. If any one of the parameters does not meet the acceptance
criteria, the performance is unacceptable for that parameter.
f)	When one or more of the tested parameters fail at least one of the acceptance criteria, the analyst
must proceed according to 1) or 2) below.

1)	Locate and correct the source of the problem and repeat the test for all parameters of interest
beginning with c) above.
2)	Beginning with c) above, repeat the test for all parameters that failed to meet criteria.
Repeated failure, however, confirms a general problem with the measurement system. If this
occurs, locate and correct the source of the problem and repeat the test for all compounds
of interest beginning with c).
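A minimal sketch of steps d) and e) above, assuming the acceptance criteria are expressed as a mean-recovery window and a maximum relative standard deviation; the actual limits come from the test method or from laboratory-generated criteria, and the function name is illustrative.

    import statistics

    def evaluate_doc(recoveries_pct, mean_limits, max_rsd_pct):
        # recoveries_pct: percent recoveries of the aliquots (at least four, step c).
        # mean_limits: (low, high) acceptance window for the mean recovery.
        if len(recoveries_pct) < 4:
            raise ValueError("at least four aliquots are required")
        mean = statistics.mean(recoveries_pct)
        stdev = statistics.stdev(recoveries_pct)  # sample (n-1) standard deviation
        rsd = 100.0 * stdev / mean
        acceptable = mean_limits[0] <= mean <= mean_limits[1] and rsd <= max_rsd_pct
        return mean, stdev, acceptable

For example, evaluate_doc([92.0, 97.5, 101.2, 95.8], (80.0, 120.0), 20.0) would return an acceptable result under these assumed limits.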
C.2 CERTIFICATION STATEMENT
The following certification statement shall be used to document the completion of each demonstration
of capability. A copy of the certification statement shall be retained in the personnel records of each
affected employee (see 5.6.3 and 5.12.3.4.b).

Demonstration of Capability
Certification Statement
Date:	Page	of	
Laboratory Name:
Laboratory Address:
Analyst(s) Name(s):
Matrix:
(examples: laboratory pure water, soil, air, solid, biological tissue)
Method number, SOP#, Rev#, and Analyte, or Class of Analytes or Measured
Parameters
(examples: barium by 200.7, trace metals by 6010, benzene by 8021, etc.)
We, the undersigned, CERTIFY that:
1.	The analysts identified above, using the cited test method(s), which is in use at
this facility for the analyses of samples under the National Environmental Laboratory
Accreditation Program, have met the Demonstration of Capability.
2.	The test method(s) was performed by the analyst(s) identified on this certification.
3.	A copy of the test method(s) and the laboratory-specific SOPs are available for all
personnel on-site.
4.	The data associated with the demonstration of capability are true, accurate,
complete and self-explanatory (1).
5.	All raw data (including a copy of this certification form) necessary to reconstruct
and validate these analyses have been retained at the facility, and that the associated
information is well organized and available for review by authorized assessors.
Technical Director's Name and Title	Signature	Date
Quality Assurance Officer's Name
Signature
Date

This certification form must be completed each time a demonstration of capability study is completed.
(1) True: Consistent with supporting data.
Accurate: Based on good laboratory practices consistent with sound scientific principles/practices.
Complete: Includes the results of all supporting performance testing.
Self-Explanatory: Data properly labeled and stored so that the results are clear and require no
additional explanation.

-------
QUALITY SYSTEMS
APPENDIX D
ESSENTIAL QUALITY CONTROL
REQUIREMENTS

Appendix D - ESSENTIAL QUALITY CONTROL REQUIREMENTS
The quality control protocols specified by the laboratory's method manual (5.10.1.2) shall be followed.
The laboratory shall ensure that the essential standards outlined in Appendix D are incorporated into
their method manuals and/or the Laboratory Quality Manual.
All quality control measures shall be assessed and evaluated on an on-going basis and quality control
acceptance criteria shall be used to determine the validity of the data. The laboratory shall have
procedures for the development of acceptance/rejection criteria where no method or regulatory criteria
exist.
The requirements from the body of Chapter 5, e.g., 5.5.4, apply to all types of testing. The specific
manner in which they are implemented is detailed in each of the sections of this Appendix, i.e.,
chemical testing, W.E.T. testing, microbiology testing, radiochemical testing and air testing.
D.1 CHEMICAL TESTING
D.1.1 Positive and Negative Controls
a) Negative Control - Method Performance

Purpose:	The method blank is used to assess the preparation batch for possible contamination during the preparation and processing steps. The method blank shall be processed along with and under the same conditions as the associated samples to include all steps of the analytical procedure. Procedures shall be in place to determine if a method blank is contaminated. Any affected samples associated with a contaminated method blank shall be reprocessed for analysis or the results reported with appropriate data qualifying codes.

Frequency:	The method blank shall be analyzed at a minimum of 1 per preparation batch. In those instances for which no separate preparation method is used (example: volatiles in water) the batch shall be defined as environmental samples that are analyzed together with the same method and personnel, using the same lots of reagents, not to exceed the analysis of 20 environmental samples.

Composition:	The method blank shall consist of a matrix that is similar to the associated samples and is known to be free of the analytes of interest.

Evaluation Criteria and Corrective Action:	While the goal is to have no detectable contaminants, each method blank must be critically evaluated as to the nature of the interference and the effect on the analysis of each sample within the batch. The source of contamination shall be investigated and measures taken to minimize or eliminate the problem and affected samples reprocessed or data shall be appropriately qualified if:

1. The concentration of a targeted analyte in the blank is at or above the reporting limit as established by the test method or by regulation, AND is greater than 1/10 of the amount measured in any sample.

2. The blank contamination otherwise affects the sample results as per the test method requirements or the individual project data quality objectives.
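A minimal sketch of condition 1 above, reading "any sample" as at least one sample in the associated batch; the function name is illustrative and not part of the standard.

    def method_blank_requires_action(blank_result, reporting_limit, sample_results):
        # Reprocessing or qualification is indicated when the blank is at or above
        # the reporting limit AND greater than 1/10 of the amount measured in any
        # associated sample.
        if blank_result < reporting_limit:
            return False
        return any(blank_result > 0.1 * s for s in sample_results)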

b) Positive Control - Method Performance
1) Laboratory Control Sample (LCS)
Purpose:	The LCS is used to evaluate the performance of the total analytical system, including all preparation and analysis steps. Results of the LCS are compared to established criteria and, if found to be outside of these criteria, indicate that the analytical system is "out of control". Any affected samples associated with an out of control LCS shall be reprocessed for re-analysis or the results reported with appropriate data qualifying codes.

Frequency:	The LCS shall be analyzed at a minimum of 1 per preparation batch. Exceptions would be for those analytes for which no spiking solutions are available, such as total suspended solids, total dissolved solids, total volatile solids, total solids, pH, color, odor, temperature, dissolved oxygen or turbidity. In those instances for which no separate preparation method is used (example: volatiles in water) the batch shall be defined as environmental samples that are analyzed together with the same method and personnel, using the same lots of reagents, not to exceed the analysis of 20 environmental samples.

Composition:	The LCS is a controlled matrix, known to be free of analytes of interest, spiked with known and verified concentrations of analytes. NOTE: the matrix spike may be used in place of this control as long as the acceptance criteria are as stringent as for the LCS. Alternatively, the LCS may consist of a media containing known and verified concentrations of analytes or a Certified Reference Material (CRM). All analyte concentrations shall be within the calibration range of the methods. The following shall be used in choosing components for the spike mixtures:

The components to be spiked shall be as specified by the mandated test method or other regulatory requirement or as requested by the client. In the absence of specified spiking components, the laboratory shall spike per the following:

For those components that interfere with an accurate assessment, such as spiking simultaneously with technical chlordane, toxaphene and PCBs, the spike should be chosen to represent the chemistries and elution patterns of the components to be reported.

For those test methods that have extremely long lists of analytes, a representative number may be chosen. The analytes selected should be representative of all analytes reported. The following criteria shall be used for determining the minimum number of analytes to be spiked. However, the laboratory shall insure that all targeted components are included in the spike mixture over a 2 year period.

a)	For methods that include 1-10 targets, spike all components;
b)	For methods that include 11-20 targets, spike at least 10 or 80%, whichever is greater;
c)	For methods with more than 20 targets, spike at least 16 components.

Evaluation Criteria and Corrective Action:	The results of the individual batch LCS are calculated in percent recovery. The laboratory shall document the calculation for percent recovery. The individual LCS is compared to the acceptance criteria as published in the mandated test method. Where there are no established criteria, the laboratory shall determine internal criteria and document the method used to establish the limits or utilize client specified assessment criteria.

A LCS that is determined to be within the criteria effectively establishes that the analytical system is in control and validates system performance for the samples in the associated batch. Samples analyzed along with a LCS determined to be "out of control" should be considered suspect and the samples reprocessed and re-analyzed or the data reported with appropriate data qualifying codes.
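A minimal sketch of the percent recovery calculation and the spiking-count rule above; the function names are illustrative, and rounding the 80% figure up to a whole number of analytes is an assumption of the sketch.

    def percent_recovery(measured, spiked_amount):
        # LCS percent recovery; the laboratory documents this calculation.
        return 100.0 * measured / spiked_amount

    def minimum_spiked_analytes(n_targets):
        # 1-10 targets: all; 11-20: at least 10 or 80% (rounded up), whichever
        # is greater; more than 20: at least 16.
        if n_targets <= 10:
            return n_targets
        if n_targets <= 20:
            return max(10, -(-n_targets * 4 // 5))
        return 16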
Sample Specific Controls
The laboratory must document procedures for determining the effect of the sample matrix
on method performance. These procedures relate to the analyses of matrix specific
Quality Control (QC) samples and are designed as data quality indicators for a specific
sample using the designated test method. These controls alone are not used to judge
laboratory performance.
Examples of matrix specific QC include: Matrix Spike (MS); Matrix Spike Duplicate (MSD);
sample duplicates; and surrogate spikes. The laboratory shall have procedures in place
for tracking, managing, and handling matrix specific QC criteria, including spiking
appropriate components at appropriate concentrations, calculating recoveries and relative
percent difference, and evaluating and reporting results based on performance of the QC
samples.
c)	Matrix Spike; Matrix Spike Duplicates:
Purpose:	Matrix specific QC samples indicate the effect of the sample matrix on the
precision and accuracy of the results generated using the selected
method. The information from these controls is sample/matrix specific and
would not normally be used to determine the validity of the entire batch.
Frequency: The frequency of the analysis of matrix specific samples shall be
determined as part of a systematic planning process (e.g. Data Quality
Objectives) or as specified by the mandated test method.

Composition: The components to be spiked shall be as specified by the mandated test
method. Any permit-specified analytes, analytes specified by regulation, or client-
requested analytes shall also be included. If there are no specified
components, the laboratory shall spike per the following:
For those components that interfere with an accurate assessment (such as
spiking simultaneously with technical chlordane, toxaphene and PCBs),
the spike mixture should be chosen to represent the chemistries and elution
patterns of the components to be reported.
For those test methods that have extremely long lists of analytes, a
representative number may be chosen using the following criteria for
determining the minimum number of analytes to be spiked. However, the laboratory
shall ensure that all targeted components are included in the spike mixture
over a two-year period.
a)	For methods that include 1-10 targets, spike all components;
b)	For methods that include 11-20 targets, spike at least 10 or 80%,
whichever is greater;
c)	For methods with more than 20 targets, spike at least 16
components.
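The a) through c) criteria above amount to a simple rule for the minimum number of analytes to spike; the following Python sketch is illustrative only and is not part of this standard.

    import math

    def minimum_spike_count(n_targets):
        """Minimum number of analytes to spike under criteria a) through c) above."""
        if n_targets <= 10:
            return n_targets                            # a) spike all components
        if n_targets <= 20:
            return max(10, math.ceil(0.8 * n_targets))  # b) at least 10 or 80%, whichever is greater
        return 16                                       # c) at least 16 components

    for n in (8, 15, 30):
        print(n, "targets -> spike at least", minimum_spike_count(n))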
Evaluation Criteria and Corrective Action:
The results from matrix spike/matrix spike duplicate are primarily designed to assess the
precision and accuracy of analytical results in a given matrix and are expressed as percent
recovery (%R) and relative percent difference (RPD). The laboratory shall document the
calculation for
relative percent difference.
Results are compared to the acceptance criteria as published in the
mandated test method. Where there are no established criteria, the
laboratory should determine internal criteria and document the method
used to establish the limits. For matrix spike results outside established
criteria, corrective action shall be documented or the data reported with
appropriate data qualifying codes.
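The percent recovery and relative percent difference calculations referenced above might be documented along the lines of the following Python sketch, which is illustrative only; the concentrations shown are hypothetical.

    def matrix_spike_recovery(spiked_result, unspiked_result, spike_added):
        """Matrix spike percent recovery (%R)."""
        return 100.0 * (spiked_result - unspiked_result) / spike_added

    def relative_percent_difference(result_1, result_2):
        """Relative percent difference (RPD) between MS and MSD results."""
        return 100.0 * abs(result_1 - result_2) / ((result_1 + result_2) / 2.0)

    # Hypothetical example: native sample 2.0 ug/L, spike added 10.0 ug/L,
    # MS result 11.4 ug/L, MSD result 10.6 ug/L.
    ms_r = matrix_spike_recovery(11.4, 2.0, 10.0)        # 94.0 %
    msd_r = matrix_spike_recovery(10.6, 2.0, 10.0)       # 86.0 %
    rpd = relative_percent_difference(11.4, 10.6)        # about 7.3 %
    print(f"MS %R = {ms_r:.1f}, MSD %R = {msd_r:.1f}, RPD = {rpd:.1f}%")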
d) Matrix Duplicates:
Purpose:	Matrix duplicates are defined as replicate aliquots of the same sample
taken through the entire analytical procedure. The results from this
analysis indicate the precision of the results for the specific sample using
the selected method. The matrix duplicate provides a usable measure of
precision only when target analytes are found in the sample chosen for
duplication.
Frequency: The frequency of the analysis of matrix duplicates may be determined as
part of a systematic planning process (e.g. Data Quality Objectives) or as
specified by the mandated test method.
Composition: Matrix duplicates are performed on replicate aliquots of actual samples.
The composition is usually not known.

Evaluation Criteria and Corrective Action:
The results from matrix duplicates are primarily designed to assess the precision of
analytical results in a given matrix and are expressed as relative percent difference (RPD)
or another statistical treatment (e.g., absolute differences). The laboratory shall document
the calculation for
relative percent difference or other statistical treatments.
Results are compared to the acceptance criteria as published in the
mandated test method. Where there are no established criteria, the
laboratory shall determine internal criteria and document the method used
to establish the limits. For matrix duplicate results outside established
criteria, corrective action shall be documented or the data reported with
appropriate data qualifying codes.
e)	Surrogate Spikes:
Purpose:	Surrogates are used most often in organic chromatography test methods
and are chosen to reflect the chemistries of the targeted components of
the method. Added prior to sample preparation/extraction, they provide a
measure of recovery for every sample matrix.
Frequency: Except where the matrix precludes its use or when not available,
surrogate compounds must be added to all samples, standards, and
blanks for all appropriate test methods.
Composition: Surrogate compounds are chosen to represent the various chemistries of
the target analytes in the method. They are often specified by the
mandated method and are deliberately chosen because they are unlikely to
occur as environmental contaminants. Often this is accomplished by
using deuterated analogs of select compounds.
Evaluation Criteria and Corrective Action:
The results are compared to the acceptance criteria as published in the mandated test
method. Where there are no established criteria, the laboratory should determine internal
criteria and document the method used to establish the limits. Surrogates outside the
acceptance criteria must be evaluated for the effect indicated for the individual sample results.
The appropriate corrective action may be guided by the data quality
objectives or other site specific requirements. Results reported from
analyses with surrogate recoveries outside the acceptance criteria should
include appropriate data qualifiers.
D.1.2 Detection Limits
The laboratory shall utilize a test method that provides a detection limit that is appropriate and
relevant for the intended use of the data. Detection limits shall be determined by the protocol in the
mandated test method or applicable regulation, e.g., Method Detection Limit (MDL). If the protocol
for determining detection limits is not specified, the selection of the procedure must reflect instrument
limitations and the intended application of the test method.
a) A detection limit study is not required for any component for which spiking solutions or quality
control samples are not available, such as temperature.

b)	The detection limit shall be initially determined for the compounds of interest in each test
method in a matrix in which there are no target analytes nor interferences at a concentration
that would impact the results or the detection limit must be determined in the matrix of interest
(see definition of matrix).
c)	Detection limits must be determined each time there is a change in the test method that
affects how the test is performed, or when a change in instrumentation occurs that affects the
sensitivity of the analysis.
d)	All sample processing steps of the analytical method shall be included in the determination
of the detection limit.
e)	All procedures used must be documented. Documentation must include the matrix type. All
supporting data must be retained.
f) The laboratory must have established procedures to relate detection limits with quantitation
limits.
g) The test method's quantitation limits must be established and must be above the detection
limits.
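As one example of a commonly mandated protocol, the Method Detection Limit procedure of 40 CFR Part 136, Appendix B multiplies the standard deviation of at least seven low-level spiked replicates by the one-sided 99% Student's t value. The Python sketch below assumes that protocol; the replicate results are hypothetical.

    import statistics

    # One-sided 99% Student's t values for df = n - 1 (table covers n = 7 to 11;
    # extend the table, or use scipy.stats.t.ppf(0.99, n - 1), for other n).
    T_99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821, 10: 2.764}

    def method_detection_limit(replicates):
        """MDL = t(n-1, 0.99) * s of at least seven low-level spiked replicates."""
        n = len(replicates)
        if n < 7:
            raise ValueError("at least seven replicate spiked results are required")
        return T_99[n - 1] * statistics.stdev(replicates)

    replicates = [0.52, 0.47, 0.55, 0.49, 0.51, 0.46, 0.53]   # hypothetical results, ug/L
    print(f"MDL = {method_detection_limit(replicates):.3f} ug/L")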
D.1.3 Data Reduction
The procedures for data reduction, such as use of linear regression, shall be documented.
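Where, for example, data reduction involves linear regression of a calibration curve, the documented procedure might resemble the following illustrative Python sketch (hypothetical calibration data; numpy's least-squares fit is one possible tool).

    import numpy as np

    # Hypothetical calibration standards (ug/L) and instrument responses.
    conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
    resp = np.array([48.0, 101.0, 197.0, 503.0, 998.0])

    slope, intercept = np.polyfit(conc, resp, 1)   # least-squares calibration line
    r = np.corrcoef(conc, resp)[0, 1]              # correlation coefficient

    def back_calculate(response):
        """Back-calculate a sample concentration from the calibration line."""
        return (response - intercept) / slope

    print(f"slope = {slope:.2f}, intercept = {intercept:.2f}, r = {r:.4f}")
    print(f"response 350 -> {back_calculate(350.0):.2f} ug/L")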
D.1.4 Quality of Standards and Reagents
a)	The source of standards shall comply with 5.9.2.
b)	Reagent Quality, Water Quality and Checks:
1)	Reagents - In methods where the purity of reagents is not specified, analytical reagent
grade shall be used. Reagents of lesser purity than those specified by the test method
shall not be used. The labels on the container should be checked to verify that the purity
of the reagents meets the requirements of the particular test method. Such information
shall be documented.
2)	Water - The quality of water sources shall be monitored and documented and shall meet
method specified requirements.
3)	The laboratory shall verify the concentration of titrants in accordance with written
laboratory procedures.
D.1.5 Selectivity
a)	Absolute retention time and relative retention time aid in the identification of components in
chromatographic analyses and in evaluating the effectiveness of a column in separating
constituents. The laboratory shall develop and document acceptance criteria for retention
time windows.
b)	A confirmation shall be performed to verify the compound identification when positive results
are detected on a sample from a location that has not been previously tested by the
laboratory. Such confirmations shall be performed on organic tests such as pesticides,

herbicides, or acid extractables, or when recommended by the analytical test method, except
when the analysis involves the use of a mass spectrometer. Confirmation is required unless
stipulated in writing by the client. All confirmation shall be documented.
c) The laboratory shall document acceptance criteria for mass spectral tuning.
D.1.6 Constant and Consistent Test Conditions
a)	The laboratory shall assure that the test instruments consistently operate within the
specifications required of the application for which the equipment is used.
b)	Glassware Cleaning - Glassware shall be cleaned to meet the sensitivity of the test method.
Any cleaning and storage procedures that are not specified by the test method shall be
documented in laboratory records and SOPs.
D.2 TOXICITY TESTING
These standards apply to laboratories measuring the toxicity and/or bioaccumulation of
contaminants in general. They are applicable to toxicity or bioaccumulation test methods for
evaluating effluents (whole effluent toxicity or WET), receiving waters, sediments, elutriates,
leachates and soils. In addition to the essential quality control standards described below,
some methods may have additional or other requirements based on factors such as the type
of matrix evaluated. Additional information can be found in the following methods manuals
(or most recent edition): EPA/600/4-91/002, EPA/600/4-91/003, EPA/600/4-90/027F (WET
testing), EPA/600/4-90/031 (general aquatic toxicity testing), EPA/600/R-94/025, EPA/600/R-
94/024, EPA/503/R-91/001, EPA/823/B-98/004 (sediments and elutriates), EPA/600/3-
88/029, EPA/600/3-89/013, ASTM E1598-94 and ASTM 1676-97 (soils).
D.2.1 Positive and Negative Controls
a) Positive Control - Reference Toxicants - Reference toxicant tests indicate the sensitivity of
the test organisms being used and demonstrate a laboratory's ability to obtain consistent
results with the test method.
1)	The laboratory must demonstrate its ability to obtain consistent results with reference
toxicants before it performs toxicity tests with effluents or other environmental samples
for regulatory compliance purposes.
i)	To meet this requirement, the intra-laboratory precision must be determined by
performing five or more acceptable reference toxicant tests for each test method and
species with different batches of organisms and appropriate negative controls (water,
sediment, or soil).
ii)	An intralaboratory coefficient of variation (%CV) is not established for each test
method. However, a testing laboratory shall maintain control charts for the control
performance and reference toxicant statistical endpoint (such as NOEC or ECp) and
shall evaluate the intralaboratory variability with a specific reference toxicant for each
test method.
2)	Ongoing laboratory performance shall be demonstrated by performing regular reference
toxicant tests for each test method and species in accordance with the minimum
frequency requirements specified in D.2.1.a.3.

i)	Intralaboratory precision on an ongoing basis must be determined through the use
of reference toxicant tests and plotted in quality control charts. The control charts
shall be plotted as point estimate values, such as EC25 for chronic tests and LC50
for acute tests, or as appropriate hypothesis test values, such as the NOEC or
NOAEC, over time within a laboratory.
ii)	For endpoints that are point estimates (ICp, ECp) control charts are constructed by
plotting the cumulative mean and the control limits which consist of the upper and
lower 95% confidence limits (+/- 2 std. dev.); these values are re-calculated with
each successive test result. For endpoints from hypothesis tests (NOEC, NOAEC)
the values are plotted directly and the control limits consist of one concentration
interval above and below the concentration representing central tendency (i.e. the
mode).
iii)	After 20 data points are collected for a test method and species, the control chart is
maintained using only the last 20 data points, i.e. each successive mean value and
control limit is calculated using only the last 20 values (an illustrative sketch follows item vi below).
iv)	Control chart limits are expected to be exceeded occasionally regardless of how well
a laboratory performs. Acceptance limits for point estimates (ICp, ECp) which are
based on 95% confidence limits should theoretically be exceeded for one in twenty
tests. Depending on the dilution factor and test sensitivity, control charts based on
hypothesis test values (NOEC, NOAEC) may be expected to be exceeded on a
similar frequency. Test results which fall outside of control chart limits at a frequency
of 5% or less, or which fall just outside control chart limits (especially in the case of
highly proficient laboratories which may develop relatively narrow acceptance limits
over time), are not rejected de facto. Such data are evaluated in comparison with
control chart characteristics including the width of the acceptance limits and the
degree of departure of the value from acceptance limits.
v)	Laboratories shall develop an acceptance/rejection policy for reference toxicant data
which considers test dilution factor, test sensitivity (for hypothesis test values),
testing frequency, out-of-control test frequency, relative width of acceptance limits
and degree of difference between test results and acceptance limits.
vi)	In the case of reference toxicant data which fails to meet acceptance criteria, the
results of environmental toxicity tests conducted during the affected period may be
suspect and regarded as provisional. In this case the test procedure is examined for
defects and the test repeated if necessary, using a different batch of organisms, as
soon as possible or the data is qualified.
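As an illustration of items ii and iii above (not part of this standard), a control chart point for a point-estimate endpoint can be evaluated against the cumulative mean and +/- 2 standard deviation limits of the most recent 20 results; the LC50 values in the Python sketch below are hypothetical.

    import statistics

    def control_limits(results, window=20):
        """Cumulative mean and +/- 2 standard deviation limits computed from the
        most recent results (at most 20), recalculated as each new result is added."""
        recent = results[-window:]
        mean = statistics.mean(recent)
        sd = statistics.stdev(recent)
        return mean, mean - 2 * sd, mean + 2 * sd

    # Hypothetical reference toxicant LC50 results (mg/L), oldest to newest:
    lc50s = [1.8, 2.1, 1.9, 2.3, 2.0, 1.7, 2.2, 2.4, 1.9, 2.0]
    mean, lower, upper = control_limits(lc50s)
    latest = lc50s[-1]
    print(f"mean = {mean:.2f}, limits = ({lower:.2f}, {upper:.2f}), "
          f"latest result in control: {lower <= latest <= upper}")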
3) The frequency of reference toxicant testing shall comply with the EPA or state permitting
authority requirements. The following minimum frequency shall be met:
i)	Each batch of test organisms obtained from an outside source, field collection or
from laboratory spawning of field-collected species not amenable to routine
laboratory culture (for example, sea urchins and bivalve mollusks) must be evaluated
with a reference toxicant test of the same type as the environmental toxicity test
within the seven days preceding the test or concurrently with the test.
ii)	Test organisms obtained from in-house laboratory cultures must be tested with
reference toxicant tests at least once each month for each test method. However,
if a given species produced by in-house cultures is used only monthly, or less

frequently, a reference toxicant test of the same type must be performed with each
environmental toxicity test.
iii) For test methods and species commonly used in the laboratory, but which are tested
on a seasonal basis (e.g. sea urchin fertilization tests), reference toxicant tests must
be conducted for each month the method is in use.
4)	These standards do not currently specify a particular reference toxicant and dilution
series; however, if the state or permitting authority identifies a reference toxicant or
dilution series for a particular test, the laboratory shall follow the specified requirements.
All reference toxicant tests conducted for a given test method and species must use the
same reference toxicant, test concentrations, dilution water and data analysis methods.
A dilution factor of 0.5x or greater shall be used for both acute and chronic tests.
5)	The reference toxicant tests shall be conducted following the same procedures as the
environmental toxicity tests for which the precision is being evaluated, unless otherwise
specified in the test method (for example, 10-day sediment tests employ 96-h water-only
reference toxicant tests). The test duration, dilution or control water, feeding, organism
age, age range and density, test volumes, renewal frequency, water quality
measurements, and the number of test concentrations, replicates and organisms per
replicate shall be the same as specified for the environmental toxicity test.
b) Negative Control - Control, Brine Control, Control Sediment, Control Soil or Dilution Water -
1)	The standards for the use, type and frequency of testing of negative controls are
specified by the test methods and by permit or regulation and shall be followed. A
negative control is included with each test.
2)	Appropriate additional negative controls shall be included when sample adjustments (for
example addition of sodium hydroxide for pH adjustment or thiosulfate for dechlorination)
or solvent carriers are used in the test.
3) Test Acceptability Criteria (TAC) - The test acceptability criteria (for example, the whole-
effluent chronic Ceriodaphnia test requires 80% or greater survival and an average of 15
young per female in the controls) as specified in the test method must be achieved for
both the reference toxicant and the effluent or environmental sample toxicity test. The
criteria shall be calculated and shall meet the method specified requirements for
performing toxicity tests.
D.2.2 Variability and/or Reproducibility
Intralaboratory precision shall be determined on an ongoing basis through the use of further reference
toxicant tests and related control charts as described in item D.2.1.a above.
D.2.3 Accuracy
This principle is not applicable to Toxicity Testing.

D.2.4 Test Sensitivity
a)	If the Dunnett's procedure is used, the statistical minimum significant difference (SMSD) shall
be calculated according to the formula specified by the test method and reported with the
test results.
b)	Estimate the SMSD for non-normal distributions and/or heterogeneous variances.
c)	Point estimates: (LCp, ICp, or ECp) - Confidence intervals shall be reported as a measure
of the precision around the point estimate value.
d)	The SMSD shall be calculated and reported only for hypothesis test values, such as the
NOEC or NOAEC.
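For illustration only, the statistical minimum significant difference for Dunnett's procedure is commonly computed as MSD = d * Sw * sqrt(1/n1 + 1/n2), where d is the critical value from Dunnett's tables, Sw is the square root of the within (error) mean square, and n1 and n2 are the control and treatment replicate counts; the values in the Python sketch below are hypothetical placeholders, and the formula in the mandated test method governs.

    import math

    def dunnett_msd(d_critical, s_within, n_control, n_treatment):
        """Minimum significant difference for Dunnett's procedure:
        MSD = d * Sw * sqrt(1/n1 + 1/n2)."""
        return d_critical * s_within * math.sqrt(1.0 / n_control + 1.0 / n_treatment)

    # Hypothetical inputs: d from Dunnett's table for the design in use,
    # Sw from the ANOVA error mean square, four replicates per concentration.
    msd = dunnett_msd(d_critical=2.36, s_within=3.1, n_control=4, n_treatment=4)
    print(f"SMSD = {msd:.2f} (same units as the measured endpoint)")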
D.2.5 Selection of Appropriate Statistical Analysis Methods
a)	If required, methods of data analysis and endpoints are specified by language in the
regulation, permit or the test method.
b)	Dose Response Curves - When required, the data shall be plotted in the form of a curve
relating the dose of the chemical or concentration of sample to cumulative percentage of test
organisms demonstrating a response such as death.
D.2.6 Selection and Use of Reagents and Standards
a)	The grade of all reagents used in toxicity tests is specified in the test method except the
reference standard. All reference standards shall be prepared from chemicals which are
analytical reagent grade or better. The preparation of all standards and reference toxicants
shall be documented.
b)	All standards and reagents associated with chemical measurements, such as dissolved
oxygen, pH or specific conductance, shall comply with the standards outlined in Section
5.9.4 above.
c)	Only reagent-grade water collected from distillation or deionization units (> 17 megohm
resistivity) is used to prepare reagents.
D.2.7 Selectivity
This principle is not applicable. The selectivity of the test is specified by permit or regulation.
D.2.8 Constant and Consistent Test Conditions
a)	If closed refrigerator-sized incubators are used, culturing and testing of organisms shall be
separated to avoid loss of cultures due to cross-contamination.
b)	Laboratory space must be adequate for the types and numbers of tests performed. The
building must provide adequate cooling, heating and illumination for conducting testing and
culturing; hot and cold running water must be available for cleaning equipment.
c)	Air used for aeration of test solutions, dilution waters and cultures must be free of oil and
fumes.

d) The laboratory or a contracted outside expert shall positively identify test organisms to
species on an annual basis. The taxonomic reference (citation and page(s)) and the
name(s) of the taxonomic expert(s) must be kept on file at the laboratory. When organisms
are obtained from an outside source, the supplier must provide this same information.
e) Instruments used for routine measurements of chemical and physical parameters such as pH,
DO, conductivity, salinity, alkalinity, hardness, chlorine, and weight shall be calibrated, and/or
standardized per manufacturer's instructions and Section 5.9.4. Temperature shall be
calibrated per section 5.9.4.2.1. All measurements and calibrations shall be documented.
f) Test temperature shall be maintained as specified for the test method. Temperature control
equipment must be adequate to maintain the required test temperature(s). The average daily
temperature of the test solutions must be maintained within 1°C of the selected test
temperature, for the duration of the test. The minimum frequency of measurement shall be
once per 24 hour period. The test temperature for continuous-flow toxicity tests shall be
recorded and monitored continuously.
g) Reagent grade water, prepared by any combination of distillation, reverse osmosis, ion
exchange, activated carbon and particle filtration, shall meet the following requirements as
verified by monthly measurement: conductivity less than or equal to 0.1 µmhos or resistivity
greater than or equal to 17 megohm, pH 5.5 to 7.5 S.U., and total residual chlorine non-
detectable.
h) The quality of the standard dilution water used for testing or culturing must be sufficient to
allow satisfactory survival, growth and reproduction of the test species as demonstrated by
routine reference toxicant tests and negative control performance. Water used for culturing
and testing shall be analyzed for toxic metals and organics whenever the minimum
acceptability criteria for control survival, growth or reproduction are not met and no other
cause, such as contaminated glassware or poor stock, can be identified. It is recognized that
the analyte lists of some methods manuals may not include all potential toxicants, are based
on estimates of chemical toxicity available at the time of publication and may specify
detection limits which are not achievable in all matrices. However, for those analytes not
listed, or for which the measured concentration or detection limit is greater than the method-
specified limit, the laboratory must demonstrate that the analyte at the measured
concentration or reported detection limit does not exceed one tenth the expected chronic
value for the most sensitive species tested and/or cultured. The expected chronic value is
based on professional judgment and the best available scientific data. The "USEPA
Ambient Water Quality Criteria Documents" and the EPA AQUIRE data base provide
guidance and data on acceptability and toxicity of individual metals and organic compounds.
i) For each new batch of laboratory-prepared or lot of commercial food used by the laboratory,
the performance of organisms fed with the new food shall be compared with the performance
of organisms fed with a food of known quality. If the food is used for culturing, its suitability
is determined using a measure that evaluates the effect of food quality on survival and growth
or reproduction of each of the relevant test species. Where applicable, foods used only in
chronic toxicity tests are evaluated using the reference toxicant regularly employed in the
laboratory QA program and compared with results of previous test(s) using a food of known
quality. In the case of algae, rotifers or other cultured foods, which are collected as a
continuous batch, the quality is assessed as described above, each time new nutrient stocks
are prepared, a new starter culture is employed or when a significant change in culture
conditions occurs. The laboratory shall have written procedures for the statistical evaluation
of food acceptance.

j) Food used to culture organisms used in bioaccumulation tests must be analyzed for the
compounds to be measured in the bioaccumulation tests.
k) Test chamber size and test solution volume shall be as specified in the test method. All test
chambers used in a test must be identical.
l) Test organisms shall be fed the quantity and type of food or nutrients specified in the test
method. They shall also be fed at the intervals specified in the test methods.
m) All organisms in a test must be from the same source. Where available, certified seeds are
used for soil tests.
n) All organisms used in tests, or used as broodstock to produce neonate test organisms (for
example cladocerans and larval fish), must appear healthy, show no signs of stress or
disease and exhibit acceptable survival (90% or greater) during the 24 hour period
immediately preceding use in tests.
o) All materials used for test chambers, culture tanks, tubing, etc. and coming in contact with
test samples, solutions, control water, sediment or soil or food must be non-toxic and cleaned
as described in the test methods. Materials must not reduce or add to sample toxicity.
Appropriate materials for use in toxicity testing and culturing are described in the referenced
manuals.
p) Light intensity shall be maintained as specified in the methods manuals. Measurements shall
be made and recorded on a yearly basis. Photoperiod shall be maintained as specified in the
test methods and shall be documented at least quarterly. For algal and plant tests, the light
intensity shall be measured and recorded at the start of each test.
q) At a minimum, during aquatic chronic testing DO and pH shall be measured daily in at least
one replicate of each concentration. In static-renewal tests DO must be measured at both
the beginning and end of each 24-h exposure period and may be measured in old and new
solutions prior to organism transfer, or after organism transfer; pH is measured at the end
of each exposure period (i.e. in old solutions).
r) The health and culturing conditions of all organisms used for testing shall be documented by
the testing laboratory. Such documentation shall include culture conditions (e.g. salinity,
hardness, temperature, pH) and observations of any stress, disease or mortality. When
organisms are obtained from an outside source, the laboratory shall obtain written
documentation of these water quality parameters and biological observations for each lot of
organism received. These observations shall adequately address the 24-hour time period
referenced in item D.2.8.n above. The laboratory shall also record each of these
observations and water quality parameters upon the arrival of the organisms at the testing
laboratory.
s) Age and the age range of the test organisms must be as specified in the test method.
Supporting information, such as hatch dates and times, times of brood releases and metrics
(for example, chironomid head capsule width) shall be documented.
t) The maximum holding time of effluents (elapsed time from sample collection to first use in
a test) shall not exceed 36 hours and the last use of the sample in test renewals shall not
exceed 72 hours without the permission of the permitting authority.

u) All samples shall be chilled to 4°C during or immediately after collection (see requirements
in section 5.11.3).
v) Organisms obtained from an outside source must be from the same batch. Chronic tests
shall have a minimum of four replicates per treatment.
w) The control population of Ceriodaphnia in chronic effluent or receiving water tests shall
contain no more than 20% males.
x) Dissolved oxygen and pH in aquatic tests shall be within acceptable range at test initiation
and aeration (minimal) is provided to tests if, and only if, acceptable dissolved oxygen
concentrations cannot be otherwise maintained or if specified by the test method.
y) The test soils or sediments must be within the geochemical tolerance range of the test
organism.
z) An individual test may be conditionally acceptable if temperature, dissolved oxygen, pH and
other specified conditions fall outside specifications, depending on the degree of the
departure and the objectives of the tests (see test conditions and test acceptability criteria
specified for each test method). The acceptability of the test shall depend on the experience
and professional judgment of the technical employee and the permitting authority.
D.3 MICROBIOLOGY TESTING
These standards apply to laboratories undertaking microbiological analysis of environmental samples.
Microbiological testing refers to and includes the detection, isolation, enumeration, or identification
of microorganisms and/or their metabolites, or determination of the presence or absence of growth
in materials and media.
D.3.1 Sterility Checks and Blanks, Positive and Negative Controls
a) Sterility Checks and Blanks
The laboratory shall demonstrate that the filtration equipment and filters, sample containers,
media and reagents have not been contaminated through improper handling or preparation,
inadequate sterilization, or environmental exposure.
1)	A sterility blank shall be analyzed for each lot of pre-prepared, ready-to-use medium
(including chromofluorogenic reagent) and for each batch of medium prepared in the
laboratory. This shall be done prior to first use of the medium.
2)	For each filtration series in the filtration technique, the laboratory shall prepare at least one
beginning and one ending sterility check. When an interruption of more than 30 minutes
occurs, the filtration funnels shall be re-sterilized.
3)	For pour plate technique, sterility blanks of the medium shall be made by pouring, at a
minimum, one uninoculated plate for each lot of pre-prepared, ready-to-use media and for
each batch of medium prepared in the laboratory.
4)	Sterility checks on sample containers shall be performed on at least one container for each
lot of purchased, pre-sterilized containers. For containers prepared and sterilized in the
laboratory, a sterility check shall be performed on one container per sterilized batch with non-
selective growth media.

5)	A sterility blank shall be performed on each batch of dilution water prepared in the laboratory
and on each batch of pre-prepared, ready-to-use dilution water with non-selective growth
media.
6)	At least one filter from each new lot of membrane filters shall be checked for sterility with non-
selective growth media.
b)	Positive Controls
Positive culture controls demonstrate that the medium can support the growth of the target
organism(s), and that the medium produces the specified or expected reaction to the target
organism(s).
1) Each pre-prepared, ready-to-use lot of medium (including chromofluorogenic reagent) and
each batch of medium prepared in the laboratory shall be tested with at least one pure culture
of a known positive reaction. This shall be done prior to first use of the medium.
c)	Negative Controls
Negative culture controls demonstrate that the medium does not support the growth of non-target
organisms or does not demonstrate the typical positive reaction of the target organism(s).
Each pre-prepared, ready-to-use lot of selective medium (including chromofluorogenic reagent)
and each batch of selective medium prepared in the laboratory shall be analyzed with one or
more known negative culture controls, i.e. non-target organisms, as appropriate to the method.
This shall be done prior to first use of the medium.
D.3.2 Test Variability/Reproducibility
For test methods that specify colony counts such as membrane filter or plated media, duplicate counts
shall be performed monthly on one positive sample, for each month that the test is performed. If the
lab has two or more analysts, each analyst shall count typical colonies on the same plate. Counts
must be within 10% difference to be acceptable. In a laboratory with only one microbiology analyst,
the same plate shall be counted twice by the analyst, with no more than 5% difference between the
counts.
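Taking "percent difference" as the absolute difference relative to the mean of the two counts (an assumption, not a definition from this standard), the duplicate-count checks described above could be carried out as in the following illustrative Python sketch.

    def percent_difference(count_1, count_2):
        """Percent difference between duplicate colony counts, taken here as the
        absolute difference relative to the mean of the two counts."""
        return 100.0 * abs(count_1 - count_2) / ((count_1 + count_2) / 2.0)

    # Hypothetical counts of the same membrane-filter plate by two analysts:
    d = percent_difference(57, 61)                       # about 6.8%
    print(f"two-analyst difference = {d:.1f}%, acceptable (<= 10%): {d <= 10.0}")

    # Hypothetical recount of the same plate by a single analyst:
    d = percent_difference(57, 59)                       # about 3.4%
    print(f"recount difference = {d:.1f}%, acceptable (<= 5%): {d <= 5.0}")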
D.3.3 Method Evaluation
a)	Laboratories are required to demonstrate proficiency with the test method prior to first use. This
shall be achieved by comparison to a method already approved for use in the laboratory, or by
analyzing a minimum of ten spiked samples whose matrix is representative of those normally
submitted to the laboratory, or by analyzing and passing one proficiency test series provided by
an approved proficiency sample provider. The laboratory shall maintain this documentation as
long as the method is in use and for at least 5 years past the date of last use.
b)	Laboratories shall participate in the Proficiency Test programs identified by NELAP (5.4.2.j or
5.5.3.4). The results of these analyses shall be used to evaluate the ability of the laboratory to
produce acceptable data.

D.3.4 Test Performance
a)	All growth and recovery media must be checked to assure that the target organism(s) respond
in an acceptable and predictable manner (see D.3.1.b).
b)	To ensure that analysis results are accurate, target organism identity shall be verified as specified
in the method, e.g. by use of the completed test, or by use of secondary verification tests such
as a catalase test.
D.3.5 Data Reduction
The calculations, data reduction and statistical interpretations specified by each test method shall be
followed.
D.3.6 Quality of Standards, Reagents and Media
The laboratory shall ensure that the quality of the reagents and media used is appropriate for the test
concerned.
a)	Culture media may be prepared from commercial dehydrated powders or may be purchased
ready-to-use. Preparation from different chemical ingredients shall not be done unless the media
is not available commercially or unless specified by the method.
b)	Reagents, commercial dehydrated powders and media shall be used within the shelf-life of the
product and shall be documented according to 5.10.5.
c)	Distilled water, deionized water or reverse-osmosis produced water free from bactericidal and
inhibitory substances shall be used in the preparation of media, solutions and buffers. The quality
of the water shall be monitored for chlorine residual, specific conductance, and heterotrophic
bacteria plate count monthly (when in use), when maintenance is performed on the water
treatment system, or at startup after a period of disuse longer than one month. Analysis for
metals and the Bacteriological Water Quality Test (to determine presence of toxic agents or
growth promoting substances) shall be performed annually. Results of these analyses shall meet
the specifications of the required method and records of analyses shall be maintained for five
years. (An exception to performing the Bacteriological Water Quality Test shall be given to
laboratories that can supply documentation to show that their water source meets the criteria, as
specified by the method, for Type I or Type II reagent water.)
d)	Media, solutions and reagents shall be prepared, used and stored according to a documented
procedure following the manufacturer's instructions or the test method. Documentation for media
prepared in the laboratory shall include date of preparation, preparer's initials, type and amount
of media prepared, manufacturer and lot number, final pH of the media, and expiration date.
Documentation for media purchased pre-prepared, ready-to-use shall include manufacturer, lot
number, type and amount of media received, date of receipt, expiration date of the media, and
pH of the media.
D.3.7 Selectivity
a) In order to ensure identity and traceability, reference cultures used for positive and negative
controls shall be obtained from a recognized national collection, organization, or manufacturer
recognized by the NELAP Accrediting Authority. Microorganisms may be single use preparations
or cultures maintained by documented procedures that demonstrate the continued purity and
viability of the organism.

1)	Reference cultures may be revived (if freeze-dried) or transferred from slants and subcultured
once to provide reference stocks. The reference stocks shall be preserved by a technique
which maintains the characteristics of the strains. Reference stocks shall be used to prepare
working stocks for routine work. If reference stocks have been thawed, they must not be re-
frozen and re-used.
2)	Working stocks shall not be sequentially cultured more than five times and shall not be
subcultured to replace reference stocks.
D.3.8 Constant and Consistent Test Conditions
a)	Laboratory Facilities
Floors and work surfaces shall be non-absorbent and easy to clean and disinfect. Work surfaces
shall be adequately sealed. Laboratories shall provide sufficient storage space, and shall be
clean and free from dust accumulation. Plants, food, and drink shall be prohibited from the
laboratory work area.
b)	Laboratory Equipment
1)	Temperature Measuring Devices
Temperature measuring devices such as liquid-in-glass thermometers, thermocouples, and
platinum resistance thermometers used in incubators, autoclaves and other equipment shall
be the appropriate quality to meet specification(s) in the test method. The graduation of the
temperature measuring devices must be appropriate for the required accuracy of
measurement and they shall be calibrated to national or international standards for
temperature (see 5.9.2). Calibration shall be done at least annually.
2)	Autoclaves
i)	The performance of each autoclave shall be initially evaluated by establishing its
functional properties and performance, for example heat distribution characteristics with
respect to typical uses. Autoclaves shall meet specified temperature tolerances.
Pressure cookers shall not be used for sterilization of growth media.
ii)	Demonstration of sterilization temperature shall be provided by use of continuous
temperature recording device or by use of a maximum registering thermometer with
every cycle. Appropriate biological indicators shall be used once per month to determine
effective sterilization. Temperature sensitive tape shall be used with the contents of each
autoclave run to indicate that the autoclave contents have been processed.
iii)	Records of autoclave operations shall be maintained for every cycle. Records shall
include: date, contents, maximum temperature reached, pressure, time in sterilization
mode, total run time (may be recorded as time in and time out) and analyst's initials.
iv)	Autoclave maintenance, either internally or by service contract, shall be performed
annually and shall include a pressure check and calibration of temperature device.
Records of the maintenance shall be maintained in equipment logs.
v)	The autoclave mechanical timing device shall be checked quarterly against a stopwatch
and the actual time elapsed documented.

3)	Volumetric Equipment
Volumetric equipment shall be calibrated as follows:
i)	equipment with movable parts such as automatic dispensers, dispensers/diluters, and
mechanical hand pipettes shall be calibrated quarterly.
ii)	equipment such as filter funnels, bottles, non-class A glassware, and other marked
containers shall be calibrated once per lot prior to first use.
iii)	the volume of the disposable volumetric equipment such as sample bottles, disposable
pipettes, and micropipette tips shall be checked once per lot.
4)	UV Instruments
UV instruments, used for sanitization, shall be tested quarterly for effectiveness with an
appropriate UV light meter or by plate count agar spread plates. Replace bulbs if output is less
than 70% of original for light tests or if count reduction is less than 99% for a plate containing 200
to 300 organisms.
5)	Conductivity meters, oxygen meters, pH meters, hygrometers, and other similar
measurement instruments shall be calibrated according to the method specified requirements
(see Section 5.9.4).
6)	Incubators, Water Baths, Ovens
i)	The stability and uniformity of temperature distribution and time required after test sample
addition to re-establish equilibrium conditions in incubators and water baths shall be
established. Temperature of incubators and water baths shall be documented twice daily,
at least four hours apart, on each day of use.
ii)	Ovens used for sterilization shall be checked for sterilization effectiveness monthly with
appropriate biological indicators. Records shall be maintained for each cycle that include
date, cycle time, temperature, contents and analyst's initials.
7)	Labware (Glassware and Plasticware)
i)	The laboratory shall have a documented procedure for washing labware, if applicable.
Detergents designed for laboratory use must be used.
ii)	Glassware shall be made of borosilicate or other non-corrosive material, free of chips and
cracks, and shall have readable measurement marks.
iii)	Labware that is washed and reused shall be tested for possible presence of residues
which may inhibit or promote growth of microorganisms by performing the Inhibitory
Residue Test annually, and each time the lab changes the lot of detergent or washing
procedures.
iv)	Washed labware shall be tested at least once daily, each day of washing, for possible
acid or alkaline residue by testing at least one piece of labware with a suitable pH
indicator such as bromothymol blue. Records of tests shall be maintained.

D.4 RADIOCHEMICAL TESTING
These standards apply to laboratories undertaking the examination of environmental samples by
radiochemical analysis. These procedures for radiochemical analysis may involve some form of
chemical separation followed by detection of the radioactive decay of analyte (or indicative daughters)
and tracer isotopes where used. For the purpose of these standards, procedures for the determination
of radioactive isotopes by mass spectrometry (e.g. ICP-MS or TIMS) or optical (e.g. KPA) techniques
are not addressed herein.
D.4.1 Negative and Positive Controls
a)	Negative Controls
1)	Method Blank - Shall be performed at a frequency of one per preparation batch. The results
of this analysis shall be one of the quality control measures to be used to assess the batch.
The method blank result shall be assessed against the specific acceptance criteria [see
5.10.1.2.b)18] specified in the laboratory method manual [see 5.10.1.2]. When the specified
method blank acceptance criteria is not met the specified corrective action and contingencies
[see 5.10.1.2.b)19 and 20] shall be followed and results reported with appropriate data
qualifying codes. The occurrence of a failed method blank acceptance criteria and the
actions taken shall be noted in the laboratory report [see 5.13.a)10].
2)	In the case of gamma spectrometry where the sample matrix is simply aliquoted into a
calibrated counting geometry the method blank shall be of similar counting geometry that is
empty or filled to similar volume with ASTM Type II water to partially simulate gamma
attenuation due to a sample matrix.
3)	There shall be no subtraction of the required method blank [see D.4.1.a)1] result from the
sample results in the associated preparation or analytical batch unless permitted by method
or program. This does not preclude the application of any correction factor (e.g. instrument
background, analyte presence in tracer, reagent impurities, peak overlap, calibration blank,
etc.) to all analyzed samples, both program/project submitted and internal quality control
samples. However, these correction factors shall not depend on the required method blank
result in the associated analytical batch.
4)	The method blank sample shall be prepared with similar aliquot size to that of the routine
samples for analysis, and the method blank result and acceptance criteria [5.10.1.2.b)18] shall
be calculated in a manner that compensates for sample results based upon differing aliquot
size.
b)	Positive Controls
1)	Laboratory Control Samples - Shall be performed at a frequency of one per preparation
batch. The results of this analysis shall be one of the quality control measures to be used to
assess the batch. The laboratory control sample result shall be assessed against the specific
acceptance criteria [see 5.10.1.2.b)18] specified in the laboratory method manual [see
5.10.1.2]. When the specified laboratory control sample acceptance criteria is not met the
specified corrective action and contingencies [see 5.10.1.2.b)19 and 20] shall be followed.
The occurrence of a failed laboratory control sample acceptance criteria and the actions
taken shall be noted in the laboratory report [see 5.13.a)10].
2)	Matrix Spike - Shall be performed at a frequency of one per preparation batch for those
methods which do not utilize an internal standard or carrier, for which there is a chemical

separation process, and where there is sufficient sample to do so. The exceptions are gross
alpha, gross beta and tritium which shall require matrix spikes for aqueous samples. The
results of this analysis shall be one of the quality control measures to be used to assess the
batch. The matrix spike result shall be assessed against the specific acceptance criteria
[see 5.10.1.2.b)18] specified in the laboratory method manual [see 5.10.1.2]. When the
specified matrix spike acceptance criteria is not met, the specified corrective action and
contingencies [see 5.10.1.2.b)19 and 20] shall be followed. The occurrence of a failed matrix
spike acceptance criteria and the actions taken shall be noted in the laboratory report [see
5.13.a)10]. The lack of sufficient sample aliquot size to perform a matrix spike shall be noted
in the laboratory report.
3)	The activity of the laboratory control sample shall: (1) be two to ten times the detection limit
or (2) be at a level comparable to that of routine samples if the sample activities are expected
to exceed 10 times the detection limit.
4)	The activity of the matrix spike analyte(s) shall be greater than ten times the detection limit.
5)	The laboratory standards used to prepare the laboratory control sample and matrix spike
shall be from a source independent of the laboratory standards used for instrument
calibration.
6)	The matrix spike shall be prepared by adding a known activity of target analyte. Where a
radiochemical method, other than gamma spectroscopy, has more than one reportable
analyte isotope (e.g. plutonium, Pu 238 and Pu 239, using alpha spectrometry), only one of
the analyte isotopes need be included in the laboratory control or matrix spike sample at the
indicated activity level. However, where more than one analyte isotope is present above the
specified detection limit each shall be assessed against the specified acceptance criteria.
7)	Where gamma spectrometry is used to identify and quantitate more than one analyte isotope
the laboratory control sample and matrix spike shall contain isotopes that represent the low
(e.g. americium-241), medium (e.g. cesium-137) and high (e.g. cobalt-60) energy range of
the analyzed gamma spectra. As indicated by these examples the isotopes need not exactly
bracket the calibrated energy range or the range over which isotopes are identified and
quantitated.
8)	The laboratory control sample shall be prepared with similar aliquot size to that of the routine
samples for analyses.
c)	Other Controls
1)	Tracer - For those methods that utilize a tracer (i.e. internal standard) each sample result
shall have an associated tracer recovery calculated and reported. The tracer recovery for
each sample result shall be one of the quality control measures to be used to assess the
associated sample result acceptance. The tracer recovery shall be assessed against the
specific acceptance criteria [see 5.10.1.2.b)18] specified in the laboratory method manual
[see 5.10.1.2]. When the specified tracer recovery acceptance criteria is not met the
specified corrective action and contingencies [see 5.10.1.2.b)19 and 20] shall be followed.
The occurrence of a failed tracer recovery acceptance criteria and the actions taken shall be
noted in the laboratory report [see 5.13.a)10].
2)	Carrier - For those methods that utilize a carrier, each sample shall have an associated
carrier recovery calculated and reported. The carrier recovery for each sample shall be one
of the quality control measures to be used to assess the associated sample result

acceptance. The carrier recovery shall be assessed against the specific acceptance criteria
[see 5.10.1.2.b)18] specified in the laboratory method manual [see 5.10.1.2]. When the
specified carrier recovery acceptance criteria is not met the specified corrective action and
contingencies [see 5.10.1.2.b)19 and 20] shall be followed. The occurrence of a failed carrier
recovery acceptance criteria and the actions taken shall be noted in the laboratory report [see
5.13.a)11].
D.4.2 Analytical Variability/Reproducibility
a)	Replicate - Shall be performed at a frequency of one per preparation batch where there is
sufficient sample to do so. The results of this analysis shall be one of the quality control
measures to be used to assess batch acceptance. The replicate result shall be assessed against
the specific acceptance criteria [see 5.10.1.2.b)18] specified in the laboratory method manual
[see 5.10.1.2]. When the specified replicate acceptance criteria is not met the specified corrective
action and contingencies [see 5.10.1.2.b)19 and 20] shall be followed. The corrective action
shall consider the fact that sample inhomogeneity may be a cause of the failed replicate
acceptance criteria. The occurrence of a failed replicate acceptance criteria and the actions taken
shall be noted in the laboratory report [see 5.13.a)10].
b)	For low level samples (less than approximately three times the detection limit) the laboratory may
analyze duplicate laboratory control samples or a replicate matrix spike (matrix spike and a matrix
spike duplicate) to determine reproducibility within a preparation batch.
D.4.3 Method Evaluation
In order to ensure the accuracy of the reported result, the following procedures shall be in place:
a)	Initial Demonstration of Capability - (section 5.10.2.1 and Appendix C) shall be performed initially
(prior to the analysis of any samples) and with a significant change in instrument type, personnel
or method.
b)	Proficiency Test Samples - The results of such analysis (5.4.2.j and 5.5.3.4) shall be used by the
laboratory to evaluate the ability of the laboratory to produce accurate data.
D.4.4 Radiation Measurement System Calibration
Because of the stability and response nature of modern radiation measurement instrumentation, it is
not typically necessary to verify the calibration of these systems each day of use. This section addresses
those practices that are necessary for proper calibration and those requirements of section 5.9.4.2
(Instrument Calibrations) that are not applicable to some types of radiation measurement
instrumentation.
a) Initial Instrument Calibration
1) Given that activity detection efficiency is independent of sample activity at all but extreme
activity levels, the requirements of subsections f, h and i of 5.9.4.2.1 are not applicable to
radiochemical method calibrations, except mass attenuation in gas-proportional counting and
sample quench in liquid scintillation counting. Radiochemistry analytical instruments are
subject to calibration when purchased, when the instrument is serviced, when the instrument
is moved and when the instrument setting(s) have been changed.

2)	Instrument calibration shall be performed with reference standards as defined in section
D.4.7a. The standards shall have the same general characteristics (i.e., geometry,
homogeneity, density, etc.) as the associated samples.
3)	The frequency of calibration shall be addressed in the laboratory method manual [see
5.10.1.2.b)13] if not addressed in the method. A specific frequency (e.g. monthly), or
observations from the associated control or tolerance chart, shall be specified as the basis for
calibration.
b)	Continuing Instrument Calibration Verification
Calibration verification checks shall be performed using appropriate check sources and monitored
with control charts or tolerance charts to ensure that the instrument is operating properly and that
the calibration has not changed. The same check source used in the preparation of the tolerance
chart or control chart at the time of calibration shall be used in the calibration verification of the
instrument. The check sources must provide adequate counting statistics for a relatively short
count time and the source should be sealed or encapsulated to prevent loss of activity and
contamination of the instrument and laboratory personnel. For alpha and gamma spectroscopy
systems, the instrument calibration verification shall include checks on the counting efficiency and
the relationship between channel number and alpha or gamma ray energy.
1)	For gamma spectroscopy systems, the calibration verification checks for efficiency and
energy calibration shall be performed on a day of use basis along with performance checks
on peak resolution.
2)	For alpha spectroscopy systems, the calibration verification check for energy calibration shall
be performed on a weekly basis and the performance check for counting efficiency shall be
performed on at least a monthly basis.
3)	For gas-proportional and liquid scintillation counters, the calibration verification check for
counting efficiency shall be performed on a day of use basis. Verification of instrument
calibration does not directly verify secondary calibrations, e.g., the mass efficiency curve or
the quench curve.
4)	For scintillation counters the calibration verification for counting efficiency shall be
performed on a day of use basis.
c)	Background Measurement
Background measurements shall be made on a regular basis and monitored using control charts
or tolerance charts to ensure that a laboratory maintains its capability to meet required data
quality objectives. These values are subtracted from the total measured activity in the
determination of the sample activity.
1)	For gamma spectroscopy systems, background measurements shall be performed on at least
a monthly basis.
2)	For alpha spectroscopy systems, background measurements shall be performed on at least
a monthly basis.
3)	For gas-proportional counters background measurements shall be performed on a weekly
basis.

4) For scintillation counters, background measurements shall be performed each day of use.
D.4.5 Detection Limits
a)	Must be determined prior to sample analysis and must be redetermined each time there is a
significant change in the test method or instrument type.
b)	The procedures employed must be documented and consistent with mandated method or
regulation.
D.4.6 Data Reduction
a)	Refer to Section 5.10.6, "Computers and Electronic Data Related Requirements," of this
document.
b)	Measurement Uncertainties - Each result shall be reported with the associated measurement
uncertainty. The procedures for determining the measurement uncertainty must be documented
and be consistent with mandated method and regulation.
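As an illustration only, one common component of the reported measurement uncertainty is the Poisson counting uncertainty of the net count rate; the simplified Python sketch below propagates only that component (efficiency, yield and aliquot uncertainties are ignored), and all input values are hypothetical.

    import math

    def activity_with_counting_uncertainty(gross_counts, t_sample,
                                           bkg_counts, t_bkg,
                                           efficiency, chem_yield, aliquot):
        """Net activity concentration and its one-sigma counting uncertainty,
        from Poisson statistics on the gross and background counts only."""
        net_rate = gross_counts / t_sample - bkg_counts / t_bkg
        u_rate = math.sqrt(gross_counts / t_sample**2 + bkg_counts / t_bkg**2)
        denom = efficiency * chem_yield * aliquot
        return net_rate / denom, u_rate / denom

    # Hypothetical alpha-spectrometry measurement: counts, count times (s),
    # detector efficiency, chemical yield, aliquot size (L).
    activity, u = activity_with_counting_uncertainty(250, 60000, 12, 60000,
                                                     0.28, 0.85, 0.5)
    print(f"activity = {activity:.4f} +/- {u:.4f} Bq/L (counting uncertainty only)")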
D.4.7 Quality of Standards and Reagents
a)	The quality control program shall establish and maintain provisions for radionuclide standards.
1)	Reference standards that are used in a radiochemical laboratory shall be obtained from the
National Institute of Standards and Technology (NIST), EPA, or suppliers who participate in
supplying NIST standards or NIST traceable radionuclides. Any reference standards
purchased outside the United States shall be traceable back to each country's national
standards laboratory. Commercial suppliers of reference standards shall conform to ANSI
N42.22 to assure the quality of their products.
2)	Reference standards shall be accompanied with a certificate of calibration whose content is
as described in ANSI N42.22 - 1995, Section 8, Certificates.
3)	Laboratories should consult with the supplier if the lab's verification of the activity of the
traceable reference standard indicates a noticeable deviation from the certified value. The
laboratory shall not use a value other than the decay corrected certified value (see the
illustrative decay-correction sketch at the end of this section).
b)	All reagents used shall be analytical reagent grade or better.
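The decay correction referred to in a)(3) above is the ordinary exponential decay law. The sketch below is illustrative only; the radionuclide, half-life and dates are hypothetical, not specified by this standard.

    import math

    def decay_corrected_activity_bq(certified_bq, half_life_days, elapsed_days):
        # A(t) = A0 * exp(-ln(2) * t / T_half), decaying the certified activity
        # from the certificate reference date to the date of use.
        return certified_bq * math.exp(-math.log(2) * elapsed_days / half_life_days)

    # Hypothetical Cs-137 reference standard (half-life about 30.17 years),
    # certified at 500 Bq and used two years after the reference date.
    print(round(decay_corrected_activity_bq(500.0, 30.17 * 365.25, 2 * 365.25), 1))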
D.4.8 Constant and Consistent Test Conditions
a)	To prevent incorrect analysis results caused by the spread of contamination among samples, the
laboratory shall establish and adhere to written procedures to minimize the possibility of cross-
contamination between samples.
b)	For gamma spectrometry systems, background check measurements shall be performed each
day of use.
c)	For alpha spectrometry systems, background check measurements shall be performed except
when using the electro-plating method of sample preparation.
d)	For gas-proportional counter systems, background check measurements shall be performed each
day of use.

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 5D-23 of 25
D.5 AIR TESTING
These standards shall apply to samples that are submitted to a laboratory for the purpose of analysis.
They do not apply to field activities such as source air emission measurements or the use of
continuous analysis devices.
D.5.1 Negative and Positive Controls
a)	Negative Controls
1)	Method Blanks - Shall be performed at a frequency of at least one (1) per batch of twenty
(20) environmental samples or less per sample preparation method. The results of the
method blank analysis shall be used to evaluate the contribution of the laboratory provided
sampling media and analytical sample preparation procedures to the amount of analyte found
in each sample. If the method blank result is greater than the detection limit and contributes
greater than 10% of the total amount of analyte found in the sample (see the illustrative sketch
at the end of this section), the source of the contamination must be investigated and measures
taken to eliminate it. If contamination is found, the data shall be qualified in the report.
2)	Collection Efficiency - Sampling trains consisting of multiple sections (e.g., filters, sorbent
tubes, impingers) that are received intact by the laboratory shall be separated into "front" and
"back" sections if required by the client. Each section shall be processed and analyzed
separately and the analytical results reported separately.
b)	Positive Controls
1) Laboratory Control Sample (LCS) - Shall be analyzed at a rate of at least one (1) per batch
of twenty (20) or fewer samples per sample preparation method for each analyte. If a spiking
solution is not available, a calibration solution, whose concentration approximates that of the
samples, shall be included in each batch and with each lot of media. If a calibration solution
must be used for the LCS, the client will be notified prior to the start of analysis. The
concentration of the LCS shall be relevant to the intended use of the data and either at a
regulatory limit or below it.
c)	Surrogates - Shall be used as required by the test method or if requested by the client.
d)	Matrix spike - Shall be used as required by the test method, or if requested by the client.
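The following non-mandatory sketch shows one way the a)(1) method blank screening rule could be applied to a single analyte; the detection limit, results and pass/fail handling are assumptions for illustration only.

    def method_blank_acceptable(blank_result, detection_limit, sample_result):
        # The blank passes unless it exceeds the detection limit AND contributes
        # more than 10% of the total amount of analyte found in the sample.
        if blank_result <= detection_limit:
            return True
        return blank_result <= 0.10 * sample_result

    # Hypothetical results for one analyte (same units throughout).
    print(method_blank_acceptable(0.02, 0.05, 1.4))   # True: blank below detection limit
    print(method_blank_acceptable(0.30, 0.05, 1.4))   # False: investigate and qualify data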
D.5.2 Analytical Variability/Reproducibility
Matrix Spike Duplicates (MSDs) or Laboratory Duplicates - Shall be analyzed at a minimum of 1 in
20 samples per sample batch. The laboratory shall document its procedure for selecting the
appropriate types of spikes and duplicates. The selected sample(s) shall be rotated among client
samples so that various matrix problems may be noted and/or addressed. Poor performance in the
spikes and duplicates may indicate a problem with the sample composition and shall be reported to
the client.
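The standard does not state how duplicate agreement is to be expressed; the relative percent difference (RPD) is a common choice and is sketched here only as an assumed illustration.

    def relative_percent_difference(result_1, result_2):
        # RPD between a result and its duplicate, as a percentage of their mean.
        mean = (result_1 + result_2) / 2
        return abs(result_1 - result_2) / mean * 100 if mean else 0.0

    # Hypothetical duplicate pair, e.g. 42 and 46 ug/m3 -> RPD of about 9.1%.
    print(round(relative_percent_difference(42, 46), 1))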
D.5.3 Method Evaluation
In order to ensure the accuracy of the reported result, the following procedures shall be in place:

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 5D-24 of 25
a)	Demonstration of Capability - (Sections 5.6.2 and 5.10.2.1) shall be performed prior to the
analysis of any samples and with a significant change in instrument type, personnel, matrix, or
test method.
b)	Calibration - Calibration protocols specified in Section 5.9.4 shall be followed.
c)	Proficiency Test Samples - The results of such analyses (5.4.2.j or 5.5.3.4) shall be used by the
laboratory to evaluate the ability of the laboratory to produce accurate data.
D.5.4 Detection Limits
The laboratory shall utilize a test method that provides a detection limit that is appropriate and
relevant for the intended use of the data. Detection limits shall be determined by the protocol in the
mandated test method or applicable regulation, e.g., MDL. If the protocol for determining detection
limits is not specified, the selection of the procedure must reflect instrument limitations and the
intended application of the test method.
a)	A detection limit study is not required for any component for which spiking solutions are not
available such as temperature or on-line analyses.
b)	The detection limit shall be initially determined for the compounds of interest in each test method
in a matrix in which there are neither target analytes nor interferences at a concentration that would
impact the results, or the detection limit must be determined in the matrix of interest (see definition
of matrix).
c)	Detection limits must be determined each time there is a significant change in the test method
or instrument type.
d)	All sample processing steps of the analytical method must be included in the determination of the
detection limit.
e)	All procedures used must be documented. Documentation must include the matrix type. All
supporting data must be retained.
1) The laboratory must have established procedures to tie detection limits with quantitation limits.
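Where the mandated protocol is the MDL procedure of 40 CFR Part 136, Appendix B, the detection limit is commonly computed as the standard deviation of replicate low-level spikes multiplied by the one-sided 99% Student-t value; the sketch below assumes seven replicates and is illustrative only, not a NELAC requirement.

    import statistics

    # One-sided 99% Student-t values indexed by number of replicates (df = n - 1).
    T99 = {7: 3.143, 8: 2.998, 9: 2.896, 10: 2.821}

    def method_detection_limit(replicate_results):
        # MDL = s * t(n-1, 0.99), with every replicate carried through all
        # sample processing steps of the analytical method.
        return statistics.stdev(replicate_results) * T99[len(replicate_results)]

    # Hypothetical replicate spike results in ug/L -> MDL of about 0.54 ug/L.
    print(round(method_detection_limit([1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 1.1]), 2))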
D.5.5 Data Reduction
The procedures for data reduction, such as use of linear regression, shall be documented.
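Where linear regression is the documented data-reduction procedure, the documented steps might resemble the following sketch, which fits a least-squares calibration line and back-calculates a sample concentration; the calibration points and sample response are hypothetical.

    import numpy as np

    # Hypothetical calibration standards: concentration (ug/m3) versus response.
    conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
    resp = np.array([0.02, 0.98, 2.05, 4.95, 10.10])

    slope, intercept = np.polyfit(conc, resp, 1)    # first-order least squares fit
    sample_conc = (3.40 - intercept) / slope        # back-calculate a sample response
    print(f"slope={slope:.4f} intercept={intercept:.4f} sample={sample_conc:.2f} ug/m3")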
D.5.6 Quality of Standards and Reagents
a)	The source of standards shall comply with 5.9.2.
b)	The purity of each analyte standard and each reagent shall be documented by the laboratory
through certificates of analyses from the manufacturer/vendor, manufacturer/vendor
specifications, and/or independent analysis.
c)	In methods where the purity of reagents is not specified, analytical reagent grade or higher
quality, if available, shall be used.

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 5D-25 of 25
D.5.7 Selectivity
The laboratory shall develop and document acceptance criteria for test method selectivity such as
absolute and relative retention times, wavelength assignments, mass spectral library quality of match,
and mass spectral tuning.
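One way such an acceptance criterion might be documented for retention times is an absolute window around the expected value, as in this non-mandatory sketch; the window width is an assumption, not a NELAC requirement.

    def retention_time_ok(observed_rt_min, expected_rt_min, window_min=0.05):
        # Accept the identification only if the observed retention time falls
        # within the documented absolute window around the expected value.
        return abs(observed_rt_min - expected_rt_min) <= window_min

    print(retention_time_ok(12.31, 12.28))   # True: within 0.05 min
    print(retention_time_ok(12.50, 12.28))   # False: outside the window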
D.5.8 Constant and Consistent Test Conditions
a)	The laboratory shall assure that the test instruments consistently operate within the
specifications required of the application for which the equipment is used.
b)	The laboratory shall document that all sampling equipment, containers and media used or
supplied by the laboratory meet required test method criteria.
c)	If supplied or used by the laboratory, procedures for field equipment decontamination shall be
developed and their use documented.
d)	The laboratory shall have a documented program for the calibration and verification of sampling
equipment such as pumps, meter boxes, critical orifices, flow measurement devices and
continuous analyzers, if such equipment is used or supplied by the laboratory.

-------
QUALITY SYSTEMS
APPENDIX E
ADDITIONAL SOURCES OF
INFORMATION AND ASSISTANCE
-Non-Mandatory Appendix-

-------
NELAC
Quality Systems
Revision 15
May 25, 2001
Page 5E-1 of 1
Appendix E - ADDITIONAL SOURCES OF INFORMATION
-Non-Mandatory Appendix-
Additional sources of information are available to assist laboratories in the design and implementation
of a quality system. These materials may be found on the NELAC web page at
www.epa.gov/ttn/nelac under the topic "Related Information."

-------
ACCREDITING AUTHORITY
Approved May 25, 2001
Effective July 1, 2003 unless otherwise noted

National Environmental Laboratory Accreditation Conference
-------
Note that the NELAC standards now have two significant dates: 1) the
date the standards were approved at the annual meeting, and 2) the date
the standards are effective and must be implemented. This is especially
important as some portions of the standards have different effective
dates. The approval date is part of the document control header on each
page. The cover of each chapter shows both the approval date and the
effective date. Changes approved for implementation at a time other than
the effective date (on the chapter cover) are noted in the chapter,
showing the approved text and its effective date.

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page i of ii
TABLE OF CONTENTS
6.0	ACCREDITING AUTHORITY	 1
6.1	INTRODUCTION 	 1
6.2	GENERAL PROVISIONS	 1
6.2.1	Recognition 	 2
6.2.2	Where to Apply for NELAP Accreditation 	 4
6.2.3	Documentation Maintained by Accrediting Authorities	 5
6.3	APPLICATION FOR NELAP RECOGNITION 	 5
6.3.1	Written Application for NELAP Recognition	 6
6.3.2	Application Completeness Review by NELAP	 8
6.3.3	Application Technical Review by a NELAP Evaluation Team 	 8
6.3.3.1	Required Technical Elements of a NELAP-Recognized Accrediting Authority's
Program 	 9
6.3.3.1.1	Records	 11
6.3.3.1.2	Use of Contractors by an Accrediting Authority	 11
6.3.3.1.3	Accrediting Authority's Quality System 	 12
6.3.3.1.4	Mutual Assistance Agreements	 12
6.3.3.2	Application Technical Review Report 	 12
6.3.4	Notification of Changes to An Accrediting Authority's Program	 14
6.4	ON-SITE EVALUATION OF THE ACCREDITING AUTHORITY	 14
6.4.1	Scheduling the On-site Evaluations	 15
6.4.2	Conducting the On-site Evaluation 	 15
6.4.3	On-site Evaluation Reports 	 16
6.5	ACCREDITING AUTHORITY'S REQUEST FOR EXTENSION OF TIME TO COMPLY WITH
THE NELAC STANDARDS 	 18
6.6	NELAP EVALUATION TEAM RECOMMENDATIONS TO THE NELAP DIRECTOR	 19
6.7	CERTIFICATE OF RECOGNITION TO THE ACCREDITING AUTHORITY	 20
6.8	USE OF ACCREDITATION BY NELAP ACCREDITED LABORATORIES 	 20
6.9	REQUIREMENTS OF THE NELAP 	 21
6.9.1 NELAP Evaluation Team	 22
6.10	APPEALING FINDINGS BASED UPON DIFFERENCES IN STANDARDS
INTERPRETATIONS	 22
6.11	APPEALING DECISIONS TO DENY OR REVOKE NELAP RECOGNITION	 23
Appendix A-QUESTIONS OF UNIFORMITY PROCEDURE	 A-1
A.1 PURPOSE 	 A-1

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page ii of ii
A.2 PROCEDURE FOR INITIATION OF RESOLUTION BY AFFECTED PARTIES	A-1
A.2.1 Initial Decision/Interpretation Procedure	A-1
A.2.2 Decision/Interpretation Procedure When Affected Parties Cannot Reach an Agreement
	A-1
A.3 APPEAL PROCEDURE	A-1
A.4 POSTING OF DECISION 	A-2

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 1 of 25
6.0	ACCREDITING AUTHORITY
6.1	INTRODUCTION
The standards in this chapter define the process and criteria that will be used by the National
Environmental Laboratory Accreditation Program (NELAP) to determine whether accrediting
authorities applying for NELAP recognition meet the standards required for such recognition.
Chapter 6 is structured so that the requirements of the International Organization for
Standardization/the International Electrotechnical Commission (ISO/IEC) Guide 58: Calibration and
testing laboratory accreditation systems-General requirements for operation and recognition, 1993
are incorporated into the requirements for an accrediting authority to be NELAP-recognized.
Chapter 6 addresses most of the requirements of ISO/IEC Guide 58. All NELAP-recognized
accrediting authorities are required to administer an environmental laboratory accreditation program
that meets the requirements contained in the National Environmental Laboratory Accreditation
Conference (NELAC) standards, Chapter 6. Those ISO/IEC Guide 58 requirements not addressed
in Chapter 6 are addressed in the NELAC standards, Chapters 2 through 5. Since Chapter 6
requires an accrediting authority to administer an environmental laboratory accreditation program that
requires laboratories to meet the standards set forth in the NELAC standards, Chapters 2 through 6,
all the requirements of ISO/IEC Guide 58 will be met by a NELAP-recognized accrediting authority.
In most cases, the ISO/IEC requirements, contained in Chapter 6 or elsewhere in the NELAC
standards are not direct quotations from the ISO/IEC guidance document.
6.2	GENERAL PROVISIONS
a)	In all cases, accrediting authorities are governmental organizations at the territory, state or federal
levels.
b)	A territorial, state or federal entity shall designate the appropriate agencies or departments as its
designated NELAP-recognized accrediting authorities for the fields of accreditation for which
NELAP recognition is being sought.
c)	A NELAP-recognized accrediting authority shall not delegate authority for granting, maintaining,
suspending or revoking a laboratory's NELAP accreditation to an outside person or body.
Portions of the accreditation process may be contracted out when the accrediting authority
follows the provisions of subsections 6.3.3.1.2 and 6.3.3.1.3 (b)(3); however, the authority to
grant, maintain, suspend or revoke NELAP accreditation must remain with the accrediting
authority.
d)	The procedures under which a NELAP-recognized accrediting authority operates shall be
administered in an impartial and non-discriminatory manner. The accrediting authority also shall
require accredited laboratories to maintain impartiality and integrity. An accrediting authority shall
have no rules, regulations, procedures or practices that:
1)	restrict the size, large or small, of any laboratory seeking accreditation;
2)	require membership or participation in any laboratory or other professional association;
3)	impose any financial conditions or restrictions for participation in the accreditation program
other than the fees authorized by territorial, state or federal law; and

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 2 of 25
4) conflict with any territorial, state or federal laws governing discrimination.
e)	Accrediting authorities and their contractors shall confine their requirements, evaluations and
decision making processes for a NELAP accredited laboratory to those matters specifically
related to the fields of accreditation of the NELAP accreditation being sought by a laboratory.
f)	If the NELAP insignia is used on general literature such as brochures, letterheads and business
cards, a NELAP-recognized accrediting authority shall accompany the display of the NELAP
insignia with at least the phrase "NELAP-recognized."
g)	Accrediting authorities, within the scope and applicability of their prevailing rules and regulations,
shall establish one or more technical committees for assistance in interpretation of requirements
and for advising the accrediting authority on the technical matters relating to the operation of its
environmental laboratory accreditation program. When such committees are established, the
accrediting authority shall have
1)	formal rules and structures for the appointment and operation of committees involved in the
accreditation process and such committees shall be free from any commercial, financial, and
other pressures that might influence decisions, or
2)	a structure where committee members are chosen to provide relevant competent technical
support and impartiality through a balance of interests where no single interest predominates,
and
3)	a mechanism for publishing interpretations and recommendations made by these
committees.
h)	Unless the contrary is clearly indicated, all references in this Chapter to singular nouns include
the plural noun, and all references to plural nouns include the singular, for example, "area of
responsibility" also includes multiple "areas of responsibility."
6.2.1 Recognition
a)	Except for NELAP-recognized federal accrediting authorities (see 6.2.1 (h) and (i) below),
NELAP-recognized secondary accrediting authorities shall grant accreditation to laboratories
accredited by any other NELAP-recognized primary accrediting authority. Such reciprocal
NELAP accreditation shall be granted on a laboratory-by-laboratory basis. The NELAP-
recognized secondary accrediting authority shall consider only the current certificate of
accreditation issued by the NELAP-recognized primary accrediting authority.
b)	When granting reciprocal accreditation to a laboratory, the NELAP-recognized secondary
accrediting authority shall:
1)	grant reciprocal accreditation for only the fields of accreditation, methods and analytes for
which the laboratory holds current primary NELAP accreditation, and
2)	grant reciprocal accreditation and issue certificates, as required in NELAC, Chapter 4, to an
applicant laboratory within 30 calendar days of receipt of the laboratory's application.
c)	All fees shall be paid by laboratories as required by the NELAP-recognized secondary accrediting
authority.

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 3 of 25
d)	Laboratories seeking NELAP accreditation by a NELAP-recognized secondary accrediting
authority shall not be required to meet any additional proficiency testing, quality assurance, or on-
site evaluation requirements for the fields of accreditation for which the laboratory holds primary
NELAP accreditation.
e)	If a NELAP-recognized secondary accrediting authority notes any potential nonconformance with
the NELAC standards by a laboratory during the initial application process for reciprocal
accreditation, or for a laboratory that already has been granted NELAP accreditation through
recognition, the NELAP-recognized secondary accrediting authority shall immediately notify, in
writing, the applicable NELAP-recognized primary accrediting authority and the laboratory.
However, the laboratory is to be notified only in situations where no administrative or judicial
prosecution is contemplated. The notification must cite the applicable sections within the NELAC
standards for which nonconformance by the laboratory has been noted.
1)	If the alleged nonconformance is noted during the initial application process for reciprocal
NELAP accreditation, final action on the application for reciprocal NELAP accreditation shall
not be taken until the alleged nonconformance issue has been resolved, or
2)	If the alleged nonconformance is noted after reciprocal NELAP accreditation has been
granted, the laboratory shall maintain its current NELAP accreditation status until the alleged
nonconformance issue has been resolved.
f)	Upon receipt of the subsection 6.2.1 (e) notification, the NELAP-recognized primary accrediting
authority shall:
1)	review and investigate the alleged nonconformance,
2)	take appropriate action on the laboratory as set forth by the NELAC standards, including the
addition of any change of accreditation status in the National Environmental Laboratory
Accreditation Database. All such actions shall be taken in accordance with the laboratory's
right to due process as set forth in the NELAC standards, Chapter 4, Accreditation Process,
3)	respond to the NELAP-recognized secondary accrediting authority, in writing, with a copy to
the NELAP Director, within 20 calendar days of receipt of the subsection 6.2.1 (e) notification
providing:
i)	an initial report of the findings;
ii)	a description of the actions to be taken; and,
iii)	a schedule for implementation of further action on the alleged nonconformance, if
necessary.
g)	If, in the opinion of the secondary accrediting authority, the primary accrediting authority does not
take timely and appropriate action on the complaint, the secondary accrediting authority should
notify the NELAP Director of the dispute between the two accrediting authorities regarding proper
disposition of the complaint. Within 20 calendar days of receipt of such notification, the NELAP
Director shall review the alleged nonconformance and take appropriate action according to the
standards set forth in this chapter.

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 4 of 25
h)	Federal accrediting authorities shall serve as the accrediting authority only for governmental
laboratories.
i)	County, municipal, and non-governmental laboratories shall not claim either primary or secondary
accreditation by a federal agency, even if the laboratory is performing analyses under contract
to that agency.
6.2.2 Where to Apply for NELAP Accreditation
a)	All county, municipal and non-governmental laboratories seeking NELAP accreditation or renewal
of NELAP accreditation must apply for such accreditation through their home state (the state in
which the laboratory facility is located) accrediting authority.
b)	Laboratories located in a territory or state that is not NELAP-recognized may seek NELAP
accreditation through any NELAP-recognized state or territorial accrediting authority.
c)	Except as noted in subsection 6.2.2 (g) below, state governmental laboratories seeking NELAP
accreditation or renewal of NELAP accreditation may apply for such accreditation through their
home state, home territory or through a NELAP-recognized federal accrediting authority.
d)	Except as noted in subsection 6.2.2 (g) below, federal governmental laboratories located in a
department or agency that is a NELAP-recognized federal accrediting authority shall follow that
department or agency's policy regarding NELAP accreditation or renewal of NELAP accreditation.
e)	Federal governmental laboratories located in a federal department or agency that is not a
NELAP-recognized accrediting authority may seek NELAP accreditation through any NELAP-
recognized federal or state accrediting authority, except where the relationship poses a conflict
of interest.
f)	Laboratories that are NELAP accredited by a state accrediting authority that has lost NELAP
recognition may seek renewal of NELAP accreditation through any NELAP-recognized state
accrediting authority. The laboratory's NELAP accreditation from an accrediting authority that has
lost NELAP recognition shall remain valid throughout its current certificate of accreditation.
g)	Governmental laboratories that are organizational units of the same department or agency in
which the accrediting authority is located or have other institutional conflicts of interest shall:
1)	demonstrate by organizational structure that the laboratory's Technical Director and the
environmental laboratory accreditation program manager do not report within the same chain-
of-command; and
2)	demonstrate by policies and procedures that conflicts-of-interest do not exist; or
3)	apply for NELAP accreditation through any other NELAP-recognized accrediting authority.
h)	In order that all laboratory applications for NELAP accreditation are treated equally, accrediting
authorities shall initiate processing applications for NELAP accreditation in the chronological order
that the applications are received.

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 5 of 25
6.2.3 Documentation Maintained by Accrediting Authorities
a)	The accrediting authority shall maintain in hard copy, electronic media or other means a
document or documents describing its environmental laboratory accreditation program.
1)	The document or documents shall include the following:
A)	information setting forth the authority of the accrediting authority to grant laboratory
accreditations and whether such laboratory accreditation is mandatory or voluntary;
B)	information setting forth the accrediting authority's requirements for an environmental
laboratory to become accredited;
C)	information setting forth the accrediting authority's evaluator training and ongoing internal
audit program;
D)	a list of names of the qualified evaluators and a list of technical support personnel (as
defined in 3.4.1.2) with areas of responsibility, education and experience.
E)	information stating the requirements for granting, maintaining, withdrawing,
suspending or revoking laboratory accreditation;
F)	information about the laboratory accreditation process;
G)	information on fees charged to applicants and accredited laboratories;
H)	information regarding the rights and duties of accredited laboratories; and
I)	information listing its NELAP accredited laboratories describing the NELAP accreditation
granted.
2)	The document or documents shall be reviewed annually. A written record of this review must
be available for inspection by the NELAP evaluation team.
b)	When the document or documents reviewed in subsection 6.2.3(a)(2) above reveals that the
accrediting authority's environmental laboratory accreditation program has changed or is
otherwise different from the accreditation program described in such documents, the document
or documents shall be updated within 30 calendar days of the review.
c)	The document or documents described in subsection 6.2.3(a)(1) above shall be made readily
available upon request.
d)	The accrediting authority shall have arrangements, consistent with NELAC, Chapter 3, On-site
Evaluation to safeguard information claimed by the laboratories as confidential.
6.3 APPLICATION FOR NELAP RECOGNITION
This section describes the process by which accrediting authorities may apply for NELAP recognition
and the procedures that NELAP will use to review the applications.

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 6 of 25
6.3.1 Written Application for NELAP Recognition
a)	Each accrediting authority requesting initial NELAP recognition shall complete an application and
supply all supporting documentation. Applications can be obtained from the Office of the NELAP
Director, USEPA.
b)	The application shall request information that is essential for the NELAP to evaluate an
accrediting authority's environmental laboratory accreditation program. When documentation is
required, copies of the applicable statutes, rules, regulations, policy statements, standard
operating procedures, guidance documents, etc. must be submitted along with a clear citation of
where the required information is found in the documents. The application will request the
following information and documentation from the accrediting authority:
1)	the name, mailing address, telephone number, electronic mail address and telefacsimile
number of the accrediting authority;
2)	the statutes and regulations establishing and governing the accrediting authority's
environmental laboratory accreditation program as required in subsection 6.3.3.1 (b) and (c);
3)	the policies, guidance documents, promulgating instructions and standard operating
procedures governing the operation of the accrediting authority's environmental laboratory
accreditation program as set forth in subsection 6.3.3.1;
4)	the accrediting authority's arrangements for liability insurance and workman's compensation
insurance coverage as required in subsection 6.3.3.1 (d);
5)	the requirements governing how the accrediting authority restricts the use of its accreditation
by accredited laboratories as required in Section 6.8;
6)	the fields of accreditation for which the accrediting authority is requesting NELAP recognition;
7)	the name and title of the primary person responsible for the day-to-day management of the
accrediting authority's environmental laboratory accreditation program as required in
subsection 6.3.3.1 (h);
8)	the names, areas of responsibility, education and experience levels of the accrediting
authority's environmental laboratory accreditation program's management and technical staff
as required in subsection 6.3.3.1 (f), (g) and (h);
9)	the names and contractual agreements for any external evaluation bodies used by the
accrediting authority as required in subsection 6.3.3.1.2 and 6.3.3.1.3 (b)(3);
10)	the names, areas of responsibility, education and experience levels of all technical and
evaluation employees of any external evaluation bodies used by the accrediting authority as
required in subsection 6.3.3.1.2 and 6.3.3.1.3 (b)(3);
11)	RESERVED
12)	a description of the accrediting authority's environmental laboratory accreditation program
quality systems (e.g., a quality systems manual or a quality assurance plan) as required in
subsection 6.3.3.1.3;

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 7 of 25
13)	the procedures for the selecting, training, contracting and appointing of the accrediting
authority's laboratory evaluators as required in subsection 6.3.3.1 (f) and (g);
14)	a description of the accrediting authority's conflict-of-interest disclosure program as required
in subsection 6.3.3.1 (i);
15)	a tabular listing of all laboratories applying for accreditation in the two-year period
immediately preceding the date of the application. The table shall set forth the date on which
the laboratory's application for accreditation was received by the accrediting authority and the
date on which final action on the application was taken.
16)	the policies and procedures used by the accrediting authority for establishing and maintaining
records on each accredited laboratory and procedures for record access and retention as
required in subsection 6.3.3.1.1;
17)	the accrediting authority's findings, reports and corrective actions from internal audits
conducted in the last two years as required in subsection 6.3.3.1 (j) and 6.3.3.1.3 (b)(4);
18)	a certification that the accrediting authority meets the provisions of Section 6.2 of this
chapter;
19)	the name and job title of the individual or individuals authorized to sign accreditation
certificates; and
20)	the standardized checklist required by subsection 6.3.2 (c)(1) is to be completed by the
applicant accrediting authority citing the location in the application or supporting documents
where the checklist information is provided.
c)	The application must be signed and dated by the highest ranking individual within the department
or agency responsible for laboratory accreditation activities for which NELAP recognition is being
sought. By signature on the application, this individual must attest to the validity of the
information contained within the application and its supporting documents.
d)	The accrediting authority shall submit a renewal application to the NELAP every two years to
maintain NELAP recognition.
1)	The NELAP shall send by certified mail or some other verifiable means to the accrediting
authority, no later than 180 calendar days prior to the expiration of the accrediting authority's
then-current NELAP recognition an application for renewal of NELAP recognition to the
accrediting authority. This notification of renewal shall indicate whether an on-site evaluation
is due as set forth in subsection 6.4 (a).
2)	The accrediting authority must address each requirement of subsection 6.3.1 (b); however,
it must submit information and documentation only of changes from the accrediting authority's
most recent NELAP-recognized environmental laboratory accreditation program.
3)	The accrediting authority must submit the completed renewal application and supporting
documents to the NELAP within 30 calendar days of receiving the renewal notification.

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 8 of 25
6.3.2	Application Completeness Review by NELAP
a)	The NELAP is required to provide notices required by this chapter only to those accrediting
authorities who have submitted an initial application for NELAP recognition or who hold NELAP
recognition.
b)	If the NELAP does not receive a completed renewal application as specified in subsection 6.3.1
(d)(3), the accrediting authority shall be notified in writing. If the accrediting authority does not
submit the completed application within 20 calendar days of receipt of this notification from the
NELAP, the accrediting authority's NELAP recognition will not be renewed upon expiration of its
current NELAP recognition.
c)	Following receipt of an initial or a renewal application, the NELAP must complete a review of the
application and supporting documents to determine that information and supporting
documentation required in subsection 6.3.1 (b) is included with the submittal.
1)	The completeness review of the application and supporting documents shall be conducted
using a standardized checklist provided by the NELAP as part of the application. The
checklist shall be designed to assist the applicant in gathering all the information needed to
complete the application and include a place to note the date the completeness review was
completed.
2)	The NELAP must notify the accrediting authority in writing within 20 calendar days of
receiving the application of any additional information needed to complete the application.
3)	The accrediting authority must provide any additional information or clarification requested
in writing within 20 calendar days of receipt of the 6.3.2(c)(2) notification.
i)	The NELAP may grant extensions to the 20-day time period for up to an additional 20
calendar days if the accrediting authority requests the extension in writing.
ii)	The NELAP shall notify the accrediting authority in writing when an extension is granted.
4)	Written notification to the accrediting authority that an application is complete shall be
furnished by the NELAP within seven calendar days of the date of such determination.
6.3.3	Application Technical Review by a NELAP Evaluation Team
a) Within 30 calendar days of the determination that the application is complete, the NELAP
evaluation team as established in subsection 6.9.1 will perform a technical review of the
application and its supporting documents and respond in writing to the accrediting authority.
1)	The review shall be conducted in accordance with the NELAP standard operating procedures
for application review; and
2)	The review shall be performed by the same NELAP evaluation team assigned to conduct the
on-site evaluation.
3)	In the years when no on-site evaluation is required, as provided in subsection 6.4 (a)(2), the
NELAP Director shall endeavor to appoint the same NELAP evaluation team that conducted

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 9 of 25
the application technical review and on-site evaluation for the accrediting authority's
immediately preceding application cycle.
4) The NELAP Director shall appoint a different NELAP evaluation team for each succeeding
four-year NELAP on-site evaluation cycle as set forth in Section 6.4 (a) of this chapter. New
four-year NELAP on-site evaluation cycles shall start with each renewal application when an
on-site evaluation of the accrediting authority is required.
b)	The NELAP evaluation team will review the application and supporting documents to evaluate
whether the accrediting authority's environmental laboratory accreditation program requires its
accredited laboratories to meet the standards set forth by the NELAC standards, Chapter 2,
Proficiency Testing, Chapter 3, On-site Evaluation, Chapter 4, Accreditation Process and Chapter
5, Quality Systems.
c)	Should the NELAP evaluation team have questions or need additional application information to
determine the accrediting authority's compliance with this chapter, the NELAP evaluation team
must seek additional application information and documentation from the accrediting authority.
6.3.3.1 Required Technical Elements of a NELAP-Recognized Accrediting Authority's Program
a)	The NELAP evaluation team will review the application and supporting documentation to ensure
that the accrediting authority's environmental laboratory accreditation program meets the
requirements of subsection (b) through (m) below.
b)	The accrediting authority shall be a legally identifiable governmental entity;
c)	The accrediting authority shall have the authority, rights and responsibilities necessary to carry
out an environmental laboratory accreditation program;
d)	The accrediting authority shall have the same arrangements to cover liabilities and workman's
compensation claims arising from its operations and activities as all other programs, units,
divisions, bureaus, etc. in the department or agency in which the accrediting authority is located;
e)	The accrediting authority shall have financial stability and the physical and human resources
required for the operation of an accrediting authority's laboratory accreditation program. The
accrediting authority shall have and make available on request a description of the means by
which it receives its financial support. As a benchmark, the accrediting authority shall have the
resources necessary to complete action on a laboratory's application within nine months from the
time a completed application is first received from the laboratory. This time period applies as long
as all turn-around times for responses to application review, proficiency testing and on-site
evaluation issues are carried out within the required time limits set forth in the NELAC standards.
f)	The accrediting authority shall appoint and maintain records on evaluators, including contractual
evaluators, who meet the education, experience and training requirements set forth in the NELAC
standards, Chapter 3, On-site Evaluation. Such records shall include:
1)	name and address;
2)	organization affiliation and position held;
3)	educational qualification and professional status;

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 10 of 25
4)	work experience;
5)	training applicable to laboratory accreditation;
6)	experience in laboratory evaluation, together with field of competence; and
7)	date of most recent updating of record.
g)	The accrediting authority shall have a system in place to evaluate evaluator performance that is
consistent with the organizational employee evaluation program and demonstrates compliance
with the NELAC standards, Chapter 3, On-site Evaluation.
h)	The accrediting authority shall identify one individual responsible for day-to-day management of
the accrediting authority's environmental laboratory accreditation program. This individual must:
1)	be an employee of the accrediting authority, and
2)	have the technical expertise necessary to:
i)	plan and manage the laboratory accreditation program,
ii)	coordinate various facets of the laboratory accreditation program with other territory,
state and federal accrediting authorities,
iii)	coordinate development of environmental laboratory accreditation regulations, and
iv)	evaluate the technical competence and performance of contractors or employees.
i)	The accrediting authority shall have arrangements to ensure that the accrediting authority's
management and technical staff are free of any commercial, financial or other pressures that
influence the results of the accreditation process and are subject to the same conflict of interest
disclosure requirements designed to identify and eliminate potential conflict-of-interest problems
as all other programs, units, divisions, bureaus etc. in the department or agency in which the
accrediting authority is located;
j) The accrediting authority shall have a documented procedure in place to conduct systematic
internal audits annually of the accrediting authority's environmental laboratory accreditation
program to verify compliance with the NELAC standards. One element of the annual internal
audit shall be to review the effectiveness of the quality systems required in subsection 6.3.3.1.3.
When applicable, the accrediting authority shall use the same policies and procedures for internal
audits as used by all other programs, units, divisions, bureaus etc. in the department or agency
in which the accrediting authority is located;
k) The accrediting authority shall designate the individual specified in subsection 6.3.3.1 (h) or an
individual who reports directly to the individual responsible for day-to-day management of the
accrediting authority's environmental laboratory accreditation program to take responsibility for
the quality system and maintenance of the quality documentation required in subsection 6.3.3.1.3;
l)	The accrediting authority shall have established standard operating procedures for dealing with
appeals, complaints and disputes arising from denial, suspension or revocation of laboratory

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 11 of 25
accreditation, or from users of the services about the NELAP accredited laboratories or any other
matters;
m) The accrediting authority shall require NELAP-accredited laboratories to participate in a
proficiency testing program meeting the requirements of the NELAC standards, Chapter 2,
Proficiency Testing, Appendix A; and
n) The accrediting authority or its contractors shall not offer consultancy or other services which may
compromise the objectivity or impartiality of its accreditation process and decisions.
6.3.3.1.1	Records
a)	The accrediting authority shall have arrangements to establish and maintain records for each
accredited laboratory with respect to all aspects of the laboratory's accreditation process.
b)	The accrediting authority shall have a policy and procedure for retaining NELAP accreditation
records for a minimum of ten years or a longer period of time if required by contractual obligations
or pertinent territorial, state or federal laws and regulations.
c)	The accrediting authority shall have a policy and procedures concerning access to records as
prescribed by the territorial, state or federal entity in which the accrediting authority resides.
d)	The accrediting authority shall have a policy and procedure for updating the NELAP national
database with the NELAP-required information specific to the laboratories for which that
accrediting authority is the primary or secondary accrediting authority. These updates must occur
no less frequently than every two weeks. The schedule for the updates would include submitting
a report even if there were no changes to the database.
6.3.3.1.2	Use of Contractors by an Accrediting Authority
a)	The accrediting authority shall have arrangements to ensure and require by signed contract or
other similar type of binding document that all laboratory accreditation functions performed by a
contractor on behalf of the accrediting authority are carried out in compliance with the NELAC
standards.
b)	When laboratory accreditation functions are contracted out, the accrediting authority shall:
1)	take full responsibility for such contracted work,
2)	ensure that the contractor and their employees are competent and comply with the applicable
provisions of the NELAC standards,
3)	ensure that the contractor and their employees comply with the confidentiality requirements
of the accrediting authority and NELAC, and,
4)	ensure that the contractor and their employees are not directly involved with:
i) the laboratory seeking NELAP accreditation from the accrediting authority employing the
contractor; or

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 12 of 25
ii) any other affiliation which would compromise impartiality in the NELAP laboratory
accreditation process.
6.3.3.1.3	Accrediting Authority's Quality System
a)	The accrediting authority shall have a quality system appropriate to the type, range and volume
of work performed by the accrediting authority.
b)	The quality system shall be documented in a quality manual and associated written quality
procedures and shall be made available for use by the staff. The quality manual shall include at
least the following:
1)	the quality policy statement, including objectives and commitments, signed by the manager
responsible for day-to-day management of the accrediting authority's environmental
laboratory accreditation program;
2)	the organizational structure of the accrediting authority's environmental laboratory
accreditation program and the responsibilities of individual staff assigned to the structure;
3)	the policies and procedures for acquiring, training, supervising and evaluating the
performance of accrediting authority employees or contractors carrying out any part of the
accrediting authority's laboratory accreditation program;
4)	the arrangements for annual internal audits, including Quality System reviews, as required
in subsection 6.3.3.1 (j);
5)	the system for providing feedback to personnel responsible for the area audited and for taking
timely and appropriate corrective actions whenever discrepancies are detected;
6)	the procedures established to address conflict-of-interest questions arising from the NELAC
standards as set forth in subsection 6.2.2 (d)(2) and for the accrediting authority's
management and technical staff as set forth in subsection 6.3.3.1 (i);
7)	the policies and procedures established to maintain document control for documents required
by the NELAC standards;
8)	the policies and procedures to implement the accreditation process; and
9)	the policies and procedures for dealing with appeals, complaints and disputes by
laboratories.
6.3.3.1.4	Mutual Assistance Agreements
Upon mutual agreement, another NELAP-recognized accrediting authority may perform laboratory
accreditation functions on behalf of a NELAP-recognized primary accrediting authority. Such an
arrangement does not require approval by the NELAP Director.
6.3.3.2 Application Technical Review Report
a) The NELAP evaluation team will accept an initial application and its supporting documentation
for continued processing that contains sufficient information to determine that an accrediting

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 13 of 25
authority meets the requirements of the NELAC standards for designation as a NELAP-
recognized accrediting authority. When the NELAP evaluation team completes its review of an
initial application and notes no deficiencies, the NELAP evaluation team will schedule the on-site
evaluation as set forth in subsection 6.4.1 below.
b)	The NELAP evaluation team will accept a renewal application and its supporting documentation
for continued processing that contains sufficient information to determine that an accrediting
authority meets the requirements of the NELAC standards for designation as a NELAP-
recognized accrediting authority. When the NELAP evaluation team completes its review of a
renewal application and notes no deficiencies, the NELAP evaluation team will recommend to
the NELAP Director that NELAP recognition be maintained.
c)	Except as noted in Section 6.5, the NELAP evaluation team will not accept the application for
continued processing if it notes deficiencies. The NELAP evaluation team will send by certified
mail an application technical review report to the accrediting authority. The report will:
1)	identify any specific deficiencies noted during the application technical review,
2)	include references to the specific NELAC standards, and
3)	provide suggested corrective action.
d)	To proceed with the review process, the accrediting authority shall respond with written corrective
actions within 30 calendar days of receipt of the NELAP evaluation team's subsection 6.3.3.2(c)
notification. The NELAP evaluation team will review the corrective actions within 30 calendar
days of receipt of the accrediting authority's response. Alternatively, the accrediting authority has
the option to withdraw all or part of its NELAP recognition request.
1)	If the corrective actions submitted by the accrediting authority do not meet the requirements
of this chapter, the NELAP evaluation team will notify the accrediting authority that it must
submit additional corrective actions within 20 calendar days of receipt of the NELAP
evaluation team's response. The NELAP evaluation team will review the accrediting
authority's second corrective action response within 20 calendar days of receipt.
2)	If the second corrective action response submitted by the accrediting authority does not
address satisfactorily all of the application deficiencies, the NELAP evaluation team will make
no further suggestions to the accrediting authority for correction of application deficiencies.
3)	If application deficiencies still remain after the evaluation team's second attempt to resolve
those deficiencies, the NELAP evaluation team will document those deficiencies which are
not resolved and recommend to the NELAP Director that:
i)	the accrediting authority's application for initial NELAP recognition be denied; or
ii)	the accrediting authority's NELAP recognition be revoked.
e)	If the initial application as submitted contained no deficiencies or if deficiencies were corrected
as provided in subsection 6.3.3.2(d), except those deficiencies requiring legislative or rulemaking
action as set forth in Section 6.5, the NELAP evaluation team will schedule the on-site evaluation
as set forth in subsection 6.4.1 below.

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 14 of 25
f)	If an accrediting authority elects to appeal denial or revocation of NELAP recognition resulting
from the Section 6.3.3 application technical review process, an accrediting authority must follow
the procedure set forth in Section 6.10 of this chapter.
g)	After review of the renewal NELAP-recognition application and supporting documents, the NELAP
evaluation team will schedule, when required, an on-site evaluation of the accrediting authority's
environmental laboratory accreditation program as set forth in Section 6.4 (a) and subsection
6.4.1 (a) below.
6.3.4 Notification of Changes to An Accrediting Authority's Program
a)	For all changes in the accrediting authority's environmental laboratory accreditation program
listed below, the NELAP Director shall be notified of changes to:
1)	the authority to accredit laboratories as stated in the statutes, regulations and promulgating
instructions establishing and governing the accrediting authority's environmental laboratory
accreditation program,
2)	the organizational structure including key personnel,
3)	the rules, regulations, policies, guidance documents and standard operating procedures,
4)	the mailing address and office location, telephone and telefacsimile numbers and electronic
mail address, and
5)	the contractual arrangements, including contractor's personnel, for laboratory accreditation
activities contracted out under authority of subsection 6.2 (c).
b)	The notification to the NELAP Director shall be made within 30 calendar days of the change
taking place in the accrediting authority's environmental laboratory accreditation program.
c)	The NELAP Director may request further documentation or conduct on-site evaluations to verify
that changes in the accrediting authority's NELAP-recognized environmental laboratory
accreditation program do not place that program in violation of the NELAC standards.
6.4 ON-SITE EVALUATION OF THE ACCREDITING AUTHORITY
a) On-site evaluations of an accrediting authority's environmental laboratory accreditation program
shall be conducted on a four-year cycle as follows:
1)	An initial on-site evaluation shall be conducted in conjunction with an accrediting authority's
initial application process and every four years thereafter; and
2)	No on-site evaluation of an accrediting authority's environmental laboratory accreditation
program is required for the two-year renewal application immediately following an application
for NELAP recognition where an on-site evaluation was conducted.
b) The NELAP evaluation team will arrange on-site evaluations except as stated in subsection 6.4(c)
below at the mutual convenience of the parties.

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 15 of 25
c) The NELAP evaluation team may make subsequent announced or unannounced on-site
evaluations of an accrediting authority's environmental laboratory accreditation program
whenever such an evaluation is necessary to determine the accrediting authority's compliance
with the requirements of the NELAC standards.
[effective July 1, 2001]
d) As part of the two-year AA renewal process, at least one of the NELAP evaluator(s) shall observe
an accrediting authority's laboratory evaluator(s) conducting an on-site evaluation of a laboratory
seeking initial or renewal NELAP accreditation. The NELAP evaluator(s) shall not participate in
the laboratory's evaluation.
6.4.1	Scheduling the On-site Evaluations
a)	The NELAP evaluation team shall contact the accrediting authority to schedule on-site
evaluations as set forth in Section 6.4 (a) above within 20 calendar days of the date the NELAP
evaluation team accepts an initial or renewal application.
b)	The NELAP evaluation team must send to the accrediting authority written confirmation of the
logistics required to conduct the on-site evaluation. The written confirmation shall include, but
is not limited to:
1)	on-site evaluation date and agenda or schedule of activities,
2)	copies of the standardized evaluation checklists,
3)	the names, titles, affiliations, and on-site evaluation responsibilities of the NELAP evaluation
team members, and
4)	the names and titles of all accrediting authority staff that need to be available during the
on-site evaluation.
c)	All on-site evaluations shall be conducted no later than 50 calendar days following approval of
the application.
6.4.2	Conducting the On-site Evaluation
a)	The purpose of the on-site evaluation is to verify compliance with the requirements of the NELAC
standards including, but not limited to:
1)	determining the accuracy of information contained in the accrediting authority's application
and supporting documents;
2)	determining whether the accrediting authority's implementation of its environmental laboratory
accreditation program conforms with the information and data contained in the application
and supporting documents.
b)	When conducting an on-site evaluation, the NELAP evaluation team shall, at a minimum:

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 16 of 25
1)	review the accrediting authority's record keeping and documentation procedures;
2)	conduct interviews with the accrediting authority's management and technical staff;
3)	review selected laboratory accreditation cases;
4)	review the training records and conduct interviews of staff designated as qualified evaluators
to evaluate their training, knowledge of evaluation techniques and the NELAC standard;
5)	review records of laboratory complaints, disputes and appeals; and
6)	review quality assurance and internal audit procedures employed by the accrediting authority.
c)	The NELAP evaluation team shall only have access to records of the accrediting authority's
environmental laboratory accreditation program that are necessary to determine compliance with
the NELAC standards. An accrediting authority shall not be required to give the NELAP
evaluation team access to sensitive or confidential documents, or documents that are part of the
record of an ongoing legal proceeding.
d)	NELAP evaluation teams performing an on-site evaluation of a Federal agency may need security
clearances, appropriate badging, and/or a security briefing before proceeding with the on-site
evaluation. Evaluators shall be informed in writing of any information that is controlled for national
security reasons and cannot be released to the public.
e)	The NELAP evaluation team shall have the opportunity to interview privately:
1)	all management, technical staff and evaluators of the accrediting authority's environmental
laboratory accreditation program; and
2)	any NELAP-accredited laboratory receiving its accreditation from the applicant accrediting
authority.
f)	The NELAP evaluation team must ensure that the evaluation is conducted according to the
schedule as set forth in subsection 6.4.1 (b)(1) and consists of the following:
1)	an opening meeting,
2)	the comprehensive on-site evaluation of the accrediting authority's environmental laboratory
accreditation program, and
3)	an exit interview to discuss all noted deficiencies.
g)	The NELAP evaluation team shall conduct all evaluations in accordance with the NELAP
standard operating procedure for conducting on-site evaluations of accrediting authorities.
6.4.3 On-site Evaluation Reports
a) The NELAP evaluation team will send by certified mail to the accrediting authority an on-site
evaluation report within 30 calendar days of completion of the on-site evaluation. The report shall
include, but is not limited to:

-------
NELAC
Accrediting Authority
Revision 13
May 25, 2001
Page 17 of 25
1)	the date(s) of evaluation;
2)	the name(s) of the person(s) responsible for the report;
3)	the NELAP recognition fields of accreditation for which initial recognition or renewal is sought;
and
4)	the comments of the NELAP evaluation team on the accrediting authority's compliance with
the requirements of the NELAC standards.
b)	If the on-site evaluation does not reveal any deficiencies, the NELAP evaluation team shall
recommend to the NELAP Director that the accrediting authority be granted or maintain NELAP
recognition.
c)	If deficiencies are noted during the on-site evaluation, the report will:
1)	identify any specific deficiencies noted during the on-site evaluation,
2)	include references to the specific NELAC standards, and
3)	provide suggested corrective action.
d)	If the on-site evaluation reveals deficiencies, the accrediting authority shall submit a plan of
corrective action to the NELAP evaluation team within 30 calendar days of receipt of the on-site
evaluation report.
1)	The plan of corrective action must detail those specific actions taken or that will be taken by
the accrediting authority to correct all deficiencies noted by the NELAP evaluation team
during the on-site evaluation.
2)	The plan of corrective action must include the accrediting authority's projected time to
complete the corrective actions not yet complete at the time of the accrediting authority's
response to the on-site evaluation report.
3)	Except for those deficiencies set forth in Section 6.5, the implementation of corrective actions
must take place no more than 65 calendar days from receipt of the on-site evaluation report.
e)	The NELAP evaluation team shall recommend to the NELAP Director revocation or denial of
NELAP recognition for on-site evaluation deficiencies for any accrediting authority that fails to
submit a plan of corrective action within 30 calendar days as set forth in subsection 6.4.3(d)
above.
f)	Within 20 calendar days of receipt of the accrediting authority's plan of corrective actions, the
NELAP evaluation team shall review the plan and respond in writing to the accrediting authority.
1)	If the accrediting authority corrects all deficiencies, the NELAP evaluation team shall
recommend to the NELAP Director that the accrediting authority be granted or maintain
NELAP recognition.
2)	If the accrediting authority's plan of corrective actions does not address all deficiencies, the
NELAP evaluation team will notify the accrediting authority by certified mail that it must
submit another plan of corrective actions for the remaining deficiencies not covered by
Section 6.5 within 20 calendar days of the accrediting authority's receipt of this notification.
g)	The NELAP evaluation team shall review the corrective actions for the remaining deficiencies
within 20 calendar days of receipt of a subsection 6.4.3(f)(2) response from the accrediting
authority.
1)	If all deficiencies are not corrected and the remaining deficiencies affect only certain fields
of accreditation, the NELAP evaluation team shall recommend to the NELAP Director that
the accrediting authority's NELAP recognition be denied or revoked for those fields of
accreditation for which on-site evaluation deficiencies remain.
2)	If all deficiencies are not corrected and the remaining deficiencies affect the entire accrediting
authority's environmental laboratory accreditation program, the NELAP evaluation team shall
recommend to the NELAP Director that the accrediting authority's NELAP recognition be
denied or revoked.
3)	If the only remaining deficiencies require legislation or rulemaking as set forth in Section 6.5,
the NELAP evaluation team shall recommend to the NELAP Director that the accrediting
authority be granted or maintain NELAP recognition.
4)	If remaining deficiencies are corrected, the NELAP evaluation team shall recommend to the
NELAP Director that the accrediting authority be granted or maintain NELAP recognition.
h)	If the NELAP evaluation team determines that the accrediting authority has falsified information
included in its application and supporting documents, the NELAP evaluation team shall
recommend to the NELAP Director that the accrediting authority's NELAP recognition be denied
or revoked.
6.5 ACCREDITING AUTHORITY'S REQUEST FOR EXTENSION OF TIME TO COMPLY WITH
THE NELAC STANDARDS
a) Upon written request to the NELAP Director, through the NELAP evaluation team, an extension
of time, not to exceed two years, to correct deficiencies noted in the accrediting authority's
application and/or deficiencies noted during the on-site evaluation will be granted only:
1) when an applicant accrediting authority has an operating environmental laboratory
accreditation program for the fields of accreditation for which it is seeking or renewing NELAP
recognition, and
[effective July 1, 2001]
2)	when, as set forth in Section 6.4.3(g)(3), implementation of corrective actions to correct
application and/or evaluation deficiencies requires the accrediting authority to promulgate
new or revised regulations, or
3)	when, as set forth in Section 6.4.3(g)(3), implementation of corrective actions to correct
application and/or evaluation deficiencies requires the accrediting authority to seek new or
revised legislation.

b)	If the deficiencies continue to exist after two years from the date the original extension was granted, the accrediting authority shall reapply to the NELAP Director, through the NELAP evaluation team, for an additional extension of time. The additional extension of time will be subject to the following conditions:
1)	it shall not exceed two years, unless the Accrediting Authority Review Board recommends to the NELAP Director an additional length of time, and
2)	the accrediting authority shall meet the conditions given in Section 6.5(a)(1), (2), and (3), and
3)	the accrediting authority shall provide documentation to demonstrate that it has made significant progress towards completing its regulatory or legislative process.
Note: Sections 6.5(a)(2), 6.5(a)(3), 6.5(b), 6.5(b)(1), 6.5(b)(2), and 6.5(b)(3) are effective
immediately upon passage and amend NELAC 1999 and 2000 standards.
c)	The accrediting authority shall include in its request for an extension of time to comply with the NELAC standards a projected timetable for correction of the application and/or evaluation
deficiencies.
6.6	NELAP EVALUATION TEAM RECOMMENDATIONS TO THE NELAP DIRECTOR
a)	All recommendations required by this chapter from the NELAP evaluation team to the NELAP
Director must be made in writing.
b)	All NELAP evaluation team recommendations to the NELAP Director shall include the following
documentation when applicable:
1)	a recommendation to grant, maintain or revoke NELAP recognition in full or in part;
2)	a summary of the reasons supporting the recommendation;
3)	a copy of all application review letters sent to the accrediting authority and all corrective
action response letters submitted by the accrediting authority to the NELAP evaluation team;
4)	a copy of all on-site evaluation review letters sent to the accrediting authority and all
corrective action response letters submitted by the accrediting authority; and
5)	a copy of the accrediting authority's requests for extension of time to implement corrective
actions if legislative or additional rulemaking is required pursuant to Section 6.5.
c)	A copy of any NELAP evaluation team recommendation to the NELAP Director, with all supporting documentation, also shall be furnished to the accrediting authority.
d)	Within 20 calendar days of receipt of the NELAP evaluation team's recommendation, the NELAP
Director shall provide written notification to the accrediting authority of acceptance or rejection
of the NELAP evaluation team's recommendation.

e) The accrediting authority has the option to appeal a revocation or denial decision regarding
NELAP recognition by the NELAP Director as set forth in Section 6.11 of this chapter.
6.7	CERTIFICATE OF RECOGNITION TO THE ACCREDITING AUTHORITY
a)	The NELAP Director will issue a certificate of NELAP recognition dated the day on which NELAP
recognition is granted.
b)	The certificate of NELAP recognition shall include the following items:
1)	the name and address of the accrediting authority,
2)	the fields of accreditation for which the accrediting authority is NELAP-recognized,
3)	the date of the accrediting authority's most recent on-site evaluation,
4)	the expiration date of the accrediting authority's NELAP recognition, which shall not be more than two years from the most recent date on which NELAP recognition was granted,
5)	the signature of the NELAP Director,
6)	a statement that the accrediting authority is in compliance with the NELAC standards,
7)	a statement that the accrediting authority has been granted the authority to accredit
environmental laboratories for the fields of accreditation for which the accrediting authority
is NELAP-recognized,
8)	a statement that continued NELAP recognition depends on compliance with the NELAC
standards;
9)	a seal incorporating the NELAP insignia; and
10)	a unique designator, such as date of issuance and a serial or certificate number.
6.8	USE OF ACCREDITATION BY NELAP ACCREDITED LABORATORIES
a) The accrediting authority shall have requirements for controlling the ownership, use and display
of the accrediting authority's NELAP accreditation documents and for controlling the manner in
which an accredited laboratory may refer to its NELAP accreditation and/or use the NELAC/NELAP logo. These arrangements shall include, but are not limited to, requirements that:
1)	NELAP accredited laboratories post or display their most recent NELAP accreditation
certificate or their NELAP-accredited fields of accreditation in a prominent place in the
laboratory facility;
2)	NELAP accredited laboratories make accurate statements concerning their NELAP fields of accreditation and NELAP accreditation status;
3)	NELAP accredited laboratories accompany the accrediting authority's name and/or the
NELAC/NELAP logo with at least the phrase "NELAP accredited" and the laboratory's
accreditation number or other identifier when the accrediting authority's name is used on
general literature such as catalogs, advertising, business solicitations, proposals, quotations,
laboratory analytical reports or other materials; and
4) NELAP accredited laboratories not use their NELAP certificate, NELAP accreditation status
and/or NELAC/NELAP logo to imply endorsement by the accrediting authority.
b)	The accrediting authority shall have arrangements to ensure that any NELAP accredited laboratory choosing to use the accrediting authority's name, to make reference to its NELAP accreditation status and/or to use the NELAC/NELAP logo in any catalogs, advertising, business solicitations, proposals, quotations, laboratory analytical reports or other materials shall:
1)	distinguish between the proposed testing for which the NELAP-accredited laboratory is accredited and the proposed testing for which the NELAP accredited laboratory is not accredited; and
2)	include the NELAP-accredited laboratory's accreditation number or other identifier.
c)	The accrediting authority shall have arrangements to ensure that NELAP-accredited laboratories, upon suspension, revocation or withdrawal of their NELAP accreditation, shall:
1)	discontinue use of all catalogs, advertising, business solicitations, proposals, quotations,
laboratory analytical results or other materials that contain reference to their past NELAP
accreditation status and/or display the NELAC/NELAP logo; and
2)	return any certificates for NELAP accreditation to the accrediting authority.
d)	The accrediting authority shall have arrangements to take suitable actions, including legal action, when incorrect references to the accrediting authority's NELAP accreditation, misleading use of the laboratory's NELAP accreditation status and/or unauthorized use of the NELAC/NELAP logo are found in catalogs, advertisements, business solicitations, proposals, quotations, laboratory analytical reports or other materials.
6.9	REQUIREMENTS OF THE NELAP
a)	The NELAP evaluation team shall submit all documents, letters, evaluation notes, checklists, etc.
to the NELAP headquarters office within:
1)	30 calendar days of the final decision on the application by the NELAP Director, or
2)	30 calendar days after the final recommendation by the Accrediting Authority Review Board (AARB) as set forth in Section 6.11 of this chapter.
b)	The NELAP Director shall maintain complete and accurate records of all documents relating to
the application and on-site evaluation processes for each accrediting authority for a minimum of
ten years or a longer period of time if required by contractual obligations or pertinent federal laws
and regulations.
c)	The NELAP Director shall maintain an electronic directory to display the status of all NELAP-
recognized accrediting authorities, pending applications for NELAP recognition and currently
scheduled announced on-site evaluations.

6.9.1 NELAP Evaluation Team
a)	The NELAP Director shall appoint NELAP evaluation team members as set forth in Section 6.3.3
(a)(4) and delegate the responsibilities required by this chapter to evaluation teams.
b)	During the time prior to the NELAP issuing the first NELAP recognitions to accrediting authorities,
the NELAP evaluation team shall consist of at least one member who is an employee of the
USEPA and at least one member who is an employee of another operating territorial, state or
federal environmental laboratory accreditation program.
c)	No later than two years from the date that the first accrediting authority recognitions are
announced, the NELAP evaluation team shall consist of at least one member who is an
employee of the USEPA and at least one member who is an employee of a NELAP-recognized
accrediting authority.
d)	Prior to conducting the on-site evaluation of an accrediting authority's program, at least one
member of the NELAP evaluation team shall complete the NELAP Accrediting Authority Evaluator
Training Course.
e)	The NELAP evaluation team shall:
1)	have at least one member who meets the education, experience and training requirements for laboratory evaluators specified in the NELAC standards, Chapter 3, On-site Evaluation; and
2)	have at least one other member with experience that includes at least one of the following:
i)	certification as a management systems lead evaluator (quality or environmental) from an
internationally recognized auditor certification body;
ii)	one year of experience implementing federal or state laboratory accreditation rulemaking;
iii)	laboratory accreditation management; or
iv)	one year of experience developing or participating in laboratory accreditation programs.
3)	All experience required by this subsection must have been acquired within the five year
period immediately preceding appointment as a NELAP evaluation team member.
[effective July 1, 2001]
6.10 APPEALING FINDINGS BASED UPON DIFFERENCES IN STANDARDS
INTERPRETATIONS
a) Though standards are written as clearly and succinctly as possible, conflicts regarding
interpretation of standards may arise between the NELAP evaluation team and an accrediting authority, between a laboratory and the accrediting authority, or between two or more accrediting authorities. Appendix A of this chapter outlines the procedures that must be followed in these
instances.

b)	The outcome of the procedure outlined in Appendix A is a final consensus interpretation of
a standard. This interpretation must be communicated to the relevant standing committees.
The decision shall be posted on the NELAC Website and be accessible to all accrediting
authorities and laboratories within 14 days.
c)	The consensus interpretation must be recognized by the NELAP Director, the NELAP evaluation teams, and all accrediting authorities and laboratories until such time as the standard is changed or another consensus interpretation has been issued.
6.11 APPEALING DECISIONS TO DENY OR REVOKE NELAP RECOGNITION
a)	Within 20 calendar days of official notification of the NELAP action on an accrediting
authority's application for NELAP recognition, the accrediting authority shall notify the NELAP
Director if the accrediting authority chooses to appeal the NELAP action. If the accrediting
authority does not receive satisfactory resolution, the accrediting authority may request a
review by the AARB. This request shall be made within 20 calendar days of the Director's
decision.
b)	Any AARB member who is not free of financial connections to the appealing accrediting authority, or of any other relationship that could bias his or her review of the case, shall be excluded from participating in deliberations on that appeal.
c)	The AARB shall carry out an independent review of all relevant parts of the record.
d)	The AARB shall conduct interviews with the accrediting authority and the NELAP Director.
The AARB also may conduct interviews with the NELAP evaluation team member(s) or other
individuals deemed appropriate by the AARB.
e)	If the accrediting authority so desires, an opportunity for both the NELAP and the accrediting
authority to meet jointly with the AARB shall be granted.
f)	The AARB shall complete its review and render a final recommendation to the NELAP
Director within 90 calendar days following receipt of the notice of appeal. This time frame
may be extended by mutual agreement of all parties up to a maximum of 60 additional
calendar days.
g)	The ultimate decision to grant, maintain, deny or revoke NELAP recognition remains with the
NELAP Director. The NELAP Director shall notify the appealing accrediting authority of
his/her decision within 20 calendar days of receipt of the recommendation from the AARB.
h)	Accrediting authorities shall be limited to one appeal for each application cycle.
i)	If an appeal is filed, the status existing prior to the decision will remain in effect pending resolution of the appeal.

Figure 1: Flow Chart for NELAP Recognition of an Accrediting Authority
(The flow chart summarizes the application request, completeness review, technical review and corrective action cycles, on-site evaluation scheduling and reporting, and the NELAP Director's recognition decision, together with the associated time limits set forth in Sections 6.3 through 6.7.)

-------
[effective July 1, 2001]
ACCREDITING AUTHORITY
APPENDIX A
QUESTIONS OF UNIFORMITY
PROCEDURE

[effective July 1, 2001]
Appendix A - QUESTIONS OF UNIFORMITY PROCEDURE
A.1 PURPOSE
In the event that two or more parties cannot resolve an issue of interpretation of a standard, the
following procedure shall be followed. This procedure may be initiated by any involved party and is
to be used when the appeal procedure provided by the Accrediting Authority has been exhausted or
is not appropriate.
A.2 PROCEDURE FOR INITIATION OF RESOLUTION BY AFFECTED PARTIES
A.2.1 Initial Decision/Interpretation Procedure
a)	The affected party shall contact the involved Accrediting Authority(ies) (AAs) in writing, with a copy to the NELAP Director. The request shall include the reference for the affected
standard and a statement of the variances in interpretation made by the AA(s) as well as a
summary explaining the affected party's position.
b)	The parties shall discuss the difference in interpretation within 7 days of notification of the
issue.
c)	If the affected parties reach an agreement on interpretation, the NELAP Director shall be informed in writing of their decision.
d)	If the affected parties cannot reach an agreement, the request shall be forwarded in writing to the NELAP Director within 14 days by the affected party(ies).
A.2.2	Decision/Interpretation Procedure When Affected Parties Cannot Reach an Agreement
1.	Within 7 days after receiving the request from the affected parties, the NELAP Director
shall forward the request to the appropriate NELAC committee or AA workgroup for an
interpretation/decision.
2.	The standing committee or AA workgroup will have 45 days to inform the NELAP Director of its interpretation/decision.
3.	The affected parties shall be informed of the interpretation/decision by the NELAP Director within 7 days.
4.	The affected parties shall notify the NELAP Director of acceptance or appeal of the interpretation/decision within 7 days of being informed of the interpretation/decision.
A.3 APPEAL PROCEDURE
If the affected parties disagree with the decision/interpretation, the issue shall be appealed in writing to the NELAP Board of Directors for final resolution; the issue will be placed on the agenda of the next scheduled meeting for review and a decision.

A.4 POSTING OF DECISION
Once the issue has been resolved, the question and resolution shall be posted by the NELAP
Director on the NELAC web site within 14 days.
Note: This appendix becomes immediately effective upon passage and appends NELAC 1999
and 2000 standards.

-------