United States Environmental Protection Agency
Office of Research and Development
Washington, DC 20460
EPA600/R-98/151
November 1998
www.epa.gov
NATIONAL ENVIRONMENTAL
LABORATORY ACCREDITATION
CONFERENCE
Constitution, Bylaws, and Standards
Approved July 1998
CONSTITUTION
AND BYLAWS
July 2, 1998
NELAC
Constitution and Bylaws
Revision 10
July 2, 1998
TABLE OF CONTENTS
CONSTITUTION AND BYLAWS
CONSTITUTION
ARTICLE I - GENERAL
ARTICLE II - OBJECTIVES
A. Forum
B. Mechanism
C. Consensus
D. Uniformity
E. Cooperation
ARTICLE III - PARTICIPATION
ARTICLE IV - OFFICERS
SECTION 1 - EX OFFICIO OFFICERS
A. Director
B. Executive Secretary
SECTION 2 - ELECTIVE OFFICERS
A. Eligibility
B. Nominations and Elections
ARTICLE V - APPOINTIVE OFFICIALS
SECTION 1 - OFFICIALS, SPECIFIC
A. Appointment
B. Assumption of Office
ARTICLE VI - MEETINGS OF NELAC
A. Annual Meeting
B. Interim Meeting
C. Special Meetings
D. Rules of Order
ARTICLE VII - AMENDMENTS TO THE CONSTITUTION
ARTICLE VIII - BYLAWS
SECTION 1 - SUPPLEMENTATION OF CONSTITUTION
SECTION 2 - AMENDMENTS AND REPEALS OF THE BYLAWS
SECTION 3 - RENUMBERING
BYLAWS
ARTICLE I - APPLICATION FOR PARTICIPATION
SECTION 1 - FORM OF APPLICATION
ARTICLE II - PARTICIPANTS' RECORDS
SECTION 1 - TERM OF PARTICIPATION
SECTION 2 - EVIDENCE OF VOTING MEMBERSHIP
ARTICLE III - USE OF THE INSIGNIA
ARTICLE IV - BOARD OF DIRECTORS
SECTION 1 - MEMBERSHIP
SECTION 2 - DUTIES
ARTICLE V - DUTIES OF THE OFFICERS
SECTION 1 - CHAIR
SECTION 2 - CHAIR-ELECT
SECTION 3 - PAST CHAIR
SECTION 4 - DIRECTOR
SECTION 5 - EXECUTIVE SECRETARY
SECTION 6 - PARLIAMENTARIAN
ARTICLE VI - COMMITTEES
SECTION 1 - GENERAL
SECTION 2 - ADMINISTRATIVE COMMITTEES
A. Terms
B. Duties
SECTION 3 - STANDING COMMITTEES
A. Terms
B. Duties
SECTION 4 - SPECIAL COMMITTEES, TASK FORCES AND STUDY GROUPS
SECTION 5 - SUBCOMMITTEES
ARTICLE VII - VOTING SYSTEM
SECTION 1 - HOUSE OF REPRESENTATIVES
A. Official Designation
B. Composition
C. Method of Designation
SECTION 2 - HOUSE OF DELEGATES
A. Designation
B. Requirements
SECTION 3 - VOTING RULES
A. Applicability
B. Quorum
C. Voting
D. Committee Report Voting
SECTION 4 - FLOOR AMENDMENTS
A. Procedure
B. Editorial Changes
SECTION 5 - SEATING
A. Arrangement
B. Supervision
SECTION 6 - PROCEDURES
SECTION 7 - CHANGES IN ORGANIZATION AND PROCEDURE
Figure 1. Seating Arrangement
CONSTITUTION
ARTICLE I - GENERAL
This organization shall be known as "The National Environmental Laboratory Accreditation Conference"
(NELAC) and is sponsored by the United States Environmental Protection Agency (EPA) as a voluntary
association of State and federal officials. The purpose of the organization is to foster the generation of
environmental laboratory data of known and documented quality through the development of national
performance standards for environmental laboratories and other entities directly involved in the
environmental field measurement and sampling process.¹
ARTICLE II - OBJECTIVES
The objectives of the National Environmental Laboratory Accreditation Conference are:
A. Forum
To provide a national forum for the discussion of all questions related to standards for accreditation of
laboratories and other entities directly involved in the environmental field measurement and sampling
process.
B. Mechanism
To provide a mechanism to establish policy and coordinate activities within NELAC on matters of national
and international significance pertaining to standards for accreditation of environmental laboratories and
other entities directly involved in the environmental field measurement and sampling process.
C. Consensus
To develop a consensus on uniform standards for laboratory accreditation and implementation of those
standards by the accrediting authorities.
D. Uniformity
To encourage and promote uniform standards of quality for assessment and accreditation requirements
among the various accrediting authorities.
E. Cooperation
To foster cooperation among environmental laboratory accrediting authorities and regulatory officials, and
between them and other entities directly involved in the environmental field measurement and sampling
process.
¹ The Constitution and Bylaws shall be reviewed at a later date to accommodate the unique characteristics of the GLP program, taking
into consideration the recommendations of the Environmental Laboratory Advisory Board.
ARTICLE III - PARTICIPATION
Participants consist of two categories:
Voting Membership is limited to officials who are in the employ of the Government of the United States,
and the States, the Territories, the Possessions of the United States, or the District of Columbia and who are
actively engaged in environmental regulatory programs or accreditation of environmental laboratories.
Contributors include representatives of laboratories, manufacturers, industry, business, consumers,
academia, laboratory associations, industrial associations, laboratory accreditation associations, counties,
municipalities, and other political subdivisions of States, Territories and Possessions of the United States,
other federal officials not engaged in environmental activities, and other persons who are interested in the
objectives and activities of NELAC.
ARTICLE IV - OFFICERS
SECTION 1 - EX OFFICIO OFFICERS
A. Director
The Director of the EPA National Environmental Laboratory Accreditation Program is the ex officio Director
of NELAC.
B. Executive Secretary
The Executive Secretary is an employee of EPA who is conversant with laboratory accreditation. She/he
serves NELAC and its Board of Directors.
SECTION 2 - ELECTIVE OFFICERS
The Elective officers of NELAC shall be:
Chair,
Chair-Elect,
Immediate Past-Chair, and
6 members-at-large to serve on the NELAC Board of Directors.
The consecutive reelection of a Chair-Elect is prohibited; the Chair-Elect shall not serve on any committee
other than the Board of Directors. Should the Chair-Elect for any reason be unable or unwilling to be
installed as Chair, his/her successor shall be elected in the manner prescribed below. In this event, the
newly elected Chair-Elect shall be installed as Chair.
A. Eligibility
1. Any Voting Member in good standing shall be eligible to hold any office provided that the individual
meets the other requirements set forth in the Constitution and Bylaws.
2. Only a State official is eligible for election to Chair-Elect.
B. Nominations and Elections
1. Nominating Committee
The Chair shall appoint a Nominating Committee consisting of the most recent active Past Chair as
Committee Chair, four (4) Voting Members, to be geographically representative insofar as possible, and
five (5) Contributors.
2. Nominations
a. The Nominating Committee shall submit one name for each elective office and present its
recommendation to NELAC.
b. Additional nominations for officers may be made from the floor by any Voting Member at the Annual
Meeting provided that prior consent of the nominee has been obtained in writing and presented to
the presiding officer at the time of the nomination.
3. Elections
Officers shall be elected during a designated session of the Annual Meeting by a formal recorded vote
of the Voting Members in attendance and eligible to vote on NELAC motions.
4. Terms of Office
a. The Chair, Chair-Elect, and Past Chair, shall serve for a term of one year or until their successors
are respectively qualified and elected or appointed. After serving one year as Chair-Elect, the
incumbent shall succeed to the office of NELAC Chair.
b. The six Board of Directors members-at-large shall serve for 3-year terms; two elected each year.
c. All officers shall take office immediately following the close of the Annual Meeting at which they
were elected.
5. Filling Vacancies
In case of a vacancy in any of the elective offices, the Board of Directors shall fill the office by
appointment.
The term of this appointment shall be until the date of the next Annual Meeting, at which time the Voting
Members vote to confirm the appointment or elect a candidate to fill the remaining time in the initial term
that was vacated.
ARTICLE V - APPOINTIVE OFFICIALS
SECTION 1 - OFFICIALS, SPECIFIC
A. Appointment
The NELAC Chair shall appoint the Parliamentarian and other officials as needed to conduct activities not
covered by elected officials.
B. Assumption of Office
All appointive officials shall take office immediately following appointment and shall serve through the
subsequent Annual Meeting of NELAC unless otherwise requested by the NELAC Chair, or specified in the
Constitution or Bylaws.
ARTICLE VI - MEETINGS OF NELAC
A. Annual Meeting
An Annual Meeting shall be held. The agenda for this meeting shall include the election of officers, reports
from the various committees, task forces, and study groups, other items pertinent to NELAC, and
presentation to the Voting Membership of pending issues requiring action by vote.
The Annual Meeting may include the presentation of technical papers, discussions, displays, or other events
at the discretion of the Board of Directors.
B. Interim Meeting
The Interim Meeting of the Board of Directors and those Standing Committees designated by the Chair shall
be held annually, approximately six months prior to the Annual Meeting to develop the agenda and
committee recommendations for presentation and action at the Annual Meeting. Draft resolutions and
standards regarding environmental laboratory accreditation shall be discussed and modified as appropriate
in the Interim Meeting.
C. Special Meetings
1. The NELAC Chair is authorized to call a meeting of the Board of Directors at any time deemed
necessary by the Chair to be in the best interest of NELAC.
2. Committees of NELAC are authorized to hold meetings at times other than the Annual Meeting or Interim
Meeting.
D. Rules of Order
The rules contained in the latest version of Robert's Rules of Order shall govern NELAC in all cases to which
they are applicable, and in which they are not inconsistent with the Constitution or Bylaws or special rules
of NELAC.
ARTICLE VII - AMENDMENTS TO THE CONSTITUTION
This Constitution may be amended, added to, or repealed at any Annual Meeting under normal NELAC
procedures. However, proposed changes must be included in the agenda of the Board of Directors for the
preceding Interim Meeting, published in the Recommendations of the Board of Directors in its report, and
discussed at the general session of the Board of Directors at the Annual Meeting at which said changes shall
be voted upon.
Amendments to the Constitution must be approved by a minimum of a two-thirds vote of the Voting Members
in attendance at the Annual Meeting in both the House of Representatives and the House of Delegates.
ARTICLE VIII - BYLAWS
SECTION 1 - SUPPLEMENTATION OF CONSTITUTION
This Constitution shall be supplemented by Bylaws which shall detail the methods of operation of NELAC.
Such Bylaws shall not be inconsistent with the provisions of the Constitution.
SECTION 2 - AMENDMENTS AND REPEALS OF THE BYLAWS
The Bylaws may be amended, added to, or repealed at any Annual Meeting under normal NELAC
procedures. However, proposed changes must be included in the agenda of the Board of Directors for the
Interim Meeting, published in the Recommendations of the Board of Directors in its Tentative Report, and
discussed at the general session of the Board of Directors at the Annual Meeting at which said changes shall
be voted upon.
Amendments to the Bylaws must be approved by a majority vote of the Voting Members in attendance at
the Annual Meeting in both the House of Representatives and the House of Delegates.
SECTION 3 - RENUMBERING
The Executive Secretary is authorized to renumber the Articles and Sections of the Constitution or Bylaws
to accommodate any changes made.
BYLAWS
ARTICLE I - APPLICATION FOR PARTICIPATION
SECTION 1 - FORM OF APPLICATION
A completed registration form for the Annual Meeting of the National Environmental Laboratory Accreditation
Conference (NELAC) shall serve as the application for participation in NELAC.
ARTICLE II - PARTICIPANTS' RECORDS
SECTION 1 - TERM OF PARTICIPATION
Registration for NELAC participation shall be prior to the Annual Meeting each year and shall cover the
period from the beginning of one Annual Meeting to the beginning of the next Annual Meeting.
SECTION 2 - EVIDENCE OF VOTING MEMBERSHIP
Reserved.
ARTICLE III - USE OF THE INSIGNIA
The insignia of NELAC may be used or displayed only for official publications, announcements, and
documents of NELAC unless expressly authorized in writing by the Board of Directors of NELAC.
ARTICLE IV - BOARD OF DIRECTORS
SECTION 1 - MEMBERSHIP
A. The Board of Directors consists of the Director, Executive Secretary, Chair of NELAC, Chair-Elect, the
most recent still-active Past Chair of NELAC, and the six members-at-large.
B. The Nominating Committee in recommending candidates for the Board of Directors shall consider
geographic and organizational representation.
C. The term of the Board of Directors begins with the adjournment of the Annual Meeting at which its
members are elected or appointed. Six of the Board of Directors are members-at-large with three-year
terms.
SECTION 2 - DUTIES
A. The Board of Directors has leadership responsibility for NELAC and is charged with guiding NELAC in
its primary mission of establishing standards for the accreditation of environmental laboratories.
B. It establishes administrative procedures and policy on internal matters and serves as the policy and
coordinating body in matters of national and international significance.
C. It holds accountable, reviews, and approves actions of all Committees.
D. It utilizes the Standing Committees to resolve technical criteria issues regarding laboratory accreditation.
E. It acts for NELAC in all routine or emergency situations.
F. It authorizes interim meetings of NELAC Committees as necessary.
G. It fills any vacancy in any elective office of NELAC occurring during the term of office.
H. It brings recommendations to NELAC for consideration and action as appropriate.
I. It annually reviews the work of committees and task forces to assure that the concerns of the various
constituencies are being addressed.
ARTICLE V - DUTIES OF THE OFFICERS
SECTION 1 - CHAIR
The NELAC Chair is the presiding officer at the meetings of NELAC and of the Board of Directors, makes
appointments to the several Standing and Administrative Committees, and appoints other NELAC officials
to perform functions not covered by elected offices to serve during his or her term of office.
SECTION 2 - CHAIR-ELECT
The Chair-Elect shall:
A. serve as acting Chair of NELAC and the Board of Directors in the event that the Chair is unable to carry
out the duties of that office;
B. perform other duties assigned by the NELAC Chair, including presiding over sessions of the meetings
of NELAC as assigned by the NELAC Chair and assisting the Chair in the discharge of his or her duties;
and
C. serve on the Board of Directors.
SECTION 3 - PAST CHAIR
The most recent still-active Past Chair shall serve on the Board of Directors, serve as Chair of the
Nominating Committee, and perform such duties as may be assigned by the NELAC Chair. The NELAC
Past Chair may preside over sessions of the meetings of NELAC as assigned by the NELAC Chair and assist
the Chair in the discharge of his or her duties.
SECTION 4 - DIRECTOR
The Director coordinates all laboratory accreditation activities within EPA for purposes of establishing a
single uniform environmental laboratory accreditation system. The Director serves as the link with EPA and
other federal department/agency policy makers, those responsible for implementation of the National
Environmental Laboratory Accreditation Program, the NELAC Board of Directors, the Environmental
Laboratory Advisory Board, and the Accrediting Authority Review Board. The Director serves on the Board
of Directors as an ex officio member and is responsible for the appointment and support of an Executive
Secretary to the Board of Directors.
SECTION 5 - EXECUTIVE SECRETARY
The Executive Secretary acts as the executive officer of NELAC, as an ex officio member, secretary and
executive officer of the Board of Directors, and the non-voting secretary to each standing committee;
certifies eligible voters and records the vote of NELAC; keeps the records of the proceedings of the
meetings, and manages NELAC administration as prescribed in its administrative procedures.
SECTION 6 - PARLIAMENTARIAN
The Parliamentarian shall, when requested by the Chair, help in resolving procedural matters at meetings
of NELAC. The parliamentarian shall use the latest edition of Robert's Rules of Order and any special rules
adopted by NELAC.
ARTICLE VI - COMMITTEES
SECTION 1 - GENERAL
Except as otherwise provided, each Administrative and Standing Committee shall consist of ten participants,
five Voting Members and five Contributors who may not vote. All participants are appointed by the Chair
of NELAC to serve staggered terms on a rotating basis or until a successor is appointed.
Except for the Nominating Committee, each committee annually selects one of its Voting Members to serve
as its chair, who may succeed himself or herself.
When necessary, an appointment shall be made to any of the standing or administrative committees to fill
any vacancy for the unexpired portion of the participant's term.
SECTION 2 - ADMINISTRATIVE COMMITTEES
A. Terms
1. Conference Management Committee. The term of service shall be three years; two Voting Members
and two Contributors shall be appointed in each of two years and one Voting Member and one Contributor
the third year.
2. Nominating Committee. The chair is the NELAC Past Chair. Four Voting Members and five
Contributors shall be appointed annually to serve one year.
3. Membership and Outreach Committee. The term of service shall be three years. Two Voting Members
and two Contributors shall be appointed in each of two years and one Voting Member and one
Contributor shall be appointed in the third year.
B. Duties
1. Conference Management Committee. This committee recommends to the Board of Directors the places
and dates of each Annual and Interim Meeting of NELAC; and advises and assists the Executive
Secretary with the logistic details of the Interim and Annual meetings and with preparing publications for
the Annual and Interim Meetings.
2. Nominating Committee. This committee presents a slate of nominees for all elective offices at the
Annual Meeting. The names and qualifications of these nominees shall appear in the report of the
Nominating Committee and be published in the Annual Meeting announcement.
3. Membership and Outreach Committee. This committee:
a. initiates Voting Member invitations for membership in the House of Representatives and maintains
an active roster, publicizes NELAC to prospective participants, coordinates and resolves participants'
concerns and establishes criteria for and reviews the credentials of Voting Members.
b. solicits and develops informational materials to promote understanding and appreciation of the
importance of consistent standards for environmental sampling and analysis in fostering quality data
on which to base responsible public and environmental health decisions.
c. promotes a spirit of cooperation and timely dialogue among NELAC, other organizations, the private
sector and federal agencies.
SECTION 3 - STANDING COMMITTEES
A. Terms
Standing Committee participants serve staggered five year terms, one Voting Member and one Contributor
being appointed annually.
B. Duties
1. Program Policy and Structure Committee. This committee generates the Constitution and Bylaws of
NELAC, and interprets the intent and meaning of the Constitution and Bylaws, presents amendments,
proposes changes in organizational structure, and defines roles and responsibilities as appropriate, for
approval of the participants. This committee develops modifications to the scope, structure, and
requirements of the tiers and fields of testing.
2. The Accrediting Authority Committee. This committee develops the standards for use by EPA to
oversee compliance by State and federal accrediting authorities with NELAC standards. This committee
considers matters concerning reciprocity of accreditation.
3. Quality Systems Committee. This committee develops and keeps current uniform standards for quality
systems in testing operations. The elements of the quality system include organizational structure,
responsibilities, procedures, processes and resources (e.g., facilities, staff, equipment) for implementing
quality management in testing operations.
4. Proficiency Testing Committee. This committee develops standards for the proficiency testing samples,
develops criteria for selection of the providers of the samples, and develops and updates protocols for
the use of proficiency test samples and data in the accreditation of laboratories.
5. On-Site Assessment Committee. This committee generates procedures for the on-site assessments,
and publishes standard check lists based on these procedures. This committee also establishes the
frequency of inspection, and the minimum education, experience, and training requirements of the
assessors.
6. Accreditation Process Committee. This committee generates and develops procedures for the
administrative aspects of the accreditation process of environmental laboratories, for use by the
accrediting authorities, including the requirements for accreditation, procedures for changes in
accreditation status, roles and responsibilities of laboratories, and appeal processes.
7. Regulatory Coordination Committee. This committee provides the Standing Committees with current
information on regulations and laws that impact laboratory testing and accreditation. The Regulatory
Coordination Committee is also responsible for the development of model State legislation and
regulations that reflect the findings and actions of NELAC.
SECTION 4 - SPECIAL COMMITTEES, TASK FORCES AND STUDY GROUPS
Special committees, task forces, and study groups may be established by the NELAC Chair as the need
arises or as requested by NELAC. Participants shall be appointed for as long as deemed appropriate. Upon
completion of their assigned tasks, such bodies shall be dissolved by the Chair of NELAC.
SECTION 5 - SUBCOMMITTEES
Upon request of any committee, the NELAC Chair may appoint a subcommittee(s) to assist that committee
in fulfilling its responsibilities. The NELAC Chair may appoint Voting Members or Contributors in any
combination, as the need arises or NELAC requests.
ARTICLE VII - VOTING SYSTEM
All questions before a meeting of NELAC that are to be decided by a formal recorded vote of the Voting
Members are voted upon in accordance with the following voting structures and procedures.
SECTION 1 - HOUSE OF REPRESENTATIVES
A. Official Designation
This body of officials shall be known as the "House of Representatives".
B. Composition
1. Each State, Territory, Possession of the United States, and the District of Columbia, is authorized one
official to serve as its representative in the House of Representatives at the NELAC Annual Meeting. The
representative shall be named by the respective Governor or the Mayor for the District of Columbia, and
shall remain as the named representative of that State, Territory, Possession of the United States, or
the District of Columbia, until such time as the Governor or Mayor appoints someone else, or the
individual is no longer an employee of the applicable governmental organization .
2. Each of the eight EPA Assistant/Associate Administrators (Office of Air and Radiation; Office of
Enforcement and Compliance Assurance; Office of Policy, Planning and Evaluation; Office of
Prevention, Pesticides, and Toxic Substances; Office of Regional Operations and State/Local Relations;
Office of Research and Development; Office of Solid Waste and Emergency Response; and Office of
Water) and each of the ten Regional Administrators, or his or her designee, may appoint one Voting
Member.
3. Each cabinet level federal department (Department of Agriculture, Department of Commerce,
Department of Defense, Department of Energy, Department of Interior, and Department of Health and
Human Services) with environmental laboratory accreditation, certification or evaluation activities may
appoint one official to the House of Representatives as determined by the Department Secretary.
4. The Nuclear Regulatory Commission may appoint one representative to the House of Representatives.
5. At the discretion of the respective Governor or Mayor, EPA Assistant/Associate Administrator, cabinet
level federal department, or the Nuclear Regulatory Commission, an alternate to the House of
Representatives may be named to serve when the principal is unable to attend a national meeting of
NELAC. In the absence of the principal, the alternate shall be provided all of the rights and privileges
of the principal in the House of Representatives, provided that he or she has met all other requirements
for Voting Membership. If the respective Governor or Mayor, EPA Assistant/Associate Administrator,
cabinet level federal department, or the Nuclear Regulatory Commission has not appointed a
representative to the House of Representatives then the Voting Members of that State, office,
department or commission in the House of Delegates shall elect one of its Voting Members to vote in
the House of Representatives.
C. Method of Designation
Prior to the NELAC Annual Meeting, the Executive Secretary shall certify to the Board of Directors the
names of the Voting Members and their alternates in the House of Representatives.
SECTION 2 - HOUSE OF DELEGATES
A. Designation
All other environmental officials of the States, Territories, Possessions of the United States, the District of
Columbia and the federal government (those not sitting in the House of Representatives) are grouped as a
body known as the "House of Delegates".
B. Requirements
No other special requirements apply. The number of potential Voting Members is not limited.
SECTION 3 - VOTING RULES
A. Applicability
These rules apply only to the Annual Meetings of NELAC. However, only Voting Members are permitted to
vote in committee or other meetings.
B. Quorum
A quorum of the House of Representatives is required for official voting. This quorum consists of
representatives from fifty percent of the States, Territories and Possessions of the United States, and the
District of Columbia, and fifty percent of federal representatives.
No quorum is required for a vote in the House of Delegates.
C. Voting
At the conclusion of debate on a motion, there shall be a call for the vote, and the vote on the motion shall
be taken in accordance with the following method.
1. Minimum Votes
a. House of Representatives. A majority of the eligible and present participating representatives must
cast their votes in favor of an issue for the motion to be passed.
b. House of Delegates. A majority of those eligible and present delegates must cast their votes in
favor of an issue for the motion to be passed.
Note that any vote on amendments to the Constitution must be approved by a minimum of a two-thirds
vote of the Voting Members in attendance at the voting session of the Annual Meeting in both the House
of Representatives and the House of Delegates.
2. Motion Accepted
The motion is accepted if it passes in both Houses.
3. Disposition of Failed Motions
a. If the original motion fails, or if an amended motion fails, the original or amended motion is returned
to the proposing committee for further consideration.
b. The Chair may consider a new motion on the same subject prior to returning the issue to committee,
if the conditions regarding floor amendments (Article VII, Section 4 of the Bylaws) have been met.
c. The proposing committee may drop the motion or reconsider it for submission the following year.
4. Proxy Votes
Proxy votes are not permitted. Since issues and recommendations in the Committees' interim reports are
often modified and amended at the Annual Meeting, the attendance of officials at the NELAC Annual
Meeting and voting sessions is vital.
5. Method of Indicating Vote
a. Voting is by show of hands, standing vote or machine (electronic). There shall be no voice voting.
b. Voting by both Houses is simultaneous.
6. Recording
a. The Executive Secretary is responsible for the establishment of a means for recording the vote of
NELAC on any matter, as well as providing a means for the certification of eligible voters at any time
a vote is called.
b. House of Representatives. The votes of the Representatives are recorded and published on a state-
by-state or agency-by-agency basis.
c. House of Delegates. The votes of the Delegates are recorded as the total number of votes, and are
not tabulated on a state-by-state or agency-by-agency basis.
D. Committee Report Voting
The specific recommendations from each committee report shall be subject to the approval of the Voting
Membership at the Annual Meeting as expressed by a vote on each individual recommendation.
Alternatives that may be used in voting on the reports are to vote on the entire report, to vote on grouped
items or sections or to vote on individual items. A Voting Member with the support of 10 other Voting
Members may request that the vote be on individual items.
SECTION 4 - FLOOR AMENDMENTS
A. Procedure
1. A Voting Member can offer an amendment from the floor to the motion under consideration.
2. A two-thirds majority favorable vote of each House on the amendment is required for passage.
B. Editorial Changes
Following completion of voting on a Committee's report, the Committee Chair may make a motion to extend
editorial privileges to the Executive Secretary to make editorial changes in the final report.
SECTION 5 - SEATING
A. Arrangement
The seating arrangement for voting sessions is shown in Figure 1.
B. Supervision
The Board of Directors shall control placement and movement of delegates. The Executive Secretary shall
count votes.
SECTION 6 - PROCEDURES
The NELAC officers and committees are to observe the principles of due process; specifically, to give
reasonable advance notice of contemplated committee studies, items to be considered for committee action,
and tentative or definite recommendations for NELAC action, and to provide that all interested parties have
an opportunity to be heard by committees and by NELAC.
SECTION 7 - CHANGES IN ORGANIZATION AND PROCEDURE
Proposals for changes in organization or procedure of NELAC are not acted upon until the Annual Meeting
of NELAC following the Annual Meeting at which such proposals are made.
Figure 1. Seating Arrangement
[Diagram of the seating arrangement for voting sessions, oriented toward the front of the room; graphic not reproduced in this text version.]
PROGRAM POLICY
AND STRUCTURE
July 2, 1998
NELAC
Program Policy and Structure
Revision 8
July 2, 1998
TABLE OF CONTENTS
PROGRAM POLICY AND STRUCTURE
1.0 PROGRAM POLICY AND STRUCTURE
1.1 INTRODUCTION
1.1.1 Overview of NELAC
1.1.2 History
1.1.3 Summary of the NELAC Standards
1.1.4 General Application of NELAC Standards
1.1.5 Application of NELAC Standards to Small Laboratory Operations
1.2 OBJECTIVES
1.3 ELEMENTS
1.4 PURPOSE AND SCOPE OF NELAC
1.4.1 Purpose
1.4.2 Scope
1.5 ROLES AND RESPONSIBILITIES OF THE FEDERAL GOVERNMENT, THE STATES, AND OTHER PARTIES
1.5.1 EPA
1.5.1.1 National Environmental Laboratory Accreditation Program
1.5.2 States and Federal Agencies as Accrediting Authorities
1.5.2.1 Federal Agencies
1.5.2.2 States
1.5.2.3 Accrediting Authorities
1.5.2.3.1 Responsibilities of Primary Accrediting Authorities
1.5.2.3.2 Responsibilities of Secondary Accrediting Authorities
1.5.2.3.3 Accreditation Fees
1.5.3 Reciprocity
1.5.4 Joint Federal and State Roles
1.5.5 Assessor Bodies
1.5.6 Other Parties
1.6 STRUCTURE OF NELAC
1.6.1 The Board of Directors
1.6.2 The Environmental Laboratory Advisory Board
1.6.3 The Accrediting Authority Review Board
1.6.4 The Participants
1.6.4.1 Participation of the Voting Members and Contributors
1.6.5 The Committees
1.6.5.1 The Standing Committees
1.6.5.1.1 Program Policy and Structure Committee
1.6.5.1.2 Accrediting Authority Committee
1.6.5.1.3 Quality Systems Committee
1.6.5.1.4 Proficiency Testing Committee
1.6.5.1.5 On-Site Assessment Committee
1.6.5.1.6 Accreditation Process Committee
1.6.5.1.7 Regulatory Coordination Committee
1.6.5.2 The Administrative Committees
1.6.5.2.1 Conference Management Committee
1.6.5.2.2 Nominating Committee
1.6.5.2.3 Membership and Outreach Committee
1.7 CONDUCT OF CONFERENCE BUSINESS
1.7.1 The Generation of Standards
1.7.2 Meetings
1.7.2.1 Annual Meeting
1.7.2.2 Interim Meeting
1.7.2.3 Special Meetings
1.7.2.4 Committee Meetings
1.8 ORGANIZATION OF THE ACCREDITATION REQUIREMENTS
1.8.1 Scope of Accreditation
1.8.2 Supplemental Accreditation Requirements
1.8.3 General Laboratory Requirements
1.8.4 General Field Sampling Requirements
1.8.5 Chemistry Requirements
1.8.6 Whole Effluent Toxicity Requirements
1.8.7 Microbiology Requirements
1.8.8 Radiochemistry Requirements
1.8.9 Microscopy Requirements
1.8.10 Field Measurement Requirements
Figure 1-1. NELAC Structure
Figure 1-2. Flowchart for Standards Development and Implementation
Figure 1-3. NELAC Tiered Scope of Accreditation
1.0 PROGRAM POLICY AND STRUCTURE
Chapter One provides an overview of the history, purpose and objectives of the National
Environmental Laboratory Accreditation Conference (NELAC). The organizational structure and
function of NELAC, and the roles of the various participants, form the major portion of this chapter.
In addition, the Constitution and Bylaws, and the content of the five chapters which follow are briefly
described. Together, these six chapters and related appendices constitute the NELAC standards.
1.1 INTRODUCTION
1.1.1 Overview of NELAC
This association shall be known as the "National Environmental Laboratory Accreditation
Conference" (NELAC) and is sponsored by the United States Environmental Protection Agency
(EPA) as a voluntary association of state and federal officials. The purpose of the organization is
to foster the generation of environmental laboratory data of known and documented quality in a cost-
effective manner through the development of nationally accepted standards for environmental
laboratory accreditation. NELAC encompasses all fields of testing associated with compliance with
EPA regulations. The program will be administered by state and federal accrediting authorities in
a uniform, consistent fashion nationwide.
1.1.2 History
NELAC is the result of a joint effort by EPA, other federal agencies, the states, and the private sector
that began in 1990 when EPA's Environmental Monitoring Management Council (EMMC) established
an internal work group to consider the feasibility and advisability of a national environmental
laboratory accreditation program. The work group concluded that EPA should consult with
representatives of all stakeholders, by establishing a federal advisory committee. As a result, the
Committee on National Accreditation of Environmental Laboratories (CNAEL) was chartered in 1991
under the Federal Advisory Committee Act. In its final report to EMMC, CNAEL recommended that
a national program for environmental laboratory accreditation be established. In response to the
CNAEL recommendations, EPA and state representatives formed the State/EPA Focus Group that
developed a proposed framework for NELAC, modeled after the National Conference on Weights
and Measures. The Focus Group prepared a draft Constitution, Bylaws and standards, which were
published in the Federal Register in December 1994. NELAC was established on February 16, 1995
by state and federal officials with the adoption of an interim Constitution and Bylaws.
NELAC was established as a standards-setting body, only, to support a National Environmental
Laboratory Accreditation Program (NELAP). The goal of NELAP is to foster cooperation among the
current accreditation activities of different states or other governmental agencies. NELAP seeks to
unify the existing state and federal agency standards, at minimum cost to the states, federal
agencies and accredited laboratories.
1.1.3 Summary of the NELAC Standards
The NELAC uniform standards are contained in this chapter and the following five chapters and
related appendices.
Chapter 2 contains the criteria for the proficiency testing (PT) program. Laboratory participation in
PT programs fulfills one part of the quality assessment requirements of NELAC. The PT programs
in which a laboratory must participate to become accredited are defined as well as the criteria for
samples, PT providers, and acceptance limits.
Chapter 3 describes the essential elements that are to be included in an on-site assessment and the
requirements for an accrediting authority conducting on-site assessments. The qualifications and
requirements for assessors are described as well as the program elements to ensure uniform and
consistent implementation of the NELAC standards.
Chapter 4 describes the accreditation process the laboratory must follow to be recognized as a
NELAC laboratory. The chapter defines the period of accreditation, and the process for maintaining,
awarding and revoking accreditation.
Chapter 5 and the related appendices contain the elements of the laboratory quality system. The
section provides detail concerning quality assurance/quality control requirements so that all
accrediting authorities will evaluate laboratories consistently and uniformly.
Chapter 6 defines the process and operating requirements established by NELAC for an accrediting
authority to become nationally recognized. It provides the policies and criteria that an accrediting
authority must meet to apply for and maintain recognition.
The Glossary contains the definition of terms which are used throughout the standards to assure the
consistency of their use and interpretation.
1.1.4 General Application of NELAC Standards
These standards are for use by accrediting authorities and others concerned with the competence
of environmental laboratories and other organizations directly involved and interested in the
standardization of environmental measurements. Note that any reference to NELAP approval or
NELAC accreditation means that the accrediting authority or laboratory meets the requirements in
the NELAC standards, and is not an endorsement by EPA.
As described in more detail in Chapter 4, an accredited organization may use the NELAC logo on
general literature. It is the ethical responsibility of an accredited organization to describe its
accredited status in a manner that does not imply accreditation in areas that are outside its actual
Scope of Accreditation. When soliciting business or reporting test results, an accredited organization
must distinguish between those tests that fall within its scope of accreditation and those that do not.
1.1.5 Application of NELAC Standards to Small Laboratory Operations
All laboratory operations subject to NELAC standards are expected to generate data of known and
documented quality and maintain the quality systems required to generate quality data. However,
NELAP recognizes that some laboratory operations have unique characteristics that
differentiate them from other operations. The NELAC standards have addressed these issues by
allowing some flexibility in meeting the requirements for personnel (Section 5.4.2, Section 5.6) and
their credentials (Section 4.1.1).
1.2 OBJECTIVES
The objectives of NELAC, as specified in Article II of the Constitution, are: to provide a national
forum for the discussion of all questions related to standards for environmental laboratory
accreditation; to provide a mechanism to establish policy and coordinate activities within NELAC;
to develop a consensus on uniform standards for laboratory accreditation, and encourage and
promote uniform standards of quality for assessment and accreditation; and to foster cooperation
among environmental laboratory accrediting authorities and regulatory officials.
1.3 ELEMENTS
Functional elements of the objectives are:
a) To develop and improve the standards for qualifying as an accredited laboratory, for qualifying
as an accrediting authority, and for uniformly implementing the national accreditation program.
The standards address the accreditation process; on-site laboratory assessments to review the
quality systems; assessor training; proficiency testing; and oversight of accrediting authorities
for uniform interpretation of the standards.
b) To designate the States, Territories and Possessions of the United States (hereinafter referred
to as States) and federal agencies as the accrediting authorities. These authorities may be the
assessor bodies, or may use third parties as assessor bodies to carry out in part or in whole the
assessment functions. As accrediting authorities, the States and the federal agencies shall grant
accreditation and ensure compliance with NELAC laboratory standards and criteria.
c) To provide for reciprocity among the States and the federal agencies by assuring the consistent
application of the national standards. Oversight by NELAP assures uniformity among the
various accrediting authorities. The Accrediting Authority Review Board (AARB) provides a
balanced review of the program.
d) To develop model language for legislation and regulations which can be adopted by the State
legislatures and accrediting authorities.
e) To incorporate, to the extent applicable, ISO 25, ISO 43, and ISO 58.¹
1.4 PURPOSE AND SCOPE OF NELAC
1.4.1 Purpose
NELAC shall be a standards-setting body. NELAC shall, through the process described in the
Constitution and Bylaws, develop, adopt and publish uniform consensus performance standards on
which the national accreditation program shall be based. These standards will be adopted by
NELAC at its annual meeting. These uniform standards shall include, but are not limited to, quality
systems, proficiency testing, audit programs, and other key elements as established by the standing
committees of NELAC. It is not the purpose of NELAC to function as an assessor body, oversee or
approve assessor bodies, or administer any of the main elements of the accreditation program, other
than the development and adoption of standards.
1.4.2 Scope
The scope of NELAC shall encompass the necessary scientific testing to serve the needs of the
States, United States Environmental Protection Agency (EPA), and other federal agencies involved
in the generation and use of environmental data, where such generation or use is mandated by EPA
statutes and pursuant regulations. Laboratories are encouraged to use the NELAC standards for all
other tests.
¹ A review by the Environmental Laboratory Advisory Board (ELAB), a federal advisory
committee, is currently underway on whether to include within NELAC laboratories complying with
Good Laboratory Practices (GLP). GLPs are mandated by EPA under the Federal Insecticide,
Fungicide, and Rodenticide Act (FIFRA) and the Toxic Substances Control Act (TSCA). If GLP
laboratories are included in NELAC, the EPA GLP programs and the Organisation for Economic
Co-operation and Development (OECD) GLP Principle Technical Standards will be incorporated, to
the extent applicable.
Applicable EPA statutes include the Clean Air Act (CAA); the Comprehensive Environmental
Response Compensation and Liability Act (CERCLA); the Federal Insecticide, Fungicide and
Rodenticide Act (FIFRA); the Federal Water Pollution Control Act (Clean Water Act; CWA); the
Resource Conservation and Recovery Act (RCRA); the Safe Drinking Water Act (SDWA); and the
Toxic Substances Control Act (TSCA). The standards shall also include provisions to permit special
requirements or fields of testing promulgated by any of the accrediting authorities.
The standards shall not be implemented or administered in a way which limits the ability of local,
state or federal agencies to investigate and prosecute enforcement cases. Specifically, when
engaged in the collection and analysis of forensic evidence to support litigation, those agencies may
use any procedure that is appropriate given the nature of the investigation, subject only to the
bounds of sound scientific practice. The standards shall not apply to governmental laboratories
engaged solely in the analysis of forensic evidence.
1.5 ROLES AND RESPONSIBILITIES OF THE FEDERAL GOVERNMENT, THE STATES, AND
OTHER PARTIES
1.5.1 EPA
EPA shall provide staff support to NELAC as provided for in the Bylaws and agreed to by EPA. EPA
shall assist NELAC by publishing all proposed and final standards.
EPA also participates in joint activities with other federal and State agencies, as described below.
1.5.1.1 National Environmental Laboratory Accreditation Program
EPA shall establish and administer the National Environmental Laboratory Accreditation Program
(NELAP), and shall staff an office to oversee the implementation of NELAC standards. The purpose
of this oversight is to ensure a high degree of standardization and coordination among the different
accrediting authorities.
NELAP performs the following functions in support of NELAC:
a) evaluating and approving the implementation of NELAC standards by accrediting authorities;
b) establishing and maintaining a national database on environmental laboratories which contains
information on the status of accrediting authorities, current status of NELAC accredited
laboratories, and status of providers of proficiency test samples;
c) where conflict of interest may occur in an accrediting authority, accrediting that authority's
principal laboratory if requested. See Chapter 6, section 6.2.2 d) and e);
d) accrediting EPA laboratories;
e) reporting to NELAC on the evaluation of the conformance of State and federal accreditation
program activities to NELAC standards;
f) reporting to NELAC on results of evaluations of proficiency testing sample providers and
assessor training programs; and
g) approving supplemental accreditation requirements proposed by accrediting authorities (see
Section 1.8.2).
1.5.2 States and Federal Agencies as Accrediting Authorities
In order to be considered a NELAP approved accrediting authority, the individual State or federal
program must adopt the NELAC standards, utilize assessors trained according to the requirements
of NELAC, and be evaluated by the EPA oversight office as being an agency whose accreditation
and assessment program meet all of the requirements of NELAC. Failure in any one of these areas
would preclude a State or federal program from being recognized by NELAP.
1.5.2.1 Federal Agencies
To operate as accrediting authorities, or to obtain NELAC accreditation for their environmental
monitoring laboratories, federal agencies shall conform to the NELAC standards.
1.5.2.2 States
The authority of the States to adopt the NELAC standards is manifest in the authority granted to their
administrative agencies by State legislatures. State governments shall be the principal accrediting
authorities.
1.5.2.3 Accrediting Authorities
An accrediting authority can be either a) any federal department/agency with responsibility for
operating mandated environmental monitoring programs which require laboratory testing, or b) any
State which requires laboratory testing in conformance with at least one of the EPA programs listed
within the scope of NELAC (see Section 1.4.2). If a State chooses not to adopt the NELAC
standards, laboratories in that State may obtain accreditation from any other accrediting authority.
A primary accrediting authority is one which ensures directly that the laboratory is in conformance
with the NELAC standards. A secondary accrediting authority is one which, through reciprocity,
recognizes the accreditation of a primary accrediting authority.
1.5.2.3.1 Responsibilities of Primary Accrediting Authorities
Once a State or federal department/agency has been approved by NELAP as being an entity whose
accreditation and assessment program meets all of the requirements of NELAC, it will be a primary
accrediting authority, and it will have full responsibility for:
a) using the NELAC standards as the basis for assessing the qualifications of laboratories applying
for initial or continuing NELAC accreditation;
b) ensuring conformance by the laboratories it accredits with the national standards established by
NELAC;
c) granting interim and/or full accreditation to applicant laboratory organizations through the
review and approval of applications, performance of on-site assessments, evaluation of results
on proficiency testing samples, and enforcement of all applicable laws and rules relating to
accreditation; and
d) submitting the names and appropriate accreditation material to EPA for inclusion in the national
laboratory database.
Federal laboratories within a State may be accredited by the State accrediting authority or by a
federal accrediting authority. A State accrediting authority is the primary accrediting authority for
all non-federal NELAP accredited laboratories in that State. However, if the State accrediting
authority does not grant NELAP accreditation for testing in conformance with a particular field of
testing (see section 1.8), laboratories may obtain primary accreditation for that particular field of
testing from any other accrediting authority.
In addition, a primary accrediting authority may delegate assessment activities to a third party
(assessor body). If any of these assessment activities are delegated to a third party, the accrediting
authority maintains responsibility for ensuring compliance with the standards established by NELAC.
1.5.2.3.2 Responsibilities of Secondary Accrediting Authorities
A secondary accrediting authority must be approved by NELAP as being an entity whose
accreditation and assessment program meets all of the requirements of NELAC for a secondary
accrediting authority.
A secondary accrediting authority may require laboratories to submit an application, may issue
certificates of accreditation, and will exercise its legal authority for enforcement of all applicable laws
and rules. However, it must recognize the laboratory accreditations through reciprocity, and must
not replicate any of the assessment functions, of a primary accrediting authority.
1.5.2.3.3 Accreditation Fees
Accrediting authorities may adopt and impose laboratory accreditation fees.
1.5.3 Reciprocity
Reciprocity means that an accrediting authority will recognize and accept the accreditation status
of a laboratory issued by another NELAP accrediting authority. This principle of reciprocity is an
element of the national accreditation standard to which all accrediting authorities are held. In
recognizing the accreditation status of a laboratory through reciprocity, the accrediting authority
assumes the responsibilities of a secondary accrediting authority as stated in Section 1.5.2.3.2.
A state, in the role of a secondary accrediting authority, which has a law or decision resulting from
a legal action, the legal effect of which precludes that state from granting any accreditation to a
particular laboratory, is not required to accept the accreditation of this laboratory.
Reciprocity among the environmental laboratory accreditation authorities is necessary to the success
of a national program. The essential ingredient of reciprocity is uniformity from one accrediting
authority to another. The mechanisms to assure this uniformity (e.g., uniform national performance
standards, thorough and consistent inspections, and comparable decisions on accreditation status
when deficiencies are uncovered) are necessary to ensure that reciprocity is equitable.
1.5.4 Joint Federal and State Roles
NELAC shall be the joint responsibility of EPA, the States, and the other federal agencies. As
provided in the following section on the structure of NELAC and in the NELAC Bylaws, EPA, the
States, and the other federal agencies share responsibilities of governance, analysis and
establishment of policy and NELAC technical standards.
1.5.5 Assessor Bodies
An assessor body, operating under written agreement with an accrediting authority, may perform
specified functions of the assessment process. These functions may include: the review of the
laboratories' documentation regarding facilities, personnel, use of approved methods, and quality
assurance procedures; and conduct of on-site assessments, including review of performance in the
analysis of proficiency test samples. The assessor body reports to the accrediting authority under
which it is operating. The assessor body will provide full documentation to the accrediting authority.
Only the accrediting authority may determine if a laboratory has met the NELAC standards, may
issue certificates of accreditation, may make any decisions on the granting and withdrawal of a
laboratory's accreditation status, and may take responsibility for the accreditation process.
1.5.6 Other Parties
All other interested parties including, but not limited to, the laboratory industry, clients of the
laboratory industry, environmental or other public interest groups, private industry, third party
assessors, and the general public, may participate in NELAC. In this role, these other parties may
bring technical and policy issues to the attention of NELAC, its Board of Directors, or its committees
and subcommittees. It is anticipated that these issues shall be brought to NELAC in the form of
reports, presentations, discussion material, or other forms of documentation for presentation at the
NELAC annual, interim, or committee/subcommittee meetings.
1.6 STRUCTURE OF NELAC
The structure of NELAC is shown in Figure 1-1. NELAC is composed of a Board of Directors, a
House of Representatives, a House of Delegates, Contributors, and a number of committees. There
are nine elected officials of NELAC: the Chair; the Chair-Elect; the immediate Past Chair; and six
members at large. The Standing Committees and Administrative Committees are appointed by the
Chair. The activities of the Standing and Administrative Committees are overseen by the Board of
Directors.
NELAC will meet twice a year: an annual meeting at which final action is taken on all issues, and
an interim meeting about six months prior to the annual meeting at which time committees meet to
receive, consider and deliberate on issues, propose and draft standards or policies for adoption at
the annual meeting.
NELAC shall also consider advice and comment provided by the Environmental Laboratory Advisory
Board (ELAB) chartered under the Federal Advisory Committee Act and the Accrediting Authority
Review Board (AARB).
1.6.1 The Board of Directors
The Board of Directors consists of the NELAC Chair, the Chair-Elect, immediate Past Chair, six
members elected at large from the active membership (to serve 3-year staggered terms), a NELAC
Director, and an Executive Secretary. The NELAP Director is the ex officio Director of NELAC. The
Executive Secretary is an EPA employee.
The Board of Directors serves as a policy and coordinating body in matters of national and
international significance and makes interim policy decisions when necessary between annual
meetings. The Board of Directors has the overall responsibility and authority for the supervisory,
administrative and procedural duties associated with NELAC. The Board of Directors will charge
the committees with issues they must address or take under consideration. Comments on the
standards should be directed to the committees through their respective chairs.
1.6.2 The Environmental Laboratory Advisory Board
The Environmental Laboratory Advisory Board (ELAB), chartered under the Federal Advisory
Committee Act, consists of members appointed by EPA, drawn from a balance of non-State,
non-federal representatives of the environmental laboratory community, and is co-chaired by an
ELAB member and an EPA representative. The ELAB advises EPA and NELAC on matters
affecting the interests of the regulated laboratories and other interested parties. The
recommendations of the ELAB shall be presented to the Chairs of the standing committees, the
Board of Directors and to the EPA.
1.6.3 The Accrediting Authority Review Board
The Accrediting Authority Review Board (AARB) is composed of five representatives from EPA,
other federal agencies, and the States. The AARB shall include one member from EPA and at least
two members from the States. The AARB annually selects one of its members to serve as its chair.
All members are appointed by the NELAC Director following consultation with the Board of Directors.
Each member shall serve five years with one member appointed annually. The AARB has the
responsibility to monitor EPA to assure that EPA is following the NELAC standards for approving
the accrediting authorities, and to serve as an appeal board for accrediting authorities that have
been denied NELAP recognition or have had such recognition revoked (see Chapters). In all cases,
the final decision remains with the NELAP Director. The AARB will report on its activities to the
Board of Directors at each annual meeting.
1.6.4 The Participants
The participants consist of two groups, i.e., Voting Members and Contributors.
Membership is limited to officials who are in the employ of the Government of the United States and
the States, and who are actively engaged in environmental programs or accreditation of
environmental laboratories. State and federal participants who are compensated by the private sector
to inspect environmental laboratories or to serve as consultants are considered to have a conflict of interest
and are ineligible for Voting Membership but may participate as Contributors. The Voting Member
may vote and is eligible to serve on all committees and the Board of Directors. At the annual
meeting the Voting Members are divided into a House of Representatives and a House of Delegates.
The House of Representatives is composed of one officially designated representative from each
State, one representative from each of eight EPA Assistant/Associate Administrators, and one
representative from each EPA Region. Each other cabinet level federal department or independent
agency (as defined in the Constitution) with environmental laboratory accreditation, certification or
evaluation activities may appoint one official to the House of Representatives.
The House of Delegates is composed of all other State and federal environmental officials. The size
of the House of Delegates is not limited.
Contributors are all other interested parties and groups. They include, but are not limited to,
laboratory personnel, industry representatives, environmental groups, the general public, laboratory
associations, industry associations, accreditation associations and retired Voting Members. The
Contributors may not vote, but can make presentations, comments or input at all stages of the
standards and procedures making process, and do have the ability to enter the substantive debate
on the floor of the meeting as it occurs. Contributors are eligible to serve as non-voting participants
on all committees.
1.6.4.1 Participation of the Voting Members and Contributors
Contributors, as well as Voting Members, have the right to appear before the standing committees
as they consider proposed standards and procedures related to the national accreditation program
and to debate the substantive issues before NELAC as such discussion occurs during the meeting.
Appearance before the committees will be in accordance with procedures approved by the Board
of Directors and Voting Membership.
1.6.5 The Committees
Two types of committee are associated with NELAC: Standing Committees and Administrative
Committees. Each committee has five Voting Members, including the chair, and five non-voting
Contributors. Except for the Nominating Committee, the Voting Members of each committee
annually select a chair from among themselves. All committees report to NELAC through the
Board of Directors. Following each annual meeting, the Board of Directors will make available an
updated roster of the Board of Directors, NELAC officers and committee participants and chairs.
New Standing Committees:
The Board of Directors shall establish a new standing committee if the following conditions exist:
a) an ad hoc group appointed by a NELAC Chair has been studying an issue which is likely to require continuing attention by NELAC;
b) the ad hoc group has reached a consensus and is ready to develop standards;
c) once the standards are implemented, they are likely to need evaluation and revision in the future;
d) no NELAC committee exists to deal with the issue;
e) the topic is of broad scope and has impact on a significant portion of the laboratory community;
f) the Program Policy and Structure Committee has reviewed the proposal and has recommended that the new standing committee be created; and
g) the NELAC Voting Members have approved the creation of the committee.
1.6.5.1 The Standing Committees
The participants of each committee serve for five years, with one Voting Member and one
Contributor being appointed each year. There are seven Standing Committees:
Program Policy and Structure Committee
Accrediting Authority Committee
Quality Systems Committee
Proficiency Testing Committee
On-site Assessment Committee
Accreditation Process Committee
Regulatory Coordination Committee
The Standing Committees shall receive input regarding standards and test procedures, then process
this input into resolutions which shall be put before the Voting Membership at the annual meeting.
These resolutions will be made available not less than 45 calendar days prior to the annual meeting.
All resolutions shall be presented to the Voting Membership at the annual meeting for discussion and
ballot. The committees may also receive input via comments and presentations at the interim and
annual meetings. The committees shall draft resolutions which shall be made available not later
than 30 calendar days prior to either the interim or annual meetings. The committees shall prepare
and arrange agenda items for interim meetings and annual meetings to be made available 30
calendar days prior to the meeting.
1.6.5.1.1 Program Policy and Structure Committee
This committee generates the Constitution and Bylaws of NELAC; interprets the intent and
meaning of the Constitution and Bylaws; presents amendments; proposes changes in organizational
structure; and defines roles and responsibilities as appropriate, for approval by the Voting
Membership. This committee also develops modifications to the scope, structure, and requirements
of the tiers and fields of testing.
1.6.5.1.2 Accrediting Authority Committee
This committee develops the standards for use by EPA to oversee compliance by State and federal
accrediting authorities with NELAC standards. This committee considers matters concerning
implementation of reciprocity among accrediting authorities.
1.6.5.1.3 Quality Systems Committee
This committee develops and keeps current uniform standards for quality systems in testing
operations. The elements of the quality system include organizational structure, responsibilities,
procedures, processes and resources (e.g., facilities, staff, equipment) for implementing quality
management in testing operations.
1.6.5.1.4 Proficiency Testing Committee
This committee develops standards for the proficiency testing samples, develops criteria for
selection of the providers of the samples, and develops and updates protocols for the use of
proficiency test samples and data in the accreditation of laboratories.
1.6.5.1.5 On-Site Assessment Committee
This committee generates procedures for the on-site assessments, and publishes standard check
lists based on these procedures. This committee also establishes the frequency of inspection, and
the minimum education, experience, and training requirements of the assessors.
1.6.5.1.6 Accreditation Process Committee
This committee generates and develops procedures for the administrative aspects of the
accreditation process of environmental laboratories, for use by the accrediting authorities, including
the requirements for accreditation, procedures for changes in accreditation status, roles and
responsibilities of laboratories, and appeal processes.
1.6.5.1.7 Regulatory Coordination Committee
This committee provides the Standing Committees with current information on regulations and laws
that impact laboratory testing and accreditation. The Regulatory Coordination Committee is also
responsible for the development of model language for state legislation and regulations that reflect
the findings and actions of NELAC.
1.6.5.2 The Administrative Committees
Administrative Committees have varying terms; their duties are outlined below. Except for the
Nominating Committee (see below), the term of service shall be three years; two Voting Members
and two Contributors will be appointed in each of two years and one Voting Member and one
Contributor in the third year.
1.6.5.2.1 Conference Management Committee
This committee recommends to the Board of Directors the places and dates of each annual and
interim meeting of NELAC; and advises and assists the Executive Secretary with the logistic details
of the interim and annual meetings and with preparing publications for the annual and interim
meetings.
1.6.5.2.2 Nominating Committee
The chair is the NELAC Past Chair. Four Voting Members and five Contributors shall be appointed
annually to serve one year. This committee presents nominees for all elective offices at the annual
meeting. The names of these nominees shall appear in the report of the Nominating Committee and
be published in the meeting announcement.
1.6.5.2.3 Membership and Outreach Committee
This committee initiates membership invitations and maintains an active roster, publicizes NELAC
to prospective participants, coordinates and resolves participants' concerns, and establishes criteria
and verifies the credentials of Voting Members.
This committee solicits and develops informational materials to promote understanding and
appreciation of the importance of consistent standards for environmental sampling and analysis in
fostering quality data on which to base responsible public and environmental health decisions.
This committee promotes a spirit of cooperation and timely dialogue between NELAC and other
organizations and federal agencies.
1.7 CONDUCT OF CONFERENCE BUSINESS
1.7.1 The Generation of Standards
The process for the generation and adoption of standards by a State accrediting authority is shown
in Figure 1-2. The standards for the accreditation of laboratories begin with recommendations made
within or to the committees. Committees shall propose standards in the form of resolutions on which
the Voting Membership shall vote. Standards proposed by the committees are publicized on the
NELAC electronic bulletin board by EPA not later than 45 calendar days prior to the date of the
meeting at which they will be considered.
Proposed amendments from the floor to specific standards and proposals offered by the committee
for adoption by NELAC shall be allowed in the manner described in the Constitution and Bylaws.
Amendments to the report describing committee activities over the year will not be allowed without
the concurrence of the chair of the subject committee and the concurrence of the Chair of NELAC.
1.7.2 Meetings
1.7.2.1 Annual Meeting
An annual meeting of NELAC shall be held to conduct business including, but not limited to, election
of officers, consideration of issues for presentation to the membership for voting, receiving reports
from committees, task groups, or other sources, and conducting other business of NELAC. All final
action on resolutions or proposals shall take place at the annual meeting.
The Board of Directors shall determine the place and dates for the annual meeting, after receiving
recommendations from the Conference Management Committee, and shall publish this information
on the NELAC electronic bulletin board at least 90 calendar days prior to the annual meeting.
A completed registration for the annual meeting shall serve as the application for participation as
a Voting Member or Contributor. The registration form must be completed by all potential participants,
whether or not attending the annual meeting. Prior to the annual meeting, the Executive Secretary
shall certify the names of the Voting Members and their alternates of the House of Representatives
to the Board of Directors. The Nominating Committee shall present, to the Board of Directors,
nominees for all elective offices for the annual meeting. The names and qualifications of the
nominees shall be published in the annual meeting announcement.
The following deadlines will apply in preparing and submitting material for the annual meeting:
a) Sixty calendar days prior to the date of the annual meeting, each of the standing committees
shall present to the Board of Directors a summary of the issues and matters considered by the
committees over the course of the year. This report shall discuss all matters which the
committee considered since its last report, including how the committee disposed of the issues
it considered. The report shall also contain draft standards for consideration by NELAC.
b) Committees shall prepare and arrange agenda items and resolutions for the annual meeting.
These, and other resolutions received by the Board of Directors will be made available not less
than 45 calendar days prior to the meeting.
c) Standards proposed by the committees for consideration at the annual meeting shall be
publicized on the electronic bulletin board not less than 45 calendar days prior to the annual
meeting.
As soon as possible, but no later than 90 calendar days after the annual meeting, the Board of
Directors shall make available an updated roster of the Board of Directors, NELAC officers,
committee members and chairs, and minutes and findings of the meeting to the participants. EPA
shall publish the revised standards as soon as possible, but no later than 90 calendar days after
the annual meeting. Changes in organization and/or procedures of NELAC proposed at the annual
meeting shall not be acted upon until the annual meeting following the one at which they were
proposed.
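Purely as an editorial illustration (not part of the standard), the sketch below lays out these lead times relative to a hypothetical annual meeting date, using only Python's standard library; the meeting date and the label strings are assumptions, not values taken from the standard.

```python
from datetime import date, timedelta

def annual_meeting_deadlines(meeting_date: date) -> dict:
    """Calendar deadlines keyed to the annual meeting date (illustrative only)."""
    return {
        "meeting place/dates published": meeting_date - timedelta(days=90),
        "committee summaries to Board of Directors": meeting_date - timedelta(days=60),
        "agenda items and resolutions made available": meeting_date - timedelta(days=45),
        "proposed standards on electronic bulletin board": meeting_date - timedelta(days=45),
        "updated roster, minutes, and revised standards": meeting_date + timedelta(days=90),
    }

# Example with a hypothetical meeting date:
for item, deadline in annual_meeting_deadlines(date(1999, 6, 14)).items():
    print(f"{deadline.isoformat()}  {item}")
```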
1.7.2.2 Interim Meeting
The interim meeting, at which committees meet to receive, consider and debate issues, and
propose and draft standards or policies for the annual meeting, shall be scheduled at least six
months prior to the annual meeting.
The Board of Directors shall determine the place and dates for the interim meeting, after receiving
recommendations from the Conference Management Committee, and shall publish this information
on the NELAC electronic bulletin board at least 90 calendar days prior to the interim meeting.
Committees shall prepare and arrange agenda items for the interim meeting. The agenda shall be
approved by the Board of Directors and will be made available not less than 30 calendar days prior
to the date of the meeting.
Conclusions and findings of the interim meeting shall be provided to the participants not later than
90 calendar days following the interim meeting.
1.7.2.3 Special Meetings
The NELAC Chair is authorized to call a meeting of the Board of Directors at any time deemed
necessary by the Chair to be in the best interests of NELAC. Announcements of the meetings and
meeting summaries or reports shall be made available to the participants.
1.7.2.4 Committee Meetings
Committees of NELAC are authorized to hold meetings at times other than the annual or interim
meeting. Announcements of the meetings and meeting summaries or reports shall be made
available to the participants.
1.8 ORGANIZATION OF THE ACCREDITATION REQUIREMENTS
1.8.1 Scope of Accreditation
Laboratories must meet all relevant EPA program requirements, including quality assurance/quality
control, use of specified methods, and other criteria.
The accreditation requirements shall be based on the tiered approach shown in Figure 1-3.
Laboratories must meet the general requirements found in Chapter 5, and the specific quality control
requirements for the type of testing being performed, as found in Appendix D of Chapter 5.
Accreditation will then be granted for compliance with the relevant EPA program, for the methods used
by the laboratory, and for individual analytes determined by a particular method; e.g., a laboratory
determining lead in drinking water, in compliance with the Safe Drinking Water Act, by both
inductively coupled plasma mass spectrometry and graphite furnace atomic absorption spectrometry
would be accredited for lead by both methods. Loss of accreditation for an analyte would not
automatically result in loss of accreditation for all other analytes accredited under the method,
provided the laboratory remained proficient in the determination of the other analytes.
The following example shows the tiered approach applied to a laboratory seeking accreditation in
hazardous waste organic testing under the auspices of RCRA. The laboratory must meet all the
requirements listed in general laboratory (NELAC Chapter 5), chemistry (NELAC Chapter 5,
Appendix D.1), the RCRA regulations (40 CFR 261), and the method(s) used (e.g., SW-846
5030/8240). In all cases, a NELAC-accredited laboratory must be accredited for the specific method
it uses. In some cases the regulations mandate the method to be used (e.g., 40 CFR 261 specifies
SW-846 Method 1311, TCLP). In other cases the regulations provide guidance for the methods
which can be used (e.g., 40 CFR 264, Appendix IX, suggests applicable methods). Finally, in some
situations the regulations provide no guidance as to the methods to be used (e.g., 40 CFR 268 lists
analytes required to be measured, with no guidance on methods). In those cases where the test
method is not mandated by regulation, the laboratory must be accredited for the specific method
used, as documented in the laboratory's SOP (see Chapter 5). This method must meet the relevant
start-up, calibration, and on-going validation and QC requirements specified in Chapter 5. The
tiered approach allows for the incorporation of performance based measurement systems (PBMS)
by substituting PBMS for the specified analytical methods when allowed under EPA regulations.
The tiered approach eliminates redundancy by allowing for the incorporation of new methods or new
instrumentation without the laboratories repeatedly demonstrating the basic requirements. This
structure defines the scope of accreditation for inclusion on the laboratory accreditation certificate.
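The tiered scope described above can be pictured as a set of program/method/analyte combinations held by a laboratory. The following minimal sketch is an editorial illustration only; the class and method names (ScopeItem, withdraw) and the example programs and methods are hypothetical. It shows how withdrawing accreditation for one analyte under one method leaves the other accredited combinations untouched.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class ScopeItem:
    """One accredited combination of EPA program, method, and analyte (hypothetical model)."""
    program: str   # e.g., "SDWA", "RCRA"
    method: str    # e.g., an analytical method identifier
    analyte: str   # e.g., "lead"

@dataclass
class LaboratoryScope:
    items: set = field(default_factory=set)

    def accredit(self, program: str, method: str, analyte: str) -> None:
        self.items.add(ScopeItem(program, method, analyte))

    def withdraw(self, program: str, method: str, analyte: str) -> None:
        # Withdrawing one analyte/method combination does not touch the others.
        self.items.discard(ScopeItem(program, method, analyte))

# Hypothetical example: lead in drinking water by two methods.
scope = LaboratoryScope()
scope.accredit("SDWA", "ICP-MS", "lead")
scope.accredit("SDWA", "GFAA", "lead")
scope.withdraw("SDWA", "GFAA", "lead")
assert ScopeItem("SDWA", "ICP-MS", "lead") in scope.items  # still accredited by ICP-MS
```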
The on-site assessment, proficiency testing evaluation, and data assessments are the processes for
assessing the capabilities of the laboratories within the tiered structure. These processes, defined
in Chapters 2 and 3, do not necessarily evaluate all tiers within the tiered structure; e.g., proficiency
testing examines the determination of individual analytes in specific matrix types, and is not method-
specific. However, they are comprehensive enough to assure the accrediting authority that a system
is in place that produces data of known and documented quality.
The procedure and conditions for interim accreditation are described in Chapter 4.
1.8.2 Supplemental Accreditation Requirements
In addition, a category of supplemental accreditation requirements is designated for additional
methods or analytes required by an accrediting authority. Supplemental accreditation requirements
shall be reserved for methods or analytes that are not required under any of the EPA programs that
are part of NELAC, and shall not be used to modify any NELAC standards for analytes or methods.
Any supplemental accreditation requirements essential to meet the specific needs of an accrediting
authority would be added at the method-specific or analyte level, and must be approved by NELAP
and made available to all NELAC participants. Exceptions to this requirement may be necessary
(e.g., national security concerns) and will be processed as waivers by the AARB.
1.8.3 General Laboratory Requirements
The general requirements are applicable to all laboratory applicants regardless of their size, volume
of business, or field of testing. The organizational structure or procedures used by applicant
laboratories to meet these general requirements may differ as a function of the size or
scope of testing of the organization. Under the tiered approach the general requirements shall
include the elements outlined in Chapter 5.
The following applicable requirements are presented in Chapter 5 (Quality Systems): Organization
and Management (5.4); Quality System - Establishment, Audits, Essential Quality Controls and Data
Verification (5.5); Personnel (5.6); Physical Facilities - Accommodation and Environment (5.7);
Equipment and Reference Materials (5.8); Measurement Traceability and Calibration (5.9); Test
Methods and Standard Operating Procedures (5.10); Sample Handling, Sample Acceptance Policy
and Sample Receipt (5.11); Records (5.12); Laboratory Report Format and Contents (5.13);
Subcontracting Analytical Samples (5.14); Outside Support Services and Supplies (5.15); and
Complaints (5.16).
1.8.4 General Field Sampling Requirements
(To be developed)
1.8.5 Chemistry Requirements
The following applicable requirements are presented in Section D.1 of Appendix D of Chapter 5
(Quality Systems): Positive and Negative Controls (D.1.1); Analytical Variability/Reproducibility
(D.1.2); Method Evaluation (D.1.3); Sensitivity (D.1.4); Data Reduction (D.1.5); Quality of Standards
and Reagents (D.1.6); Selectivity (D.1.7); and Constant and Consistent Test Conditions (D.1.8).
1.8.6 Whole Effluent Toxicity Requirements
The following applicable requirements are presented in Section D.2 of Appendix D of Chapter 5
(Quality Systems): Positive and Negative Controls (D.2.1); Variability and/or Reproducibility (D.2.2);
Accuracy (D.2.3); Test Sensitivity (D.2.4); Selection of Appropriate Statistical Analysis Methods
(D.2.5); Selection and Use of Reagents and Standards (D.2.6); Selectivity (D.2.7); and Constant and
Consistent Test Conditions (D.2.8).
1.8.7 Microbiology Requirements
The following applicable requirements are presented in Section D.3 of Appendix D of Chapter 5
(Quality Systems): Positive and Negative Controls (D.3.1); Test Variability/Reproducibility (D.3.2);
Method Evaluation (D.3.3); Test Performance (D.3.4); Data Reduction (D.3.5); Quality of Standards,
Reagents and Media (D.3.6); Selectivity (D.3.7); and Constant and Consistent Test Conditions
(D.3.8).
1.8.8 Radiochemistry Requirements
The following applicable requirements are presented in Section D.4 of Appendix D of Chapter 5
(Quality Systems): Negative Controls (D.4.1); Positive Controls (D.4.2); Test
Variability/Reproducibility (D.4.3); Other Quality Control Measures (D.4.4); Method Evaluation
(D.4.5); Radiation Measurement System Calibration (D.4.6); Method Detection Limits (D.4.7); Data
Reduction (D.4.8); Quality of Standards and Reagents (D.4.9); and Constant and Consistent Test
Conditions (D.4.10).
1.8.9 Microscopy Requirements
(To be developed)
1.8.10 Field Measurement Requirements
(To be developed)
Figure 1-1. Structure of NELAC
Figure 1-2. Flowchart for Standards Development and Implementation. The flowchart shows the
following sequence: a committee proposes standards or changes to standards; the proposed
standards are published by EPA; the interim meeting is held for input and preparation of draft
standards; at the annual meeting, committees present draft standards as resolutions; the House of
Representatives and the House of Delegates vote on whether to approve the standards; approved
standards are published by EPA; if a State and/or federal agency adopts the standards, that agency
participates in NELAP for the relevant field of testing and laboratories in that State seek
accreditation from the primary accrediting authority in that State; if not, laboratories in that State
seek accreditation from any primary accrediting authority.
Figure 1-3. Tiered Approach to the Accreditation Requirements
PROFICIENCY TESTING
Revision 9
July 2, 1998
TABLE OF CONTENTS
PROFICIENCY TESTING
2.0 PROFICIENCY TESTING PROGRAM: INTERIM STANDARDS 1
2.1 INTRODUCTION, SCOPE, AND APPLICABILITY 1
2.1.1 Purpose 2
2.1.2 Goals 2
2.1.3 PT Fields of Testing 2
2.2 MAJOR PT GROUPS AND THEIR RESPONSIBILITIES 3
2.2.1 Proficiency Testing Study Providers 3
2.2.2 Proficiency Testing Oversight Body (PTOB)/Proficiency Test Provider Accreditor
(PTPA) 4
2.2.3 Laboratories 4
2.2.4 Accrediting Authorities (AA) 4
2.3 REQUIREMENTS FOR PT PROVIDERS 4
2.3.1 On-Site Inspection of PT Providers 4
2.3.2 Sample Requirements and Design 4
2.3.2.1 Sample Analytes 5
2.3.2.2 Provider Sample Testing 5
2.3.3 PT Study Data Analysis 5
2.3.3.1 Data Acceptance Criteria 5
2.3.4 Generation of Study Reports 5
2.3.5 Provider Conflict of Interest 5
2.3.6 Disapproval of PT Study Providers 5
2.3.7 PTOB/PTPA Listing of PT Providers 5
2.4 LABORATORY ENROLLMENT IN PROFICIENCY TESTING PROGRAM(S) 6
2.4.1 Required Level of Participation 6
2.4.2 Requesting Accreditation 6
2.4.3 Reporting Results 6
2.5 REQUIREMENTS FOR LABORATORY TESTING OF PT STUDY SAMPLES 6
2.5.1 Restrictions on Exchanging Information 6
2.5.2 Maintenance of Records 7
2.6 EVALUATION OF PROFICIENCY TESTING RESULTS 7
2.7 PT CRITERIA FOR LABORATORY ACCREDITATION 7
2.7.1 Result Categories 7
2.7.2 Initial and Continuing Accreditation 7
2.7.3 Supplemental Studies 7
2.7.4 Failed Studies and Corrective Action 8
2.7.5 Second Failed Study 8
2.7.6 Scheduling of PT Studies 8
Figure 2-1. NELAP Proficiency Testing 3
APPENDIX A PT PROVIDER APPROVAL CRITERIA
APPENDIX B PT SAMPLE DESIGN & ACCEPTANCE GUIDELINES
APPENDIX C PT ACCEPTANCE CRITERIA AND PT PASS/FAIL CRITERIA
APPENDIX D PROFICIENCY TESTING OVERSIGHT BODY/PROFICIENCY TEST
PROVIDER ACCREDITOR
APPENDIX E MICROBIOLOGY
2.0 PROFICIENCY TESTING PROGRAM: INTERIM STANDARDS
For the period beginning with adoption of these standards by the National Environmental Laboratory
Accreditation Conference (NELAC) and ending July 31, 1999, all National Environmental Laboratory
Accreditation Program (NELAP)-approved primary accrediting authorities shall accept data only from
proficiency testing programs that meet the requirements of current U.S. Environmental Protection
Agency (USEPA) and state regulations. Following implementation of the National Institute of
Standards and Technology (NIST) National Voluntary Laboratory Accreditation Program (NVLAP)
for Providers of Proficiency Testing, and before a Proficiency Test Provider distributes PT samples
to laboratories for the purpose of the laboratories obtaining or maintaining NELAP accreditation, the
provider shall first obtain NVLAP accreditation for all compounds/matrices for which NIST
accreditation is available, and for which the provider intends to provide NELAC PT samples. The
intent of these interim standards is to ensure that primary accrediting authorities accept for the
purposes of NELAP accreditation all PT samples which are distributed by PT Providers which are
NIST/NVLAP accredited for those compounds/matrices, and to continue the status quo for all other
programs and compounds for which NIST/NVLAP accreditation is not currently available.
2.1 INTRODUCTION, SCOPE, AND APPLICABILITY
This chapter and the associated appendices define the major participating organizations and
components of the NELAC Proficiency Testing (PT) Program. In addition to complying with the
requirements of this Chapter, any person, private party or government entity seeking to participate
as a PT Provider in the NELAC program shall also comply with the requirements of the applicable
Appendices A (PT Provider Approval Criteria), B (PT Sample Design and Acceptance Guidelines),
C (Proficiency Testing Acceptance Criteria and Proficiency Testing Pass/Fail Criteria) and D
(Proficiency Testing Oversight Body). The criteria set forth in these standards shall be used by
laboratories and PT Providers for the purposes of obtaining or maintaining NELAP accreditation or
NELAP approval.
In addition to complying with the requirements of this Chapter and Appendices, any entity seeking
to participate as a PT Provider in the NELAP program shall also comply with all applicable
requirements of "National Standards for Water Proficiency Testing Studies, Criteria Document", U.S.
Environmental Protection Agency.
Proficiency Testing (PT) is defined for the purpose of this Chapter as a means of evaluating a
laboratory's performance under controlled conditions relative to a given set of criteria through
analysis of unknown samples provided by an external source. PT is not the sole criterion for
determining accreditation status. Additional essential elements of the overall NELAC accreditation
process, including the on-site assessment, are discussed in other chapters of the NELAC standards.
The PT program is intended to cover all types of federal and state environmental analyses.
However, the body of the PT standard applies primarily to chemistry. Appendices (yet to be
developed) shall describe necessary variations as applied to radiochemistry, environmental
toxicology, and microbiology.
The major components of the NELAC PT Program include:
a) multiple PT Providers who shall meet stringent criteria to become Approved by a Proficiency
Testing Oversight Body (PTOB)/Proficiency Test Provider Accreditor (PTPA), as described in
Section 2.3 and Appendix A;
b) specific requirements for the design of PT samples and studies, to ensure that all samples
provide a consistent, fair and known challenge to laboratories seeking accreditation from a
NELAP-approved accrediting authority, as described in Section 2.3 and Appendix B;
c) specifically defined pass/fail criteria for evaluating PT sample results, as described in Section
2.3 and Appendix C;
d) initial approval and ongoing oversight of PT Providers by a Proficiency Testing Oversight Body
(PTOB)/Proficiency Test Provider Accreditor (PTPA), Section 2.3 and Appendix D;
e) specific requirements for laboratories participating in PTOB-/PTPA-approved PT Programs, as
described in Sections 2.4, 2.5, and 2.7; and
f) oversight of all PT Program activities by the PTOB(s)/PTPA(s), as described in Section 2.2.1.
2.1.1 Purpose
The PT program incorporates several practical purposes, which include:
a) the production and supply of test samples that are procedure-sensitive; that is, the samples
challenge the critical components of each analytical procedure, ranging from initial sample
preparation to final data analysis;
b) the production and supply of test samples that are as similar to real-world samples as is
reasonably possible. It is further expected that the PT samples shall be representative of
materials analyzed for environmental regulatory programs, agencies, and communities;
c) a program which is affordable by all participants;
d) the yielding of PT data that are technically defensible on the basis of the type and quality of the
samples provided;
e) the preparation of samples such that the identification and quantitation of analytes in the
samples poses equivalent difficulty and challenge regardless of the manner in which the
samples are designed and manufactured by the PT Providers, i.e., samples prepared for analysis
by a Drinking Water or Wastewater method would pose equal challenge whether prepared as
whole volume or as a concentrate in ampules.
2.1.2 Goals
The PT program incorporates several practical goals, which include:
a) the generation of data at a quality level required by environmental and regulatory programs;
b) the generation of data that are, at a minimum, comparable in quality to that of currently certified
and/or accredited laboratories; and
c) the improvement of the overall performance of laboratories over time.
2.1.3 PT Fields of Testing
The PT program is organized by PT fields of testing. Laboratories may choose to participate in one
or more PT fields of testing. The following elements collectively define PT fields of testing:
a) Regulatory or environmental program, as listed in Chapter 1,
b) Matrix type (e.g. gas, aqueous liquid, nonaqueous liquid, solid), and
c) Analyte
2.2 MAJOR PT GROUPS AND THEIR RESPONSIBILITIES
The PT program structure incorporates five major groups with separate and distinct roles and
responsibilities. The groups are NELAC, the Proficiency Testing Oversight Body (PTOB)/Proficiency
Test Provider Accreditor (PTPA), the PT Providers, the testing laboratories, and the primary
Accrediting Authorities (AA). The lines of interaction among these groups are shown in Figure 2-1.
2.2.1 Proficiency Testing Study Providers
The providers shall produce and distribute PT samples, evaluate study results against published
performance criteria, and report the results to the laboratories, the respective primary Accrediting
Authorities, the appropriate PTOB/PTPA, and NELAP. The PT Provider shall meet the
requirements of Appendix A, manufacture samples that meet the requirements of Appendix B, and
score sample results in accordance with the requirements of Appendix C.
Figure 2-1. NELAP Proficiency Testing. The figure shows the lines of interaction among NELAC (the
standard-setting authority), the PTPA/PTOB, the primary accrediting authorities (States/EPA), the PT
Providers, and the laboratories (private sector, non-profits, and/or States).
2.2.2 Proficiency Testing Oversight Body (PTOB)/Proficiency Test Provider Accreditor
(PTPA)
The Proficiency Testing Oversight Body (PTOB)/Proficiency Test Provider Accreditor (PTPA)
establishes and implements a program to accredit PT study suppliers and to monitor accredited
suppliers to ensure that their studies and practices meet all applicable standards. The PTOB/PTPA
shall meet the requirements of Appendix D. Organizations meeting the requirements of this
Standard and its appendices, as determined by the NELAC Standing Committee on Proficiency
Testing, may be nominated by the Committee to NELAP to be designated as a PTOB/PTPA.
NELAP may approve or disapprove the designation of an organization as a PTOB/PTPA. The
Committee may also recommend to NELAP that a PTOB/PTPA's designation be withdrawn for
failing to meet the criteria in this standard and appendices.
2.2.3 Laboratories
Laboratories that seek to become accredited shall perform analyses of PT samples as required by
this chapter. PT samples shall be obtained from PT Providers approved by a NELAP-designated
PTPA. The laboratory shall obtain PT samples from any such approved PT Provider. The results of the
analyses shall be submitted to the Provider for scoring. Accrediting Authorities shall accept, for the
purposes of initial and continuing accreditation, PT results from any NELAP-approved provider that
meets the requirements of this standard.
2.2.4 Accrediting Authorities (AA)
The primary accrediting authorities shall make all decisions regarding a laboratory's accreditation
status. They are responsible for taking action to make these determinations, including ensuring that
laboratories seeking or holding their accreditations have participated in the PT program.
2.3 REQUIREMENTS FOR PT PROVIDERS
This section and associated Appendix A describe the criteria that all PT providers shall meet in order
to be approved by the PTOB/PTPA as PT Providers. A PTOB/PTPA shall grant approval to PT
providers on a field-of-testing basis, as described in Section 2.1.3.
2.3.1 On-Site Inspection of PT Providers
A PTOB/PTPA shall conduct an on-site inspection of any organization seeking to participate as a
PT Provider in the NELAC Program, as described in Appendix D. The PTOB/PTPA shall determine
whether the Provider meets the applicable requirements described in this Chapter and Appendices
A, B, and C. Approval of a PT Provider shall be the responsibility of a PTOB/PTPA. A
PTOB/PTPA shall conduct ongoing oversight of the PT Providers as necessary to ensure
conformance with all applicable standards.
2.3.2 Sample Requirements and Design
This Section and associated Appendix B describe PT sample design and acceptance criteria. The
matrices of all PT samples shall, to the extent possible, resemble the matrices for which the
laboratory seeks accreditation. Samples may not be reused in any subsequent NELAC PT study.
2.3.2.1 Sample Analytes
The PT Provider shall prepare each sample lot such that the prepared concentration of each analyte
in each lot is unique. The required group of analytes covering each field of testing shall be
determined by the NELAC Standing Committee on Proficiency Testing and shall be evaluated and
updated, as necessary. Within each study, a certain minimum number of analytes shall be present.
The group of analytes included shall change over time so that all analytes are included at least once
every three years over a series of sequential studies.
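As an editorial illustration of the three-year rotation requirement, the sketch below checks that each required analyte recurs within three years across a sequence of dated studies; the analyte names, dates, function name, and the 365-day year approximation are assumptions, not part of the standard.

```python
from datetime import date, timedelta

THREE_YEARS = timedelta(days=3 * 365)  # approximation of the three-year window

def rotation_covers_all(required: set, studies: list) -> bool:
    """Check that every required analyte recurs at least once every three years.

    `studies` is a chronologically ordered list of (study_date, analyte_set) tuples.
    """
    for analyte in required:
        dates = [d for d, analytes in studies if analyte in analytes]
        if not dates:
            return False  # analyte never included
        gaps = [later - earlier for earlier, later in zip(dates, dates[1:])]
        if any(gap > THREE_YEARS for gap in gaps):
            return False  # analyte skipped for more than three years
    return True

# Hypothetical study schedule:
studies = [
    (date(1996, 1, 10), {"benzene"}),
    (date(1997, 6, 10), {"benzene", "toluene"}),
    (date(1999, 1, 10), {"benzene", "toluene"}),
]
print(rotation_covers_all({"benzene", "toluene"}, studies))
```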
2.3.2.2 Provider Sample Testing
The PT Provider shall design, manufacture, and test the samples for homogeneity, stability, and
verification of prepared values as required by Appendix B. This testing shall verify that the quality
of all samples is acceptable for use in each field of testing PT study.
2.3.3 PT Study Data Analysis
This Section and associated Appendix C describe the criteria to be used by PT Providers when
scoring and evaluating NELAC PT sample results.
2.3.3.1 Data Acceptance Criteria
PT Providers shall use the data acceptance criteria described in Appendix C to evaluate
laboratories' PT data to ensure that a laboratory's performance is judged fairly and consistently.
2.3.4 Generation of Study Reports
Each PT Provider shall evaluate the data and issue a report within 21 calendar days of the close of
each study.
2.3.5 Provider Conflict of Interest
Each PT Provider shall certify that it is free of any organizational conflict of interest. A PT sample
producer shall never split a sample lot and offer these samples for sale as known-value check
samples before the unknown samples are used in a PT study. In addition, each provider shall follow
procedures and have systems in place that maintain confidentiality and security of all prepared
values through the closing date of each study. All records shall be retained for a period of five years
or as required by the appropriate regulatory program.
2.3.6 Disapproval of PT Study Providers
A PT Provider's approval may be subject to revocation per the procedures outlined in Appendix
A, Section A.9.2.
2.3.7 PTOB/PTPA Listing of PT Providers
PTOBs/PTPAs shall maintain a list of Approved PT Providers. PTOBs/PTPAs shall evaluate,
update, and publish this list at intervals not to exceed six months. On this same interval,
PTOBs/PTPAs shall also publish the list of PT fields of testing necessary to satisfy the PT
requirements as determined in Section 2.3.2.1.
2.4 LABORATORY ENROLLMENT IN PROFICIENCY TESTING PROGRAM(S)
2.4.1 Required Level of Participation
To be accredited initially and to maintain accreditation, each laboratory shall participate in PT
studies provided by a PTPA-approved PT Provider. Laboratories must participate in PT studies for
each field of testing, as described in Chapter 1. Each laboratory shall participate in at least two PT
studies per year unless a different frequency for a given program is defined in the Appendices. The
PT Provider shall design studies that require the analysis of one test sample for each field of testing.
Section 2.5 describes the time period in which a laboratory must analyze the PT samples and report
the results. Data and laboratory evaluation criteria are discussed in Sections 2.6 and 2.7 of this
Chapter.
2.4.2 Requesting Accreditation
At the time each laboratory applies for accreditation, it shall notify the primary accrediting authority
of the field(s) of testing for which it seeks accreditation and shall participate in the appropriate
PT studies. For all fields of testing, including those for which PT samples are not available, the
laboratory shall ensure the reliability of its testing procedures by maintaining a total quality
management system that meets all applicable requirements of Chapter 5 of the NELAC standards.
2.4.3 Reporting Results
Laboratories seeking accreditation may select any provider from the list of PTPA-approved PT study
providers. The laboratories shall bear the cost of any PT study subscription. Each laboratory shall
authorize the PT study provider to release all accreditation and remediation results and pass/fail
status directly to the appropriate accrediting authority, NELAP and the PTOB/PTPA, in addition to
the laboratory.
2.5 REQUIREMENTS FOR LABORATORY TESTING OF PT STUDY SAMPLES
For each field of testing for which it seeks or wants to maintain accreditation, a laboratory must
participate per year in two single-blind, single-concentration PT studies, where available, provided
by a PTPA-approved PT provider. The samples shall be analyzed and the results returned to the PT
study provider no later than 45 calendar days from the scheduled study shipment date. The
laboratory's management and all analysts shall ensure that all PT samples are handled (i.e.,
managed, analyzed, and reported) in the same manner as real environmental samples, using the
same staff, methods, procedures, equipment, facilities, and frequency of analysis as used for routine
analysis of that analyte.
2.5.1 Restrictions on Exchanging Information
Laboratories shall comply with the following restrictions on the transfer of PT samples and
communication of PT sample results prior to the time the results of the study are released:
a) A laboratory shall not send any PT sample, or a portion of a PT sample, to another laboratory
for any analysis for which it seeks accreditation, or is accredited;
b) A laboratory shall not knowingly receive any PT sample or portion of a PT sample from another
laboratory for any analysis for which the sending laboratory seeks accreditation, or is accredited;
c) Laboratory management or staff shall not communicate with any individual at another laboratory
(including intracompany communication) concerning the PT sample; and
d) Laboratory management or staff shall not attempt to obtain the prepared value of any PT
sample from their PT Provider.
2.5.2 Maintenance of Records
The laboratory shall maintain copies of all written, printed, and electronic records, including but not
limited to bench sheets, instrument strip charts or printouts, data calculations, and data reports,
resulting from the analysis of any PT sample for five years or for as long as is required by the
applicable regulatory program, whichever is greater. These records shall include a copy of the PT
study report forms used by the laboratory to record PT results. All of these laboratory records shall
be made available to the assessors of the primary accrediting authority during on-site audits of the
laboratory.
2.6 EVALUATION OF PROFICIENCY TESTING RESULTS
PT study providers shall evaluate results from all PT studies using NELAC-mandated acceptance
criteria described in Appendix C. The NELAC Standing Committee on Proficiency Testing shall
provide, and update as necessary, the data acceptance criteria that all PT study providers shall use
for all PT studies. Each result shall be scored on an acceptable/not acceptable basis. The PT study
provider shall provide the participant laboratories and the primary accrediting authority a report
showing at least the prepared value, the acceptance range, and the acceptable/not acceptable status
for each analyte reported by the laboratory. The report shall be sent no later than 21 calendar days
from the study closing date. The providers shall not disclose specific laboratory results or
evaluations to any other parties not described in this section.
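As an editorial illustration of the report content named above (prepared value, acceptance range, and acceptable/not acceptable status), the sketch below scores reported results against hypothetical acceptance ranges; the analyte names, limits, and function names are assumptions, not values from Appendix C.

```python
from typing import NamedTuple

class AnalyteScore(NamedTuple):
    analyte: str
    prepared_value: float
    low: float
    high: float
    reported: float
    status: str  # "acceptable" or "not acceptable"

def score_study(acceptance: dict, reported: dict) -> list:
    """Score each reported analyte against its acceptance range (illustrative only)."""
    report = []
    for analyte, value in reported.items():
        prepared, low, high = acceptance[analyte]
        status = "acceptable" if low <= value <= high else "not acceptable"
        report.append(AnalyteScore(analyte, prepared, low, high, value, status))
    return report

# Hypothetical acceptance criteria: analyte -> (prepared value, lower limit, upper limit)
criteria = {"lead": (15.0, 12.8, 17.2), "copper": (1.30, 1.11, 1.50)}
for row in score_study(criteria, {"lead": 16.1, "copper": 1.02}):
    print(row)
```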
2.7 PT CRITERIA FOR LABORATORY ACCREDITATION
2.7.1 Result Categories
The criteria described in this section apply individually to each field of testing, as defined by the
laboratory seeking accreditation in its accreditation request. These criteria apply only to the PT
portion of the overall accreditation standard, and the primary accrediting authority shall consider PT
results along with the other elements of the NELAC standards when determining a laboratory's
accreditation status. The primary accrediting authority ultimately makes all decisions regarding the
accreditation status of the laboratory. There are two PT result categories: "acceptable" and "not
acceptable."
2.7.2 Initial and Continuing Accreditation
A laboratory seeking accreditation shall successfully complete two PT studies for each requested
field of testing within the most recent three rounds attempted. Successful performance is described
in Appendix C. Once a laboratory has been granted accreditation status, it must continue to
complete PT studies and maintain a history of at least two acceptable studies out of the most recent
three. For initial accreditation or remedial testing, the studies must be at least 30 calendar days
apart. For continuing accreditation, completion dates of successive proficiency rounds for a given
field of study must be approximately six months apart. Failure to meet the semiannual schedule is
regarded as a failed study after seven months.
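The two-acceptable-out-of-the-most-recent-three rule and the 30-calendar-day spacing requirement for initial or remedial studies can be expressed compactly. The sketch below is an editorial illustration only; it assumes a simple chronologically ordered study history, and the function names and example dates are hypothetical.

```python
from datetime import date

def pt_history_acceptable(history: list) -> bool:
    """True if at least two of the most recent three studies are acceptable.

    `history` is a chronologically ordered list of (completion_date, passed) tuples
    for one field of testing.
    """
    recent = history[-3:]
    return sum(1 for _, passed in recent if passed) >= 2

def spacing_ok_for_initial(history: list) -> bool:
    """Initial or remedial studies must be at least 30 calendar days apart."""
    dates = [d for d, _ in history]
    return all((later - earlier).days >= 30 for earlier, later in zip(dates, dates[1:]))

# Hypothetical history: the laboratory passes two of its last three studies.
h = [(date(1998, 1, 15), True), (date(1998, 7, 20), False), (date(1999, 1, 18), True)]
print(pt_history_acceptable(h), spacing_ok_for_initial(h))
```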
2.7.3 Supplemental Studies
A laboratory may elect to participate in PT studies more frequently than required by the semiannual
schedule as set by the primary accrediting authority. This may be desirable, for example, when a
laboratory first applies for accreditation or when a laboratory fails a study and wishes to quickly re-
establish its history of successful performance. These additional studies are not distinguished from
the routinely scheduled studies; that is, they are counted and scored the same way and must be at
least 30 calendar days apart.
2.7.4 Failed Studies and Corrective Action
Whenever a laboratory fails a study, it shall determine the cause for the failure and take any
necessary corrective action. It shall then document in its own records and provide to the primary
accrediting authority both the investigation and the action taken. If a laboratory fails two out of the
three most recent studies for a given field of testing, its performance is considered unacceptable
under the NELAC PT standard for that field. The laboratory must then meet the requirements of
initial accreditation as described in Section 2.7.2 - Initial and Continuing Accreditation.
2.7.5 Second Failed Study
The PT Provider reports laboratory PT performance results to the primary accrediting authority at
the same time that it reports the results to the laboratory. If a laboratory fails a second study out of
the most recent three, as described above, the accrediting authority shall take action, pursuant to
Chapter 4, within 60 calendar days to determine the accreditation status of all methods for the
unacceptable analyte(s) for that program and matrix.
2.7.6 Scheduling of PT Studies
Primary accrediting authorities may specify the months in which laboratories within their authority are
required to participate in NELAP PT programs.
PROFICIENCY TESTING
APPENDIX A
PT PROVIDER APPROVAL CRITERIA
A.0.0 SCOPE
This Appendix describes the responsibilities and requirements a Proficiency Testing (PT) Provider
shall meet in order to be a Proficiency Testing Oversight Body (PTOB) /Proficiency Test Provider
Accreditor (PTPA) Approved PT Provider. In order for a PT Provider to participate in the NELAC
PT Program, a Provider must be approved by a PTOB/PTPA. The criteria provided below are
designed to ensure the integrity and technical excellence of the NELAC PT Program while allowing
all qualified Providers to participate in the program.
A.1.0 APPROVAL PROCESS
The process for approval of a PT Provider includes a biennial on-site inspection by a PTOB/PTPA
to ensure that the technical criteria of this appendix are being met. At the discretion of the
PTOB/PTPA, the PT Provider may be requested to confirm its ability to perform analyses within
the required limits through participation in a proficiency testing program operated by the
PTOB/PTPA, or through the analysis of unknown samples provided by the PTOB/PTPA. Providers
are also required to submit the results of PT programs operated for NELAC to the PTOB/PTPA for
review and evaluation. The PT Provider agrees to accept the findings and decisions of the
PTOB/PTPA as final.
A.2.0 QUALITY SYSTEM REQUIREMENTS
The manufacturing quality system used by the PT Provider must meet the requirements of both
International Organization for Standardization (ISO) 9001 for the design, production, testing, and
distribution of performance evaluation samples and the requirements of ISO Guide 34 Quality
System Guidelines for the Production of Reference Materials. The design and operation of the PT
Provider's proficiency testing program must meet the requirements of ISO Guide 43, Proficiency
Testing by Interlaboratory Comparisons. The testing facilities used to support the verification,
homogeneity, and stability testing required in Appendix B of this document must meet the
requirements of both ISO Guide 25 General Requirements for the Competency of Testing and
Calibration Laboratories and Chapter 5, Quality Systems, of the NELAC standards. The ability to
meet the ISO 9001 quality system requirement may be fulfilled through registration of the PT
Provider's quality system to American National Standards Institute (ANSI) standards by a Registrar
Accreditation Board (RAB) accredited registrar. However, a biennial on-site inspection by the
PTOB/PTPA demonstrating continuing conformance is required.
A.3.0 PROVIDER FACILITIES AND PERSONNEL
Each Provider is required to have systems in place to produce, test, distribute, and provide data
analysis and reporting functions for any series of samples for which they are requesting approval.
Similarly, the Provider shall have in place sufficient technical staff, instrumentation, and computer
capabilities as may be required by the PTOB/PTPA to support the production, distribution, analysis,
data collection, data analysis, and reporting functions of the samples. No portion of the production,
testing, distribution, data collection, data analysis, or data reporting functions may be outside the
control of the PT Provider for any particular study, since it is essential that the confidentiality of the
samples be maintained throughout the PT study. For the purposes of this requirement "control" can
mean ownership or that the subcontracted service is performed under an agreement which
specifically ensures the ability of the Provider to access and restrict the distribution of information
related to these services. Any subcontracted services must be assessed by a PTOB/PTPA and
meet the same criteria as the PT Provider.
A.4.0 SAMPLE FORMULATION REVIEW
The PT Provider must demonstrate to the PTOB/PTPA, by the submission of appropriate data, that
the sample formulation for which the PT Provider is seeking approval shall permit participating
laboratories to generate results that fall within the sample acceptance ranges established by the
NELAC Standing Committee on Proficiency Testing and meet the criteria of the National Standards
for Proficiency Testing.
A.4.1 Release of Information
In support of the above requirement, PTOBs/PTPAs agree to treat all sample formulation
information submitted to them for review as the proprietary information of the PT Provider submitting
the information. Such formulation information shall not be released by a PTOB/PTPA without the
prior written consent of the PT Provider.
A.5.0 PROVIDER CONFLICT-OF-INTEREST REQUIREMENTS
PT Providers seeking approval shall document to the satisfaction of the PTOB that they do not have
a conflict of interest with any laboratory seeking, or having, NELAP accreditation. PT Providers
shall notify the PTOB of any actual or potential organizational conflicts of interest, including but not
limited to:
a) Any financial interest in a laboratory seeking, or having, NELAP accreditation;
b) The sharing of personnel, facilities or instrumentation with a laboratory seeking, or having,
NELAP accreditation.
The PT Provider is also required to inform all internal and contract personnel who perform work on
NELAC PT samples of their obligation to report personal and organizational conflicts of interest to
the PTOB/PTPA. The Provider shall have a continuing obligation to identify and report any actual
or potential conflicts of interest arising during the performance of work in support of NELAC PT
programs. If an actual or potential organizational conflict of interest is identified during performance
of work in support of NELAC PT programs, the PT Provider shall immediately make a full disclosure
to the PTOB/PTPA. The disclosure shall include a description of any action which the Provider has
taken or proposes to take, after consultation with the PTOB/PTPA, to avoid, mitigate or neutralize
the actual or potential conflict of interest. The PTOB/PTPA may reevaluate a PT Provider's
Approval status as a result of unresolved conflict of interest situations. Any conflict of interest
disputes between the PT Provider and the PTOB/PTPA may be appealed to NELAP for a final
determination.
A.5.1 Ban on Distribution of Samples
PT Providers shall not sell, distribute, or provide samples used in the NELAC PT program prior to
the conclusion of the study for which they were designed. Providers further agree not to sell,
distribute, or provide samples of identical formulation and concentration to those samples which it
is currently using in a NELAC study.
A.6.0 CONFIDENTIALITY OF PT STUDY DATA
The PT Provider shall demonstrate to the PTOB/PTPA that it has systems in place to ensure that
the confidentiality of data associated with NELAC PT samples and programs is not compromised.
PT Providers shall not release the Prepared Value of any sample currently being used in a NELAC
PT study prior to the conclusion of the study. The PT Provider also agrees that the acceptance
ranges provided to them by either NELAC, or the PTOB/PTPA, are the proprietary information of
NELAC, or the PTOB/PTPA, and shall not be disclosed by the PT Provider without the written
approval of the PTOB/PTPA.
A.7.0 DATA REVIEW AND EVALUATION
The NELAP designated PTOB/PTPA shall review the data from every PT Provider's studies to
ensure that acceptance limits used to evaluate laboratories are consistent with national standards
as established by NELAC. The PTOB/PTPA shall also evaluate the performance of the PT
Providers by monitoring, and reporting, to both the Providers and the NELAC Standing Committee
on Proficiency Testing the pass/fail rates of all Providers on all samples tested. A PTOB/PTPA is
required to investigate any PT Provider whose pass/fail rate is statistically different from the national
average.
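For illustration only and not as part of this standard, the following Python sketch shows one conventional way a PTOB/PTPA might flag a pass/fail rate that is statistically different from the national average, using a two-proportion z-test; the standard does not prescribe a specific test, and the function name, significance level, and counts below are hypothetical.

# Illustrative only: the standard does not prescribe a specific statistical test.
from math import sqrt
from statistics import NormalDist

def provider_rate_differs(provider_pass, provider_total,
                          national_pass, national_total, alpha=0.05):
    """Two-sided two-proportion z-test comparing a provider's pass rate
    with the pooled national pass rate."""
    p1 = provider_pass / provider_total
    p2 = national_pass / national_total
    pooled = (provider_pass + national_pass) / (provider_total + national_total)
    se = sqrt(pooled * (1 - pooled) * (1 / provider_total + 1 / national_total))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_value < alpha

# Hypothetical counts: 780 of 1,000 results passing against a national
# rate of 8,500 of 10,000 would trigger an investigation.
print(provider_rate_differs(780, 1000, 8500, 10000))   # True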
A.8.0 COMPLAINTS & CORRECTIVE ACTION
Written complaints received by the PT Provider regarding technical or procedural aspects of the
studies they conduct must be submitted to the PTOB/PTPA. The PT Provider shall resolve the
complaint to the satisfaction of the PTOB/PTPA. The PTOB/PTPA is the sole judge of the adequacy
of the corrective action taken by the PT Provider. The PTOB/PTPA shall provide NELAP with an
annual summary of all PT Provider complaints received during the prior year.
A.9.0 LOSS OF PROVIDER APPROVAL
PT Providers who fail to meet the requirements of these standards may be subject to loss of their
approval as a NELAC PT Provider. Providers may lose approval to provide individual sample sets
based upon review of PT study data by a PTOB/PTPA as required in Appendix A, Section A.7.
Similarly, PT Providers who fail to meet the requirements of Appendix A, Sections A.2 through A.6, on a continuing basis may lose their approval as a PTPA-approved PT Provider for all samples.
A.9.1 Periodic Review of PT Providers
A PTOB/PTPA may, at any time, review the performance of any approved PT Provider against these standards. Based upon this review, the PTOB/PTPA may revoke, adjust, limit, or otherwise change the approval status of a PT Provider for failure to meet one or more of the specified requirements.
A.9.2 Revocation of Approval
Should a PTOB/PTPA propose to revoke or suspend a provider's approval for failure to meet the
requirements of these standards, the PTOB/PTPA shall inform the provider of the reasons for the
proposed revocation or suspension and the procedures for appeal of such a decision. The due
process rights of the provider shall be protected during any revocation or suspension proceedings.
The final decision on the revocation or suspension of a provider's approval to supply PT samples
for the NELAC accreditation program resides with the Executive Director of NELAP. If the provider
loses NVLAP accreditation it shall lose NELAP approval to supply samples for the NELAC PT
program.
PROFICIENCY TESTING
APPENDIX B
PT SAMPLE DESIGN
& ACCEPTANCE GUIDELINES
B.0.0 INTRODUCTION
An integral element of the NELAC PT Program Standards is the assurance of PT samples which are
of high quality, well documented, homogeneous, and stable. In order to meet the goals of NELAC, the PT samples used in the program must also offer all laboratories a consistent challenge. All PT samples must meet all applicable specifications of these standards.
B.1.0 VERIFICATION OF PREPARED VALUE
All PT samples used in the NELAC program must be analyzed by the PT provider prior to shipment
to the laboratories to ensure suitability for use in the program. The Prepared Value of the sample
shall be used to establish acceptance criteria, and it must be verified by analysis. PT providers must
verify the Prepared Value by direct analysis against National Institute of Standards and Technology
(NIST) Standard Reference Materials (SRM), if a suitable NIST SRM is available for use. If a NIST
SRM is not available then verification must be performed against an independently prepared
calibration material. An independently prepared calibrant is one prepared from a separate raw
material source, or one prepared and documented by a source external to the provider.
B.1.1 Relative Standard Deviation of Verification
The method used by the PT provider for verification analysis must have a relative standard deviation
of not more than 50% of the relative standard deviation predicted at the Prepared Value by the
laboratory acceptance criteria being used by NELAC for each parameter. The relative standard
deviation of the provider's verification method shall be established by a method validation study, and
the suitability for use shall be approved by the NELAP designated Proficiency Testing Oversight
Body (PTOB)/Proficiency Test Provider Accreditor (PTPA).
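For illustration only, the short Python sketch below applies the 50% relative standard deviation criterion described above; the function name and example values are hypothetical, and the acceptance-criteria standard deviation is assumed to be supplied from the applicable NELAC criteria for the parameter.

from statistics import mean, stdev

def verification_rsd_ok(validation_results, prepared_value, acceptance_sd):
    """validation_results: replicate analyses from the method validation study.
    acceptance_sd: standard deviation predicted at the Prepared Value by the
    NELAC laboratory acceptance criteria for the parameter."""
    method_rsd = stdev(validation_results) / mean(validation_results)
    predicted_rsd = acceptance_sd / prepared_value
    return method_rsd <= 0.5 * predicted_rsd

# Hypothetical replicates with about 1.6% RSD against criteria predicting 10% RSD.
print(verification_rsd_ok([9.9, 10.1, 10.0, 9.8, 10.2], prepared_value=10.0, acceptance_sd=1.0))   # True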
B.1.2 Quality Control Check of the Prepared Value
The prepared value for every parameter in all PT samples must be verified by analysis. The prepared value of the analyte is verified if the mean of the verification analyses is within 1.5 standard deviations, calculated as described in Section C.1.1.1 or C.1.1.2, of either a) the prepared value, if an unbiased verification method is used, or b) the mean value for the analyte as calculated in Section C.1.1.1 or C.1.1.2, if a biased method is used. The standard deviation of the verification analyses must also be less than one standard deviation as calculated in Section C.1.1.1 or C.1.1.2. For analytes that are evaluated using fixed percentages as defined in Section C.1.1.1, standard deviations are calculated by assuming that the fixed percentage is equal to two standard deviations.
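For illustration only, the Python sketch below applies the verification check described above for the unbiased-method case; the function name and example values are hypothetical, and the criterion standard deviation is assumed to come from Section C.1.1.1 or C.1.1.2 (fixed percentage divided by two for fixed-percentage analytes).

from statistics import mean, stdev

def prepared_value_verified(verification_results, prepared_value, criterion_sd):
    """Unbiased-method case: verification mean within 1.5 criterion standard
    deviations of the prepared value, and verification standard deviation
    less than one criterion standard deviation."""
    within_mean = abs(mean(verification_results) - prepared_value) <= 1.5 * criterion_sd
    within_sd = stdev(verification_results) < criterion_sd
    return within_mean and within_sd

# Hypothetical fixed-percentage analyte at 20 mg/L with +/-10% limits,
# so the criterion SD is taken as 20 * 0.10 / 2 = 1.0 mg/L.
print(prepared_value_verified([19.8, 20.3, 20.1, 19.9, 20.2], 20.0, criterion_sd=1.0))   # True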
B.2.0 HOMOGENEITY TESTING
PT sample homogeneity is essential to ensuring that all laboratories are treated fairly. Therefore,
the purpose of the homogeneity testing procedure is to establish at the 95% confidence level that
all samples distributed to the laboratories have the same Prepared Value for every parameter to be
evaluated. Homogeneity testing is required on all PT samples prior to sample shipment to the
laboratories.
B.2.1 Homogeneity Testing Procedure
The homogeneity of the samples must be established using a generally accepted statistical
procedure. The procedure selected by the PT provider must be capable of evaluating the relative
consistency of each analyte across the production run, and must be performed on the final packaged
samples. The procedure must establish at the 95% confidence level that the Prepared Value is
consistent across the production run. Samples, or parameters, which fail to pass the homogeneity
testing criteria cannot be used in the NELAC PT program to evaluate laboratories.
B.2.2 Suitable Homogeneity Testing Procedures
A suitable homogeneity testing procedure shall be capable of comparing the between-sample to within-sample standard deviation across the PT provider's packaging run, and shall ensure comparability with 95% confidence. Suitable homogeneity testing procedures are available in both
ISO Guide 35 for the Certification of Reference Materials and in the ISO Reference Material
Committee (REMCO)-Association of Official Analytical Chemists (AOAC) Harmonized Protocol for
the Proficiency Testing of Analytical Laboratories. However, the homogeneity testing procedure
used by the PT provider must be approved for use by the PTOB/PTPA.
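For illustration only, the Python sketch below shows one procedure consistent with the protocols cited above: a one-way analysis of variance on replicate analyses of randomly selected packaged units, accepting homogeneity when no between-unit difference is detected at the 95% confidence level. The scipy dependency, function name, and data are assumptions; the procedure actually used must be approved by the PTOB/PTPA.

from scipy.stats import f_oneway

def homogeneous_at_95(replicates_by_unit, alpha=0.05):
    """replicates_by_unit: replicate results for each packaged sample unit
    pulled at random from the production run (list of lists)."""
    f_stat, p_value = f_oneway(*replicates_by_unit)
    # A p-value above alpha means no significant between-unit difference was
    # detected, i.e., homogeneity is accepted at the 95% confidence level.
    return p_value > alpha

# Hypothetical duplicate analyses of four packaged units (mg/L).
units = [[10.1, 10.0], [9.9, 10.2], [10.0, 10.1], [10.2, 9.8]]
print(homogeneous_at_95(units))   # True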
B.3.0 STABILITY TESTING
The samples used in the NELAC PT program must be verified as stable for the period of each
study. Therefore, the stability of all samples, and parameters, must be established by the PT
provider following the close of data submission from the laboratories. The samples are considered
stable for the period of the study if the mean analytical value determined after the study for each parameter falls within the 95% confidence interval calculated for the prior-to-shipment verification testing used to establish the Prepared Value. The testing procedure used for stability testing must
be approved for use by the PTOB/PTPA.
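For illustration only, the Python sketch below checks the stability criterion described above; a t-based 95% confidence interval on the pre-shipment verification analyses is assumed, and the scipy dependency, function name, and values are hypothetical. The procedure actually used must be approved by the PTOB/PTPA.

from math import sqrt
from statistics import mean, stdev
from scipy.stats import t

def stable_for_study(pre_shipment_results, post_study_mean, confidence=0.95):
    """Stable if the post-study mean falls within the t-based confidence
    interval of the pre-shipment verification analyses."""
    n = len(pre_shipment_results)
    m = mean(pre_shipment_results)
    half_width = t.ppf((1 + confidence) / 2, n - 1) * stdev(pre_shipment_results) / sqrt(n)
    return (m - half_width) <= post_study_mean <= (m + half_width)

# Hypothetical verification replicates (mg/L) and post-study mean.
print(stable_for_study([20.1, 19.9, 20.0, 20.2, 19.8], post_study_mean=20.05))   # True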
B.4.0 SAMPLE FORMULATION APPROVAL
The PT Provider shall demonstrate the adequacy of sample formulation to the satisfaction of the
PTPA. The criteria for formulation adequacy are that the sample shall provide a challenge to the laboratories under test equivalent to that of similar samples offered by other providers for the same parameters, and that the sample shall exhibit laboratory acceptance rates, measured as the provider's percentage pass/fail performance, consistent with other samples used in the program for the same parameters.
B.4.1 Adequacy of the Sample Formulation
The testing and verification protocol required to establish sample equivalency shall be agreed to by
both the PT provider and the PTOB/PTPA on a case by case basis. It is the responsibility of the PT
provider to demonstrate the adequacy of sample formulation to the satisfaction of the PTOB/PTPA.
B.5.0 DATA REPORTING BY PT PROVIDERS
The results of sample Prepared Value verification, homogeneity, and stability testing must be
available to the participating laboratories. All data developed by the provider in support of
verification testing, homogeneity testing, and stability analysis must be provided to any laboratory
participating in the program upon request after the close of the study.
B.5.1
The data developed by the PT provider in support of verification, homogeneity, and stability testing
shall be supplied in summary format to the PTOB/PTPA in an electronic format to be determined
by the PTOB/PTPA. Verification and homogeneity data must be supplied to the PTOB/PTPA prior
to sample distribution to the laboratories.
B.5.2
All data from the laboratories and the results of stability testing must be provided to the PTOB/PTPA
in an electronic format to be determined by the PTOB within 30 calendar days of the close of the
study.
PROFICIENCY TESTING
APPENDIX C
PT ACCEPTANCE CRITERIA
AND
PT PASS/FAIL CRITERIA
C.0.0 PURPOSE, SCOPE, AND APPLICABILITY
This Appendix defines the criteria to be used by any entity which seeks to participate as a
Proficiency Test Provider in the NELAC Program for scoring the results obtained from the analyses
of samples in any NELAC PT Study. Two distinct sets of scoring criteria are defined: 1) whether an individual analyte result is "Acceptable" or "Not Acceptable", and 2) whether a laboratory's initial PT performance for a group of interdependent analytes is evaluated as "Pass" or "Fail". The PT Providers shall submit all laboratories' performance rating(s) to the Primary
Accrediting Authority, as described in Chapter 2 of the NELAC standards, to be used as a tool for
determining a laboratory's accreditation status. PT acceptance limits and pass/fail criteria are
established on a Program-matrix-analyte basis.
C.1.0 ANALYTE ACCEPTANCE LIMITS
Acceptance limits are established for each analyte. Whether or not a laboratory has passed or failed
a group of interdependent analytes is based on the number of results that are determined to be
acceptable.
C.1.1 Analyte Acceptance Limit Categories
Acceptance limits are separated into two categories. Results for analytes with acceptance limits
determined as described in Sections C.1.1.1 and C.1.1.2 shall be used in the determination of a
laboratory's PT Program-matrix-analyte pass/fail evaluation. Results for analytes with acceptance
limits determined as described in Section C.1.1.3 shall not be used as part of the Program-matrix-
analyte pass/fail evaluation.
C.1.1.1 Analytes with USEPA Established Acceptance Limits
PT Providers shall utilize the proficiency test acceptance limits that have been established by
USEPA in the National Standards for water proficiency testing studies where they apply. The
National Standards are incorporated into this Appendix by reference. EPA's established proficiency
test acceptance limits for chemical analytes are typically expressed in the following manner:
• Prepared ± fixed percentage. Acceptance limits shall be set at plus and minus the published
fixed percentage of the analyte's verified prepared value.
• Mean ± 2 standard deviations. For those analytes for which the NELAC Standing Committee
on Proficiency Testing has established linear regression equations relating prepared value to
mean and prepared value to standard deviation, acceptance limits shall be set using said
equations and the sample's verified prepared value. Linear regression equations may only be
used for prepared values that fall within the range of prepared values used to establish said
equations. In the event that there are no linear regression equations available for a given
analyte, that analyte shall be treated as described in Section C.1.1.3.
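For illustration only, the Python sketch below computes acceptance limits in the two forms described above; the percentages and regression coefficients shown are placeholders, not actual USEPA or NELAC values.

def limits_fixed_percentage(verified_prepared_value, fixed_pct):
    """Prepared Value +/- the published fixed percentage."""
    delta = verified_prepared_value * fixed_pct / 100.0
    return verified_prepared_value - delta, verified_prepared_value + delta

def limits_from_regression(verified_prepared_value, mean_coeffs, sd_coeffs, k=2):
    """Mean +/- k standard deviations, where mean and standard deviation are
    predicted from (slope, intercept) regression pairs at the prepared value."""
    predicted_mean = mean_coeffs[0] * verified_prepared_value + mean_coeffs[1]
    predicted_sd = sd_coeffs[0] * verified_prepared_value + sd_coeffs[1]
    return predicted_mean - k * predicted_sd, predicted_mean + k * predicted_sd

# Placeholder values: 50 mg/L +/- 10%, and mean +/- 2 SD from made-up regressions.
print(limits_fixed_percentage(50.0, 10))
print(limits_from_regression(50.0, mean_coeffs=(0.98, 0.2), sd_coeffs=(0.05, 0.1)))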
C.1.1.2 Analytes with acceptance limits derived from regression equations established by the
NELAC Standing Committee on Proficiency Testing
When USEPA Program regulations for establishing acceptance criteria are not available, Proficiency
Test providers shall set acceptance limits using regression equations that predict the mean and
standard deviation for an analyte in a given range of concentrations. Regression equations shall
be derived by the NELAC Standing Committee on Proficiency Testing and shall be made available
to PTPA-approved PT Providers by the PT Committee Chair or the Executive Director of NELAP.
Data from sources such as the USEPA PE studies, interlaboratory results from professional
organizations such as ASTM, other proficiency testing providers, commercial and non-profit
organizations, shall be used to establish the equations. All regression equations shall be approved
by the NELAC Standing Committee on Proficiency Testing prior to use by a PTPA-approved PT
Provider. For these analytes, the PT Provider shall use the sample's verified prepared value and
said equations to determine the mean and standard deviation.
C.1.1.3 Experimental Data: Analytes without promulgated acceptance limits or established
regression equations
For those analytes not included in categories C.1.1.1 or C.1.1.2, e.g., newly regulated analytes, or
analytes in a matrix that have not been fully evaluated in interlaboratory studies, NELAC
acceptance limits shall be established only after interlaboratory data has been collected for a
minimum of one year unless the NELAC Standing Committee on Proficiency Testing determines that
sufficient data have been collected in less time. The data obtained during the one-year period shall
be referred to as "experimental data". The NELAC Standing Committee on Proficiency Testing shall
derive regression equations to be used to establish acceptance limits for analytes in the
experimental category after sufficient data have been collected. The laboratory shall receive a copy
of its own experimental data from the PT Provider at the conclusion of the PT study.
C.2.0 ACCEPTABLE PT RESULTS FOR CHEMICAL ANALYTES IN POTABLE WATER AND
NON-POTABLE WATER PT SAMPLES
A laboratory's PT analyte result is acceptable when it falls within the regulatory promulgated
acceptance limits (Section C.1.1.1). For Section C.1.1.2 analytes, PT Providers shall use the PT
sample's verified prepared value and said regression equations to determine the mean and standard
deviation. Acceptance limits shall be set at the mean ± two standard deviations for potable water
analytes and the mean ± three standard deviations for non-potable water analytes. A result is
acceptable when it falls within these derived acceptance limits.
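For illustration only, the Python sketch below applies the evaluation described above for a Section C.1.1.2 analyte; the predicted mean and standard deviation are assumed to come from the NELAC regression equations at the verified prepared value, and all numbers are hypothetical.

def chemistry_result_acceptable(result, predicted_mean, predicted_sd, potable_water):
    """Mean +/- 2 SD for potable water analytes, +/- 3 SD for non-potable."""
    k = 2 if potable_water else 3
    low, high = predicted_mean - k * predicted_sd, predicted_mean + k * predicted_sd
    return low <= result <= high

# Hypothetical analyte: predicted mean 49.0, predicted SD 2.5.
print(chemistry_result_acceptable(43.0, 49.0, 2.5, potable_water=True))    # False: outside +/- 2 SD
print(chemistry_result_acceptable(43.0, 49.0, 2.5, potable_water=False))   # True: within +/- 3 SD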
C.3.0 NOT ACCEPTABLE PT RESULTS FOR POTABLE WATER AND NON-POTABLE
WATER PT SAMPLES
A laboratory's result for any analyte is considered unacceptable if it meets any of the following
criteria:
a) The result falls outside the USEPA's promulgated acceptance limits (Section C.1.1.1) or outside the acceptance limits derived from established regression equations (Section C.2.0);
b) The lab reports a result for an analyte not present in the PT sample (i.e., a false positive);
c) The lab reports a result of "Not Detected", (or similar indication of no detection), for an analyte
present in the PT sample (i.e., a false negative);
NOTE: If a laboratory reports a result less than the lowest concentration contained in the
NELAC-approved PT concentration range for an analyte present in the PT sample at a
concentration within the NELAC-approved PT concentration range, the result shall be
classified as a false negative and scored as "not acceptable".
d) The lab fails to submit its results to the PT Provider on or before the deadline for the PT study.
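For illustration only, the Python sketch below screens a single analyte result against criteria a) through d) and the NOTE above; the function name and the handling of the reporting deadline are illustrative assumptions.

def analyte_not_acceptable(result, present_in_sample, low, high,
                           range_floor, submitted_on_time=True):
    """result: reported value, or None for "Not Detected".
    low/high: acceptance limits from Section C.1.1.1 or C.1.1.2.
    range_floor: lowest concentration in the NELAC-approved PT range."""
    if not submitted_on_time:
        return True                        # criterion d)
    if not present_in_sample:
        return result is not None          # criterion b): false positive
    if result is None or result < range_floor:
        return True                        # criterion c) and the NOTE: false negative
    return not (low <= result <= high)     # criterion a)

# Hypothetical analyte present in the sample, limits 4.0-6.0, approved range starting at 1.0.
print(analyte_not_acceptable(0.8, True, low=4.0, high=6.0, range_floor=1.0))   # True: scored as a false negative
print(analyte_not_acceptable(5.2, True, low=4.0, high=6.0, range_floor=1.0))   # False: acceptable result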
C.4.0 ADDITIONAL REQUIREMENTS FOR PT PROVIDERS
PT Providers shall examine all data sets for bimodal distributions and/or situations where results from a given method have disproportionately large failure rates, and shall report such anomalies to the Proficiency Testing Oversight Body/Proficiency Test Provider Accreditor. All proficiency test data are to be
submitted to the PTOB/PTPA in the format specified by the PTOB/PTPA and shall be reviewed
annually by the NELAC Standing Committee for Proficiency Testing for the purpose of revising
existing and establishing new linear regression equations.
C.5.0 NELAC PT STUDY PASS/FAIL CRITERIA
NELAC PT samples are designed to meet the requirements of Chapter 2 and associated
appendices. Once data acceptability has been determined as described in Sections C.1 through
C.3 of this appendix, the laboratory's PT "Pass" or "Fail" evaluation is determined as described in
this Section. Pass/Fail criteria are used when groups of interdependent analytes are evaluated as
a unit for the laboratory's initial demonstration of proficiency.
C.5.1 Interdependent Analyte PT Samples
Interdependent analyte PT Samples are those that are analyzed using methods in which the ability
to correctly identify and quantitate a series of analytes is indicative of the laboratory's ability to
correctly determine the presence or absence of similar analytes. Examples of interdependent PT
Samples are those used for the following series of analytes: volatiles, semivolatiles, pesticides, herbicides, etc.
C.5.2 Non-interdependent Analyte PT Samples
Non-interdependent PT Samples are those that are analyzed using methods in which the ability to
correctly identify and quantitate an analyte or a series of analytes in a sample is not indicative of the
laboratory's ability to correctly identify and quantitate similar analytes. Non-interdependent analyte
PT samples may contain a single analyte, e.g., pH, BOD, TSS, etc., or may contain multiple
analytes, e.g., metals, major ions, etc.
C.5.3 Promulgated USEPA Pass/fail Criteria
In all cases, promulgated USEPA pass/fail criteria, e.g., drinking water volatiles as listed in 40 CFR
141.61 (a), subsection (m)(1), shall be used as NELAC PT pass/fail criteria as applicable. The
criteria described in Section 5.4 shall be used in the absence of promulgated USEPA pass/fail
guidelines.
C.5.4 Pass/fail Criteria For Interdependent Analyte PT Samples
Proficiency Testing pass/fail evaluations for Interdependent Analyte PT samples shall be determined as follows. To receive a score of "Pass", a laboratory must produce "Acceptable" results as defined in Section C.1 for at least 80% of the analytes in an Interdependent Analyte PT Sample. Greater than 20% "Not Acceptable" results shall result in the laboratory receiving a score of "Fail" for that series of analytes. For example, a laboratory must report all "Acceptable" results for an Interdependent Analyte PT Sample containing 1-4 analytes, may report no more than one "Not Acceptable" result for a Sample containing 5-9 analytes, and may report no more than two "Not Acceptable" results for a Sample containing 10-14 analytes. A "Not Acceptable" result for the same analyte in two consecutive PT studies shall also result in the laboratory receiving a score of "Fail" for that analyte.
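For illustration only, the Python sketch below applies the 80% rule described above to a single Interdependent Analyte PT Sample; the separate rule for the same analyte failing in two consecutive studies is not shown, and the function name is hypothetical.

def interdependent_sample_passes(acceptable_flags):
    """acceptable_flags: one boolean per analyte in the PT sample, True when
    the result was scored "Acceptable" under Sections C.1 through C.3."""
    return sum(acceptable_flags) / len(acceptable_flags) >= 0.80

print(interdependent_sample_passes([True] * 8 + [False]))        # True: 8 of 9 acceptable
print(interdependent_sample_passes([True] * 7 + [False] * 3))    # False: 7 of 10 acceptable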
PROFICIENCY TESTING
APPENDIX D
PROFICIENCY TESTING OVERSIGHT
BODY/
PROFICIENCY TEST PROVIDER
ACCREDITOR
D.0.0 PURPOSE, SCOPE, AND APPLICABILITY
This Appendix defines the qualifications, scope of responsibilities and requirements for a NELAP
designated Proficiency Testing Oversight Body (PTOB)/Proficiency Test Provider Accreditor (PTPA)
as defined in Section 2.2.2 of the NELAC document. In addition to complying with the requirements
of this Appendix, a PTOB/PTPA, for this oversight function, shall comply with the applicable
requirements described in Chapter 2 and associated Appendices A (PT Provider Acceptance
Criteria), B (PT Sample Design and Acceptance Guidelines), and C (Criteria for Setting PT Data
Acceptance Limits).
D.1.0 TECHNICAL AND ADMINISTRATIVE QUALIFICATIONS
An organization shall demonstrate to the NELAC Standing Committee on Proficiency Testing by
the submission of a current Statement of Qualifications that it has the technical expertise,
administrative capacity, and financial resources sufficient to implement and operate a national
program of PT Provider evaluation and oversight. In the event that the organization is not a
nationally or internationally recognized authority, the NELAC Standing Committee on Proficiency
Testing reserves the right to request further documentation detailing the organization's qualifications.
The organization shall meet the following general requirements:
a) Demonstrate the capability to manage and evaluate complex environmental reference materials
in a variety of matrices;
b) Demonstrate expertise in statistical applications as related to large interlaboratory performance
evaluation programs;
c) Demonstrate the capability to conduct on-site audits of PT Providers;
d) Demonstrate the capability to conduct technical reviews of Initial Applications;
e) Demonstrate a knowledge and understanding of ISO 9001, ISO Guides 34 and 43, and Chapter 2 of the NELAC standards, including Appendices A, B, and C.
D.2.0 PTOB/PTPA RESPONSIBILITIES REGARDING INITIAL ASSESSMENT OF PT
PROVIDERS
PTOB/PTPA responsibilities are described in this section. The primary responsibility of a
PTOB/PTPA is the oversight and ongoing monitoring and evaluation of the PT Providers. The
oversight activities of a PTOB/PTPA shall be designed to ensure that the PT Provider meets the
requirements specified in Chapter 2 and Appendices A, B and C. Any variations from these
requirements shall be approved by the NELAC Standing Committee on Proficiency Testing prior to
a body being approved as a NELAC PTOB/PTPA. All activities described herein shall be conducted
by a PTOB/PTPA.
D.2.1 Development of Standard Operating Procedures and Forms
PTOBs/PTPAs shall develop the Standard Operating Procedures (SOPs) necessary to conduct the
PT Provider evaluation process. These documents shall be based upon the requirements of Chapter
2 of the NELAC standards and the associated Appendices A, B, and C. The NELAC Standing
Committee on Proficiency Testing has the authority to review and approve, as necessary, the SOPs
developed by a PTOB/PTPA.
D.2.1.1 SOP(s) for the Assessment Process
The PTOB/PTPA shall develop and implement SOP(s) including but not limited to: the initial
application submittal and review process, on site inspection, submittal of final reports to NELAP, the
procedures for determining that a PT Provider's approval be revoked, the procedures for appealing
approval determinations, and any other procedures deemed necessary by NELAC.
D.2.1.2 Initial Application
A PTOB/PTPA shall develop the initial application process to be submitted by PT Providers applying
for approval as PT Providers of NELAC samples. The application shall include questions regarding
the qualifications of the organization seeking approval. In addition to completing the initial
application process, a PTOB/PTPA shall require that the PT Provider submit copies of its current
ISO 9001 registration certificate or any other documents which detail the quality systems required
by the provisions of Chapter 2 and associated Appendices.
D.2.1.3 SOP(s) for On-Site Inspections and Checklist(s)
A PTOB/PTPA shall develop SOP(s) for conducting consistent, effective, on-site inspections of PT
Providers. The SOP shall include policies which describe the circumstances for conducting any
additional inspections, and circumstances for determining whether on-site inspections shall be
announced or unannounced. A PTOB/PTPA shall develop standard, consistent checklist(s) to be
used during any and all inspections of PT Providers.
D.2.2 Initial Application Review and On-site Inspections
A PTOB/PTPA shall follow the procedures described in this section for the review of applications
and on-site inspections of any candidate PT Provider.
a) A PTOB/PTPA shall review the initial application documents, described in D.2.1.2, for
compliance with the PT Provider qualifications described in Appendix A and other applicable
documents.
b) A PTOB/PTPA shall review the sample designs used by the PT Provider for compliance with
Appendix B and other applicable documents.
c) A PTOB/PTPA shall review the PT analyte and sample scoring procedures used by the PT
Provider for compliance with Appendix C and other applicable documents.
d) Following the review of the Initial Application and associated documents, a PTOB/PTPA shall
conduct an on-site inspection of the PT Provider. The PT Provider shall be provided with
checklist(s) to be used during the inspection as part of the initial application process.
e) Following the inspection, a PTOB/PTPA shall conduct an exit meeting with the PT Provider,
which shall include discussion of deficiencies and discrepancies found; however, a PTOB/PTPA
may further revise the findings after the closing of the exit meeting, if necessary.
The inspection shall include, at a minimum:
1) Review of the quality system for adherence to the requirements of Appendices A, B and
C;
2) Review of staff qualifications and technical expertise necessary to produce acceptable
proficiency testing samples;
3) Review of the sample manufacturing and verification procedures to ensure that the
requirements of Appendices A and B are met;
4) Review of the procedures in place to ensure that all personnel are aware of and abide
by standards of conduct for PT Providers and confidentiality of sample values; and
5) Review of data reporting systems to ensure that the requirements of Appendix C are
met within the time periods specified in Chapter 2.
f) A PTOB/PTPA shall send a draft report to the PT Provider after the completion date of the
inspection. A PTOB/PTPA shall allow the PT Provider to review and comment on the draft if
the PT Provider finds any discrepancies and determines that revisions are necessary. A
PTOB/PTPA shall then submit a final inspection report to the PT Provider after the completion
of the on-site inspection. The final report may only contain discrepancies and findings identified
during the on site inspection or discussed during the exit briefing.
g) A PTOB/PTPA shall allow the Provider to submit their response to the report. In order for the
Provider's response to be considered acceptable, a PTOB/PTPA shall require that it include a
description of corrective actions necessary to meet the criteria of Chapter 2, and Appendices
A, B, and C.
D.3.0 PTOB/PTPA RESPONSIBILITIES REGARDING APPROVAL OF PT PROVIDERS
A PTOB/PTPA shall utilize the appropriate final report and associated documents submitted by the
PT Provider to grant or deny approval to that Provider.
D.4.0 PTOB/PTPA RESPONSIBILITIES FOR ONGOING OVERSIGHT OF PT PROVIDERS
A PTOB/PTPA shall conduct ongoing oversight of all approved PT Providers. The oversight shall
include at a minimum:
a) the use of referee laboratories to verify the concentrations of analytes in randomly selected PT
Provider samples;
b) the statistical monitoring of PT Provider's study data to detect occurrences which indicate
samples of unacceptable quality, i.e., failure rates that exceed expected norms, analyte
standard deviations that exceed expected intervals, and analyte mean recoveries which are
significantly above or below historical trends. The ongoing monitoring criteria to be used by a
PTOB/PTPA shall be developed by NELAC.
c) biennial on-site inspections of the PT provider, and review and monitoring of critical operational parameters of the PT provider, e.g., a change in senior management or sale of the company.
d) on-site inspections of the PT provider for cause.
Based upon the results of its ongoing oversight, the PTOB/PTPA may determine that the Provider's
approval status be reevaluated.
D.5.0 DEVELOPMENT AND MAINTENANCE OF A COMPREHENSIVE PT DATABASE
A comprehensive PT database shall be developed and maintained by the PTOB(s)/PTPA(s) in
conjunction with NELAC.
D.6.0 COMPLAINTS AND CORRECTIVE ACTION
A PTOB/PTPA shall evaluate all complaints that it receives regarding either approved or candidate
PT Providers. If the PTOB/PTPA determines that a complaint warrants investigation, the
PTOB/PTPA shall notify the Provider of the complaint. The PT Provider is required to resolve the
complaint to the satisfaction of the PTOB/PTPA. A PTOB/PTPA shall provide to the NELAC Standing
Committee on Proficiency Testing a summary of all PT Provider complaints received the previous
year.
D.7.0 LIST OF APPROVED PT PROVIDERS
A PTOB/PTPA shall maintain a list of approved PT Providers. The list shall be maintained on a
continuing basis on an electronic bulletin board or similar means and shall be readily available to
laboratories seeking NELAC accreditation, state accrediting authorities and other interested parties.
PT Providers must agree to abide by the provisions of NELAC regarding the advertising and
marketing use of the designation, "NELAP-designated PTOB/PTPA Approved Proficiency Test
Provider"
D.8.0 SPONSORSHIP OF ANNUAL NELAC PROFICIENCY TESTING CAUCUS
The PTOB(s)/PTPA(s) shall, in conjunction with NELAC, sponsor an annual NELAC Proficiency
Testing Caucus. The Caucus shall, if possible, be held in conjunction with the annual NELAC
meeting. The purpose of the Caucus is to provide a forum for PT Providers, Accrediting Authorities,
laboratories, federal agencies, and other interested parties to exchange information regarding the
PT study results of the previous year. The Caucus shall include technical presentations and open
discussions on means to improve the Proficiency Testing aspect of NELAC with a continuing goal
of improving the quality of environmental data generated by the NELAC accredited laboratories.
D.9.0 PTOB/PTPA ETHICS
This section describes the overall ethics and standards of conduct that must be adhered to in order
for a PTOB/PTPA to implement and administer a successful PT Provider oversight program. A
PTOB/PTPA shall serve as an impartial body designed to objectively evaluate information about PT
Providers and use this information to make sound determinations regarding Providers' approval
status. A PTOB/PTPA shall be able to certify to any interested party that it is free of any
organizational or financial conflict of interest, which would prevent it from complying with the
requirements of Appendix D. A PTOB/PTPA shall remain unbiased in evaluating information
gathered and received including inspection reports, referee sample results, complaints, and any
other information obtained regarding a PT Provider. The PTOB/PTPA shall evaluate all information
gathered and received about a Provider related to providing NELAC PT samples, and determine
which information is relevant to the approval status of a Provider, and provide that information to
NELAP, the primary Accrediting Authorities, the laboratories, and the public as appropriate.
D.10.0 CONFIDENTIALITY
A portion of the information provided to a PTOB/PTPA by the PT Provider in the course of its
inspection and oversight activities shall be proprietary in nature. A PTOB/PTPA shall agree to
maintain the confidentiality of proprietary information provided to it by the PT provider.
PROFICIENCY TESTING
APPENDIX E
MICROBIOLOGY
E.0.0 PURPOSE
This appendix outlines the requirements for microbiological proficiency testing under the Safe
Drinking Water (SDWA) and Clean Water (CWA) Acts. Microbiological testing for other USEPA
Programs shall be added as required. Semi-annual proficiency testing is required per the schedule
contained in Section 2.4.
E.1.0 SAMPLES
E.1.1 SDWA Samples
PT Providers shall present samples either as full volume samples or as preparations easily reconstituted to full volume samples. For the SDWA, there shall be ten 100+ mL samples for the qualitative determination (Presence/Absence) of total coliform and fecal coliform (or E. coli).
Sample sets which are provided to the laboratories shall contain bacteria that produce the following:
• Verification as total and fecal coliforms (E. coli).
• Verification as total coliforms, but not as fecal coliforms.
• Bacterial contaminants which shall not verify as total or fecal coliforms.
Furthermore, each set shall contain the following samples:
• One to four samples containing an aerogenic strain of Escherichia coli for total and fecal
coliform positive results using all USEPA approved methods.
• One to four samples containing Enterobacter sp. or other microorganisms ensuring a total
coliform positive and fecal coliform negative result using all USEPA approved methods.
• One to four samples containing Pseudomonas sp. or other microorganisms ensuring a total and
fecal coliform negative result using all USEPA approved methods.
• One to four blank samples.
• Optionally, one sample for the quantitative determination of Heterotrophic Plate Count.
Sample sets for qualitative analysis shall be randomly composed of samples that are Total coliform
absent, Total coliform only present and Fecal coliform (E. coli) present.
E.1.2 CWA Samples
For the CWA, one sample shall be provided for the quantitative determination of Total coliform or
Fecal coliform. Providers may require laboratories to analyze samples during a fixed time period
after sample shipment or at any time during the testing period which shall not exceed the time limit
set in Chapter 2.
E.2.0 SAMPLE PREPARATION AND QUALITY CONTROL
Proficiency test sample providers shall select bacterial strains and holding media that produce the
appropriate biochemical reactions for all approved analytical methods. This shall be documented
by analyses performed by the provider prior to sample shipment. The provider must also
demonstrate that the samples are stable by analysis of a randomly selected set either after the study
closing date or in the case of a study with a fixed testing period, on the last working day of the
testing period.
E.3.0 SCORING
E.3.1 Qualitative Analyses, SDWA Samples
Participating laboratory results shall be considered Acceptable or Unacceptable when compared to
the known presence or absence of Total coliform or Fecal coliform (or E. coli) bacteria. Passing
shall be considered as nine out of ten samples having acceptable results, and no false negatives
reported.
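For illustration only, the Python sketch below applies the scoring rule described above, treating a sample as correct when the reported presence/absence matches the known sample composition; the function name and data are hypothetical.

def sdwa_qualitative_passes(reported_present, known_present):
    """Pass: at least 9 of 10 samples correct and no false negatives."""
    pairs = list(zip(reported_present, known_present))
    correct = sum(reported == known for reported, known in pairs)
    false_negatives = sum((not reported) and known for reported, known in pairs)
    return correct >= 9 and false_negatives == 0

known    = [True, True, False, False, True, False, True, False, False, True]
reported = [True, True, True, False, True, False, True, False, False, True]
print(sdwa_qualitative_passes(reported, known))   # True: one false positive, no false negatives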
E.3.2 Quantitative Analyses
Quantitative result data sets shall be evaluated by analytical method using standard statistical
analysis with outlier rejection. Most Probable Number data shall be transformed to logs prior to
statistical analysis. Acceptable results are those that are within the 99% confidence limits as set by
the mean, standard deviation and set size (n) for their respective data set.
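For illustration only, the Python sketch below evaluates a single quantitative result against 99% confidence limits set from its data set's mean, standard deviation, and size (the set size entering through the degrees of freedom of a t-based interval, which is an interpretive assumption), with MPN data log-transformed first; outlier rejection is not shown, and the scipy dependency, function name, and data are hypothetical.

from math import log10
from statistics import mean, stdev
from scipy.stats import t

def quantitative_result_acceptable(result, data_set, is_mpn=False, confidence=0.99):
    """Acceptable if the result lies within mean +/- t * SD of its data set.
    MPN results are log-transformed before evaluation."""
    values = [log10(v) for v in data_set] if is_mpn else list(data_set)
    x = log10(result) if is_mpn else result
    n = len(values)
    half_width = t.ppf((1 + confidence) / 2, n - 1) * stdev(values)
    m = mean(values)
    return (m - half_width) <= x <= (m + half_width)

# Hypothetical membrane-filtration data set (n = 20, colonies/100 mL).
data = [210, 195, 230, 205, 220, 240, 185, 200, 215, 225,
        190, 235, 210, 198, 222, 208, 217, 226, 193, 231]
print(quantitative_result_acceptable(250, data))   # True: within the limits
print(quantitative_result_acceptable(285, data))   # False: outside the limits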
E.3.2.1 Requirement for Quantitative Data Set Size
Each PT provider's microbiological data set shall be comprised of at least 20 valid data points for
each method evaluated. Sample sets of less than 20 data points may be used only with the
approval of the PTOB/PTPA.
ON-SITE ASSESSMENT
TABLE OF CONTENTS
ON-SITE ASSESSMENT
3.0 ON-SITE ASSESSMENT 1
3.1 INTRODUCTION 1
3.2 ON-SITE ASSESSMENT PERSONNEL 1
3.2.1 Basic Qualifications 1
3.2.2 Assessor Qualification 2
3.2.3 Training 2
3.3 FREQUENCY OF ON-SITE ASSESSMENTS 5
3.3.1 Frequency 5
3.3.2 Follow-up Assessments 5
3.3.3 Changes in Laboratory Capabilities 6
3.3.4 Announced and Unannounced Visits 6
3.4 PRE-ASSESSMENT PROCEDURES 6
3.4.1 Assessment Planning 6
3.4.2 Scope of the Assessment 6
3.4.2.1 Laboratory Assessments 6
3.4.2.2 Records Review 7
3.4.3 Information Collection and Review 7
3.4.4 Assessment Documents 7
3.4.5 Confidential Business Information (CBI) Considerations 8
3.4.6 National Security Considerations 9
3.5 ASSESSMENT SCHEDULE/FORMAT 9
3.5.1 Length of Assessment 9
3.5.2 Opening Conference 9
3.5.3 Records Review 10
3.5.4 Staff Interviews 11
3.5.5 Closing Conference 11
3.5.6 Follow-up and Reporting Procedures 12
3.5.7 Assessment Closure 12
3.6 STANDARDS FOR ASSESSMENT 12
3.6.1 Assessor Training Manual 12
3.6.2 Assessor's Role 13
3.6.3 Checklists 13
3.6.4 Assessment Standards 13
3.7 DOCUMENTATION OF ON-SITE ASSESSMENT 14
3.7.1 Checklists 14
3.7.2 Report Format 14
3.7.3 Distribution 15
3.7.4 Release of Report 15
3.7.5 Record Retention Time 15
3.0 ON-SITE ASSESSMENT
3.1 INTRODUCTION
The on-site assessment is an integral and requisite part of a laboratory accreditation program and
will be one of the primary means of determining a laboratory's capabilities and qualifications. During
the on-site assessment, the assessment team will collect and evaluate information and make
observations which will be used to judge the laboratory's conformance with established accreditation
standards.
It is essential that the on-site assessment conducted by any accrediting authority in the United States
wishing to be recognized by the National Environmental Laboratory Accreditation Program be
conducted in a uniform, consistent manner. Reasons for fostering this consistency include the need to assure the base quality of data coming from the laboratories, to allow more confident comparison of results generated by different laboratories, to facilitate reciprocity, and to promote acceptance of the accreditation standards by the laboratory community.
This section describes the essential elements that are to be included in any acceptable on-site
assessment and the qualifications and requirements for assessors.
The responsibility for promulgating and enforcing occupational safety and health standards rests with
the U.S. Department of Labor. While it is not within the scope of the assessment team to evaluate
all health and safety regulations, any obviously unsafe condition(s) observed should be described
to the appropriate laboratory official and reported to the accrediting authority. The accreditation on-
site assessment is not intended to certify that the laboratory is in compliance with any applicable
health and safety regulations.
3.2 ON-SITE ASSESSMENT PERSONNEL
3.2.1 Basic Qualifications
A laboratory assessor may work for a Federal, State, or a third party assessor body. An assessor
must be an experienced professional and hold at least a Bachelor's degree in a basic science, or
have equivalent education and experience in laboratory assessment or related fields.
Each assessor must also have satisfactorily completed an approved assessor training program. All
assessors must take annual update/refresher training as specified by the NELAC.
Each new candidate assessor must undergo training with a qualified assessor during four or more
actual assessments until judged proficient by the accrediting authority. Assessors employed by
accrediting authorities (either directly or third party) when the authority is granted NELAP recognition
(see section 6.7) are exempt from the requirement to undergo training with a qualified assessor
during four or more actual on-site assessments, provided they have previously conducted four
assessments and been judged proficient by the accrediting authority. Assessors employed by
accrediting authorities on the date that the first Accrediting Authority is granted NELAP recognition
must meet the NELAC-specified basic course requirements within two years and the applicable
technical course requirements within four years of that date.
In addition, the assessors must:
a) Be familiar with the relevant legal regulations, accreditation procedures, and accreditation
requirements;
b) Have a thorough knowledge of the relevant assessment methods and assessment documents;
c) Be thoroughly familiar with the various forms of records described in Section 3.5.3 - Records
review;
d) Be thoroughly cognizant of data reporting, analysis, and reduction techniques and procedures;
e) Be technically knowledgeable and conversant with the specific tests or types of tests for which
the accreditation is sought and, where relevant, with the associated sampling and preservation
procedures; and
f) Be able to communicate effectively, both orally and in writing.
3.2.2 Assessor Qualification
Before an assessor can conduct on-site assessments, the individual must be qualified by an
accrediting authority. Each assessor must sign a statement before conducting an assessment
certifying that no conflict of interest exists and provide any supporting information as required by the
accrediting authority. Failure to provide this information will make the proposed assessor ineligible
to participate in the assessment program.
3.2.3 Training
The National Environmental Laboratory Accreditation Conference (NELAC) specifies the minimum
level of education and training for assessors, including refresher/update training. The NELAC also
develops standards for training requirements. The assessor training program will be implemented by accrediting authorities, assessor bodies, or other entities. All assessor training programs must meet the NELAC standards.
The purpose of the basic assessor training course is to familiarize the assessor with the NELAC
standards and the skills and techniques associated with auditing. The assessor training program
is defined as follows:
NELAC Basic Assessor Training Course
DAY 1
- Basic Auditing Techniques and Skills
DAY 2
- NELAC Overview (Chapter 1 NELAC Standards)
- Accrediting Authority (Chapter 6)
- Accreditation Process (Chapter 4)
- Proficiency Testing (Chapter 2)
DAY 3
- Quality Systems (Chapter 5)
DAY 4
- On-Site Assessment (Chapter 3)
DAY 5
- Course Summary
- Written Examination
NOTE: Until such time as the NELAC has developed the training program for laboratory assessors,
each accrediting authority shall approve the training for each of its assessors (federal, state and/or
third party).
When the NELAC has approved the assessor training program standards, accrediting authorities,
assessor bodies, or other entities may petition for approval of various formal training programs that
address auditing skills which may meet the NELAC standards (Day 1). It is the intent of this chapter
to allow those assessors that produce evidence of successful completion of an approved alternative
training course concerning auditing to be exempt from the analogous NELAC training (Day 1). The
specific training associated with the NELAC standards (Days 2 - 5) is required and must be
successfully completed. All assessor candidates must pass the written examination (Day 5).
In addition to the basic NELAC assessor training, each assessor must successfully complete
additional technical training in up to seven (7) separate analytical disciplines. Each assessor may
pursue recognition in one or more analytical disciplines according to individual wants or needs.
The purpose of the technical training courses is to familiarize the assessor with the scientific
principles, quality systems, record keeping practices, and reporting protocols associated with each
analytical procedure that will confirm the scientific validity and legal defensibility of the data
generated. The technical training program will consist of the following courses:
NELAC Technical Training Courses for Assessors
COURSES
1. Microbiology (2.5 days)
- Bacteriology
- Viruses/Parasites
- Microscopic Particulate Analysis (MPA)
2. Biological (2.5 days)
- Aquatic Toxicity Testing
- Freshwater/Marine/Estuarine Fish
- Freshwater/Marine/Estuarine Ichthyoplankton
- Macrophytes
- Periphyton
- Phytoplankton
- Zooplankton
- Biomass
- Chlorophyll a (Spectrophotometric and Fluorometric)
3. Inorganic - Nonmetals/Misc (2.5 days)
- Spectrophotometric
- Titrimetric
- Potentiometric
- Colorimetric
- TOC/TOX
- Residue/Solids
- COD/BOD
- IR
- IC
4. Inorganic - Metals (2.5 days)
- FAA
- GFAA
- ICP
- ICP/MS
- Sample Preparation (Digestion/TCLP/etc.)
5. Organics (5 days)
- Sample Preparation
- HPLC
- GC
- GC/MS
- Instrument Software
6. Asbestos (2.5 days)
- Bulk
- Air
- Water/TEM (Day 1; assessors not requiring TEM could begin the course on the second day)
7. Radiochemistry (2.5 days)
The purpose for requiring refresher/update training for all assessors is to ensure that the assessors
are aware of changes to the standards and/or approved analytical methodology as they occur and
to enhance and improve skills associated with auditing. Initially, the refresher/update training is
conceptualized as follows:
NELAC Refresher/Update Training for Assessors
Day 1
- Changes to the NELAC Standards and the Resulting Checklist Changes
- Technical Changes Associated with Approved Methodology and the Resulting Checklist Changes
- Auditing Skills and Techniques
- Current Developments
3.3 FREQUENCY OF ON-SITE ASSESSMENTS
3.3.1 Frequency
Accrediting authorities must require a comprehensive on-site assessment of each facility that is
accredited at least every two years. Assessments may be conducted more frequently for cause, at
the option of the accrediting authority.
3.3.2 Follow-up Assessments
In addition to routine assessments, assessors may need to conduct follow-up assessments at
laboratories where a deficiency was identified by the previous assessment. These assessments may
be, but are not necessarily limited to, determining whether a laboratory has corrected its
deficiency(ies), or determining the merit of a formal appeal from the laboratory. When deficiencies
are of such severity as to possibly warrant the downgrading of a laboratory's accreditation status,
any follow-up assessment that is planned or conducted should be completed and reported within
forty-five (45) calendar days after the original assessment.
Nothing in this section should be construed as requiring an accrediting authority to reassess a facility
prior to taking a regulatory or administrative action affecting the status of the facility's accreditation.
Nothing in this section should be construed as limiting in any way the accrediting authority's ability
to revoke or otherwise limit a laboratory's accreditation upon the identification of such deficiencies
as to warrant such action.
3.3.3 Changes in Laboratory Capabilities
The accrediting authority may also deem necessary an assessment when a major change occurs
at a laboratory in personnel, equipment, or in a laboratory's location that might alter or impair
analytical capability and quality.
3.3.4 Announced and Unannounced Visits
The accrediting authority, at its discretion, may conduct either unannounced or announced on-site
assessments. The accrediting authority is not required to provide advance notice of an assessment.
To the maximum extent practical, accrediting authorities, when necessary, shall work with Federal
departments/agencies/contractors to obtain government security clearances for their assessors as
far in advance as possible. Federal departments/agencies/contractors shall facilitate expeditious
attainment of the necessary clearances.
3.4 PRE-ASSESSMENT PROCEDURES
3.4.1 Assessment Planning
A good assessment begins with planning, which should commence well before the assessment team
visits the laboratory. Planning is the means by which the lead assessor identifies all the required
activities to be completed during the assessment process. Planning includes conducting a thorough
review of NELAP and/or State records pertaining to the laboratory to be inspected. This may save
time because familiarity with the operation, history, and compliance status of the laboratory
increases the efficiency and focus of an on-site visit.
Pre-assessment activities include: deciding the scope of the assessment; reviewing NELAP/State
information; providing advance notification of the assessment to the laboratory, when appropriate;
obtaining any security clearances which may be necessary; coordinating the assessment team; and
gathering assessment documents. Section 3.4.5 discusses Confidential Business Information (CBI)
issues.
3.4.2 Scope of the Assessment
The first step in the assessment planning process is deciding what type of assessment will be
conducted. The assessment may be a general one to determine the capability of the laboratory to
perform environmental testing or a specific examination of a certain area of testing. The assessment must include both an appraisal of the laboratory's operations and a review of the appropriate
records. The assessment for a field of testing must cover all of the tests for which the laboratory
seeks accreditation.
3.4.2.1 Laboratory Assessments
A laboratory assessment must review the ability of the lab to conduct environmental testing. The
examination of the systems, processes and procedures of the laboratory should give a general sense
of its past and present capabilities to perform work of known and documented quality. During a
laboratory assessment, the assessment team may identify a number of samples or a recently
completed or on-going project and evaluate to what extent the tests are being conducted according
to NELAC standards.
3.4.2.2 Records Review
The purpose of a records review is to determine whether the testing laboratory has maintained
necessary documentation of data and other information to technically substantiate reports previously
issued. During a records review, the assessment team will conduct an overall audit of data and will
compare data with submitted reports to determine whether the data were collected, generated, and
reported following the NELAC standards.
3.4.3 Information Collection and Review
Prior to initiating an on-site assessment, the assessment team shall make determinations as to
which laboratory records they wish to review prior to the actual site visit. These records, from the
files of the accrediting authority, the national laboratory accreditation database, or the laboratory
itself may include, but are not limited to:
a) Copies of previous assessment reports and proficiency testing sample results;
b) General laboratory information such as laboratory submitted self-assessment forms, SOPs and
Quality Assurance Plan(s);
c) Official laboratory communications and associated records with appropriate accrediting authority staff;
d) Available documents from recipients of reports from the laboratory;
e) The laboratory's application for accreditation;
f) The existing program regulations and special requirements that apply to the areas for which
accreditation is sought (e.g., security clearances, radioactive exposure protocols, etc.); and
g) The most recently approved analytical methods for the tests for which the laboratory has
requested accreditation.
3.4.4 Assessment Documents
Documents necessary for the assessment and which may need to be provided to the laboratory
management or staff should be assembled before the assessment, whenever possible. The lead
assessor should obtain copies of the required assessment forms, including the appropriate
checklist(s) as documented in the NELAC Assessor Training Manual. Other types of documents that
may be required include:
- Assessment Confidentiality Notice;
- Conflict of Interest Form;
- Assessor Credentials;
- Assessment Assignment(s);
- Assessment Notification Letter;
- Attendance Sheet(s) (opening and closing conference); and,
- Assessment Appraisal Form.
In addition, the lead assessor should be able to provide information about how to obtain copies of
documents and materials associated with an assessment from the accrediting authority.
3.4.5 Confidential Business Information (CBI) Considerations
During on-site assessments, on-site assessors may come into possession of information claimed as
business confidential. The EPA regulations for handling confidential business information are
detailed in Title 40, Code of Federal Regulations, Part 2, Subpart B, and will be followed in NELAP
related matters. Subpart B defines a business confidentiality claim as "a claim or allegation that
business information is entitled to confidential treatment for reasons of business confidentiality or
a request for a determination that such information is entitled to such treatment."
NELAC standards must, consistent with 40 CFR Part 2, protect Confidential Business Information
(CBI) from disclosure. For this information to be adequately protected, certain actions are required,
by NELAP, on-site assessors and the laboratory. The lead assessor must provide a NELAP
assessment confidentiality notice to the responsible laboratory official at the beginning of the
assessment. This notice informs laboratory officials of their right to claim any portion of the information requested during the assessment as CBI. NELAP personnel, assessors and other users of said information must have CBI training. The assessors should be familiar with the procedures for asserting a CBI claim and for handling information claimed
as CBI. The lead assessor must take custody of all CBI information before leaving the laboratory,
and must maintain it in custody, using all proper procedures and safeguards, until it can be received by the accrediting authority, which must also treat such information as CBI, until an official
determination has been made in accordance with Federal and State laws.
Certain actions are required of the responsible laboratory official when claiming information as
business confidential. The laboratory representative must place on (or attach to) the information at
the time it is submitted to the assessor, a cover sheet, stamped or typed legend, or other suitable
form of notice, employing language such as "trade secret", "proprietary" or "company confidential".
Allegedly confidential portions of otherwise non-confidential information should be clearly identified
by the business, and may be submitted separately to facilitate identification and handling by the
assessor. CBI may be purged of references to client identity by the responsible laboratory official
at the time of removal from the laboratory. However, sample identifiers may not be obscured from
the information. If the information claimed as business confidential suggests the need for further
action, the information may be forwarded to the appropriate agency which may take further action
outside the scope of the accreditation process, to obtain the client's identity. If the information
claimed as business confidential suggests the need for further enforcement action, the accrediting
authority is responsible for ensuring that all CBI issues are handled in accordance with NELAC
standards.
If a business confidentiality claim is received after the on-site assessment by the accrediting
authority, the authority should make such efforts as are administratively practical to associate the
late claim with copies of the previously submitted information in its files. However, the accrediting
authority cannot assure that such efforts will be effective in light of the possibility of prior disclosure
or dissemination of the information.
It is not the responsibility of the on-site assessor to make any determination with respect to the
validity of a confidential business information claim; this responsibility rests with the accrediting
authority. The assessor must maintain custody of CBI-claimed information collected during the
assessment until it is delivered to an authorized official of the accrediting authority. CBI-
claimed information may be the intellectual property of the laboratory. Therefore, all CBI-claimed
information must be held in a secure manner throughout the holding period of assessment records
and may not be reproduced or distributed in a manner inconsistent with 40 CFR Part 2. If the accrediting
authority questions the claim that certain information is CBI, the host laboratory must be contacted
and given twenty-one (21) calendar days to:
(1) provide justification of their claim to CBI,
(2) remove the claim of CBI,
(3) resolve the issue in a manner agreeable to both the laboratory and the accrediting authority,
(4) engage legal assistance,
(5) appeal the action to NELAP, or
(6) withdraw their NELAC accreditation application for the field of testing associated with the
CBI information.
In no instance may the accrediting authority declassify CBI-claimed information without notification
of the laboratory. If the responsible laboratory official does not consent to declassification of the
CBI-claimed information, the laboratory may pursue any or all of the above stated actions.
3.4.6 National Security Considerations
Assessors performing assessments at facilities owned and/or operated by Federal
departments/agencies/contractors may need security clearances, appropriate badging, and/or a
security briefing before proceeding with the on-site assessment. Assessors shall be informed in
writing of any information, including analytical data, that is controlled for national security reasons
and cannot be released to the public.
3.5 ASSESSMENT SCHEDULE/FORMAT
3.5.1 Length of Assessment
The length of an on-site assessment will depend upon a number of factors such as the number of
tests for which a laboratory desires accreditation, the number of assessors available, the size of the
laboratory, the number of problems encountered during the assessment, and the cooperativeness
of the laboratory staff. The assessor body should assign an adequate number of assessors to
complete the assessment within a reasonable period of time. Assessors must strike a balance
between thoroughness and practicality, but in all cases must determine to what extent the
laboratory's operations meet NELAC standards.
3.5.2 Opening Conference
Arrival at the facility should normally occur during established working hours. The responsible
laboratory official(s) should be located as soon as the assessment team arrives on the premises.
A laboratory's refusal to admit the assessment team for an assessment will result in automatic
denial of accreditation or loss of the laboratory's existing accreditation, unless there are extenuating
circumstances that are accepted and documented by the accrediting authority. The team leader
must notify the accrediting authority as soon as possible after refusal of entry.
An opening conference must be conducted and shall address the following topics:
a) the purpose of the assessment;
b) the identification of the assessment team;
c) the tests that will be examined;
d) any pertinent records and operating procedures to be examined during the assessment and the
names of the individuals in the laboratory responsible for providing the assessment team with
the necessary documentation;
e) the roles and responsibilities of key managers and staff in the laboratory;
f) the procedures related to Confidential Business Information;
g) any special safety procedures that the laboratory may think necessary for the protection of the
assessment team while in certain parts of the facility (under no circumstance is an assessment
team required, or even allowed, to sign any waiver of responsibility on the part of the laboratory
for injuries incurred by a team member during an inspection in order to gain access to the facility);
h) the standards that will be used by the assessors in judging the adequacy of the laboratory
operation;
i) confirmation of the tentative time for the exit conference;
j) provision of the assessment appraisal form to the responsible laboratory official (to be submitted
to NELAP and the accrediting authority); and
k) discussion of any questions the laboratory may have about the assessment process.
3.5.3 Records Review
Records will be reviewed by assessment team members for accuracy, completeness and the use
of proper methodology for each test and analyte to be evaluated.
A minimum record set that must be examined as part of an accreditation assessment includes:
a) application for accreditation from the laboratory;
b) previous assessment results and reports including proficiency testing results;
c) laboratory management structure and chains of responsibility (e.g. organizational charts);
d) qualifications statements of all key staff involved in the analysis or reporting of results for which
accreditation has been requested and a matching of the staff qualifications with the statements
submitted with the applications;
e) quality assurance plan(s) for the laboratory;
f) standard operating procedures and methodologies for each parameter for which accreditation
is sought;
g) maintenance and calibration records of laboratory equipment and instrumentation;
h) procedures for the make-up and calibration of stock solutions and standard reagents;
i) origins, purities, assays and expiration dates of primary standards, analytical reagents and
standard reference materials;
j) records associated with method-specific QA/QC requirements;
k) the specific records associated with the initial method validation study in the laboratory, which
must be examined in detail along with the historical calibration data;
l) records associated with the methods used to estimate precision and accuracy in general for
specific analyses;
m) sample receipt and handling documentation;
n) proficiency testing sample receipt and handling procedures;
o) information about the proficiency testing providers;
p) records of any internal audits conducted or corrective actions taken by the laboratory itself; and
q) documentation of the laboratory's annual and/or ongoing management review.
The laboratory must mark all confidential information. The lead assessor must handle it as required
by appropriate laws and regulations. All other information for all aspects of application, assessment
and accreditation of laboratories is considered public information. If the laboratory requests that
information other than that noted above be treated as confidential, the information should be treated
as confidential until a ruling can be made by the accrediting authority.
3.5.4 Staff Interviews
As an element of the assessment process, the assessment team should evaluate an analysis
regimen by requesting that the analyst normally conducting the procedure give a step-by-step
description of exactly what is done and what equipment and supplies are needed to complete the
regimen. Any deficiencies shall be noted and discussed with the analyst. The deficiencies will also
be discussed in the closing conference.
The assessment team members shall have the authority to conduct interviews with any/all staff.
Calculations, data transfers, calibration procedures, quality control/assurance practices, adherence
to SOPs and report preparation shall be assessed for each test with the appropriate analyst(s).
3.5.5 Closing Conference
The assessment team must meet with representative(s) of the laboratory following the assessment
for an informal debriefing and discussion of findings, with the possible exception of any issues of
improper and/or potentially illegal activity that may be the subject of further action. It should be
noted that the assessment team in no way limits its ability to identify additional problem areas in the
final report should it become necessary.
In the event the laboratory disagrees with the findings of the assessor(s), and the team leader
adheres to the original findings, the deficiencies with which the laboratory takes exception shall be
documented by the team leader and included in the report to the accreditation authority for
consideration. The accrediting authority will make the final determination as to the validity of the
contested elements.
The assessment team should inform the laboratory representative(s) that an assessment report
encompassing all relevant information concerning the ability of the applicant laboratory to comply
with the accreditation requirements is forthcoming.
3.5.6 Follow-up and Reporting Procedures
The accrediting authority or its authorized third party must present a deficiency report to the
laboratory within thirty (30) calendar days of the assessment. The laboratory will have thirty (30)
calendar days from the date of receipt of the report to provide a corrective action report to the
accrediting authority (Chapter 4, Section 4.1.3). An exception to these deadlines may be necessary
in those circumstances where a possible enforcement investigation or other action has been
initiated.
3.5.7 Assessment Closure
After reviewing the assessor's report(s) and any completed corrective action(s) reported by the
laboratory, the accrediting authority will make the determination of the accreditation status for a
laboratory.
If the deficiencies listed are substantial or numerous, an additional on-site assessment may be
conducted before a final decision for accreditation can be made.
3.6 STANDARDS FOR ASSESSMENT
3.6.1 Assessor Training Manual
The NELAC Assessor Training Manual is available on the NELAC Bulletin Board and will be
provided at all NELAC assessor training courses. The manual will be used when assessors take
the NELAC required training (Section 3.2.3) and will serve as a reference for on-site assessment
personnel.
The manual for on-site assessors shall include guidance for evaluating the following items:
a) Size, appearance, and adequacy of the laboratory facility;
b) Organization and management of the laboratory;
c) Qualifications and experience of laboratory personnel;
d) Receipt, tracking and handling of samples;
e) Listing/inventory, condition, and performance of laboratory instrumentation and equipment;
f) Source, traceability and preparation of calibration/verification standards;
g) Test methods (including the adequacy of the laboratory's standard operating procedures as well
as confirmation of the analyst's adherence to SOPs, and the analyst's proficiency with the
described task);
h) Data reduction procedures, including an examination of raw data and confirmation that final
reported results are derived from raw data and original observations;
i) Quality assurance/quality control procedures, including adherence to the laboratory's quality
assurance plan and adequacy of the plan.
3.6.2 Assessor's Role
When performing an on-site laboratory assessment, the assessor must appraise each of the areas
listed in Section 3.6.1 and perform a thorough assessment of the records for each of the tests for
which accreditation has been requested.
The on-site assessor should use a variety of tools in the assessment process. The experience of
the assessor, his/her observations, interviews with laboratory staff, and examination of SOPs, raw
data, and the laboratory's documentation all play important roles in the assessment. The
accreditation of a particular laboratory will depend to a large extent on the assessment team's
findings and recommendations. Much of the on-site assessment will depend upon the assessor's
observations of existing conditions. The recommendation not to accredit a laboratory, or to change
a laboratory's accreditation status, must be based on factual information and not upon subjective
evaluations. Therefore, it is crucial that the on-site assessor have a clear understanding of the
laboratory's procedures and policies and that the assessor document any deficiencies in the report
of the on-site assessment.
The assessment team must use specific documentation in its reporting of deficiencies. The assessor
should discuss any deficiencies with the laboratory's management at the exit conference.
During the assessment, sufficient information may become available to suspect that a particular
person has violated an environmental law or regulation, such as knowingly making a false statement
on a report. This information should be carefully documented since further action may be
necessary. In the event that evidence indicates that improper and/or potentially illegal activities have
or may have occurred, the assessment team should present such information to the accrediting authority
for appropriate action(s). These issues, at the discretion of the accrediting authority, may or may
not be raised at the closing conference. However, the assessor should continue to
gather the information necessary to complete the accreditation assessment.
3.6.3 Checklists
Standardized checklists, as documented in the NELAP Assessor Training Manual, must be used for
the on-site assessment. The use of checklists does not replace the need for assessor observations
and staff interviews, but is another tool which assists in conducting a thorough and efficient
assessment. A checklist is not a substitute for assessor training and experience.
3.6.4 Assessment Standards
The areas to be evaluated in an on-site assessment shall include:
a) Size, appearance, and adequacy of the laboratory facility;
b) Organization and management of the laboratory;
c) Qualifications and experience of laboratory personnel;
d) Receipt, tracking and handling of samples;
e) Quantity, condition, and performance of laboratory instrumentation and equipment;
f) Preparation and traceability of calibration standards;
g) Test methods (including the adequacy of the laboratory's standard operating procedures as well
as confirmation of the analyst(s)' adherence to SOPs, and the analyst(s)' proficiency with the
described task);
h) Data reduction procedures, including an examination of raw data and confirmation that final
reported results can be traced to the raw data/original observations;
i) Quality assurance/quality control procedures, including adherence to the laboratory's quality
assurance plan(s) and adequacy of the plan(s).
These areas must be evaluated against the standards detailed in Chapter 5, Quality Systems, of
the NELAC Standards and the appropriate method references. Additional information on the
process for evaluating these areas can be found in the Assessor Training Manual.
3.7 DOCUMENTATION OF ON-SITE ASSESSMENT
3.7.1 Checklists
The checklists used by the assessors during the assessment shall become a part of the permanent
file kept by the accrediting authority for each laboratory.
3.7.2 Report Format
The final site visit report shall be written to contain a description of the adequacy of the laboratory
as it relates to the assessment standards in Section 3.6.4. Assessment reports should be generated
in a narrative format. At a minimum, deficiencies must be addressed. Documentation of existing
conditions at the laboratory should be included in each report to serve as a baseline for future
contacts with the facility.
Assessment reports will contain the following elements (an illustrative record sketch follows this list):
a) Identification of the organization assessed (name and address),
b) Date of the assessment,
c) Identification and affiliation of each assessment team member,
d) Identification of participants in the assessment process,
e) Statement of the objective of the assessment,
f) Summary,
g) Assessment findings (deficiencies) and requirements, and
h) Comments and recommendations.
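The eight elements above form a fixed record. As an illustration only (not part of the NELAC standard), the following minimal sketch shows how an accrediting authority's tracking system might represent such a report; all class and field names are assumptions made for this sketch.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Finding:
    """One deficiency, referenced to the NELAC standard stating the requirement."""
    deficiency: str
    nelac_reference: str   # e.g. the standards section that was not met

@dataclass
class AssessmentReport:
    organization_name: str        # a) identification of the organization assessed
    organization_address: str     # a) address
    assessment_date: date         # b) date of the assessment
    team_members: List[str]       # c) identification and affiliation of each assessor
    participants: List[str]       # d) participants in the assessment process
    objective: str                # e) statement of the objective
    summary: str                  # f) summary
    findings: List[Finding] = field(default_factory=list)   # g) findings and requirements
    comments: str = ""            # h) comments and recommendations
```

Recording each finding together with its NELAC reference makes it straightforward to satisfy the cross-referencing expectation described below for the Findings and Requirements Section.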
The Findings and Requirements Section must be referenced to the NELAC standards so that both
the finding (deficiency) is understood and the specific requirement is outlined. The team leader shall
assure that the results within the final report conform to established standards for the evaluated
parameters.
The Comments and Recommendations Section can be used to convey recommendations aimed at
helping the laboratory improve.
3.7.3 Distribution
The accrediting authority shall be recognized as having the responsibility for the distribution of the
assessment reports. The assessment team leader shall compile, edit and submit the final report to
the accrediting authority.
3.7.4 Release of Report
On-site assessment reports should be released initially by the accrediting authority only. The reports
will be released to the responsible laboratory official(s). The assessment report shall not be released
to the National Accreditation Database and the public until findings of the assessment and the
corrective actions have been finalized, all Confidential Business Information and information related
to national security has been stricken from the report in accordance with prescribed procedures, and
the report has been provided to the laboratory (Section 4.1.3).
In accordance with the Freedom of Information requirements, any documentation adjudged to be
proprietary, financial and/or trade information, or relevant to an ongoing enforcement investigation,
will be considered exempt from release to the public.
3.7.5 Record Retention Time
Copies of all assessment reports, checklists, and laboratory responses must be retained by the
assessors and the accrediting authority for a period of at least ten (10) years, or longer if required
by specific State or Federal regulations.
-------
NELAC
Accreditation Process
Revision 9
July 2, 1998
Page i of i
TABLE OF CONTENTS
ACCREDITATION PROCESS
4.0 ACCREDITATION PROCESS 1
4.1 COMPONENTS OF ACCREDITATION 1
4.1.1 Personnel Qualifications 1
4.1.2 On-Site Assessments 3
4.1.3 Corrective Action Reports In Response to On-Site Assessment 4
4.1.4 Proficiency Testing Samples 4
4.1.5 Accountability for Analytical Standards 5
4.1.6 Fee Process for National Accreditation 5
4.1.7 Application 5
4.1.8 Change of Ownership and/or Location of Laboratory 6
4.1.9 "Certification of Compliance" Statement 7
4.2 PERIOD OF ACCREDITATION 8
4.3 MAINTAINING ACCREDITATION 8
4.3.1 Quality Systems 8
4.3.2 Notification and Reporting Requirements 9
4.3.3 Record Keeping and Retention 9
4.4 DENIAL, SUSPENSION, AND REVOCATION OF ACCREDITATION 9
4.4.1 Denial 9
4.4.2 Suspension 10
4.4.3 Revocation 10
4.4.4 Voluntary Withdrawal 11
4.5 INTERIM ACCREDITATION 11
4.5.1 Interim Accreditation 11
4.5.2 Revocation of Interim Accreditation 12
4.6 AWARDING OF ACCREDITATION 12
4.6.1 The Certificate of Accreditation 12
4.6.2 Use of NELAC Accreditation by Accredited Laboratories 12
4.6.3 Changes in Fields of Testing 12
4.7 ENFORCEMENT 13
4.0 ACCREDITATION PROCESS
(NB. MANY OF THE STANDARDS AND ELEMENTS LISTED IN THIS CHAPTER ARE
REFLECTIVE OF STANDARDS SET FORTH IN CHAPTERS DEALING WITH DETAILED
EXPLANATIONS OF THESE ELEMENTS. THEREFORE, IT IS ANTICIPATED THAT SOME OF
THE DETAILS MAY CHANGE AS THE DISCUSSIONS AND CONCLUSIONS IN THESE
CHAPTERS CHANGE.)
4.1 COMPONENTS OF ACCREDITATION
The components of accreditation include review of personnel qualifications, on-site assessments,
proficiency testing, and quality assurance/quality control standards. These criteria must be fulfilled
for accreditation. The components and criteria are herein described. Details of some of the
requirements described below will be found in other sections of these Standards.
4.1.1 Personnel Qualifications
Persons who do not meet the education credential requirements of 4.1.1.1 of the NELAC standards
and are the technical director(s) on the date that the laboratory becomes subject to these NELAC
Standards shall qualify as technical director(s) of that laboratory or any other NELAC-accredited
laboratory if that laboratory can demonstrate the ability to comply with the Accrediting Authority's
proficiency testing and quality control requirements and the person possesses the requisite experience.
4.1.1.1. Definition, Technical Director(s)
The technical director(s) means a full-time member of the staff of an environmental laboratory who
exercises actual day-to-day supervision of laboratory procedures and reporting of results. The title
of such person may include but is not limited to laboratory director, technical director, laboratory
supervisor or laboratory manager. A laboratory may appoint one or more technical directors for the
appropriate fields of testing for which it is seeking accreditation. His/her name must appear in
the national database. This person's duties shall include, but not be limited to, monitoring standards
of performance in quality control and quality assurance; monitoring the validity of the analyses
performed and data generated in the laboratory to assure reliable data; ensuring that sufficient
numbers of qualified personnel are employed to supervise and perform the work of the laboratory;
and providing educational direction to laboratory staff. An individual shall not be the technical
director(s) of more than one accredited environmental laboratory without authorization from the
primary Accrediting Authority. Circumstances to be considered in the decision to grant such
authorization shall include, but not be limited to, the extent to which operating hours of the
laboratories to be directed overlap, adequacy of supervision in each laboratory, and the availability
of environmental laboratory services in the area served. The technical director(s) who is absent for
a period of time exceeding 15 consecutive calendar days shall designate another full-time staff
member meeting the qualifications of the technical director(s) to temporarily perform this function.
If this absence exceeds 65 consecutive calendar days, the primary accrediting authority shall be
notified in writing.
The qualifications of the technical director(s) are as follows (an illustrative eligibility sketch appears after the list):
a) The technical director(s) of an accredited environmental laboratory engaged in chemical
analysis shall be a person with a bachelor's degree in the chemical, environmental, biological
sciences, physical sciences or engineering, with at least 24 college semester credit hours in
chemistry and at least two years of experience in the environmental analysis of representative
inorganic and organic analytes for which the laboratory is seeking approval. A master's or
doctoral degree in one of the above disciplines may be substituted for one year of experience.
b) The technical director(s) of an accredited environmental laboratory limited to inorganic
chemical analysis, other than metals analysis, shall be a person with at least an earned
associate's degree in the chemical, physical or environmental sciences, or two years of
equivalent and successful college education, with a minimum of 16 college semester credit
hours in chemistry. In addition, such a person shall have at least two years of experience
performing such analysis.
c) The technical director(s) of an accredited environmental laboratory engaged in microbiological
or biological analysis shall be a person with a bachelor's degree in microbiology, biology,
chemistry, environmental sciences, physical sciences or engineering with a minimum of 16
college semester credit hours in general microbiology and biology and at least two years of
experience in the environmental analysis of representative analytes for which the laboratory is
seeking approval. A master's or doctoral degree in one of the above disciplines may be
substituted for one year of experience.
A person with an associate's degree in an appropriate field of the sciences or applied sciences,
with a minimum of four college semester credit hours in general microbiology may be the
technical director(s) of a laboratory engaged in microbiological analysis limited to fecal coliform,
total coliform and standard plate count. Two years of equivalent and successful college
education, including the microbiology requirement, may be substituted for the associate's
degree. In addition, each person shall have one year of experience in environmental analysis.
d) The technical director(s) of an accredited environmental laboratory engaged in radiological
analysis shall be a person with a bachelor's degree in chemistry, physics or engineering with
24 college semester credit hours of chemistry and two or more years of experience in the
radiological analysis of environmental samples. A master's or doctoral degree in one of the
above disciplines may be substituted for one year of experience.
The technical director(s) of an accredited environmental laboratory engaged in microscopic
examination of asbestos and/or airborne fibers shall meet the following requirements:
i) For procedures requiring the use of a transmission electron microscope, a bachelor's
degree, successful completion of courses in the use of the instrument, and one year
of experience, under supervision, in the use of the instrument. Such experience shall
include the identification of minerals.
ii) For procedures requiring the use of a polarized light microscope, an associate's degree
or two years of college study, successful completion of formal coursework in polarized
light microscopy, and one year of experience, under supervision, in the use of the
instrument. Such experience shall include the identification of minerals.
iii) For procedures requiring the use of a phase contrast microscope, as in the
determination of airborne fibers, an associate's degree or two years of college study,
documentation of successful completion of formal coursework in phase contrast
microscopy, and one year of experience, under supervision, in the use of the
instrument.
e) The technical director(s) of an accredited environmental laboratory engaged in the examination
of radon in air shall have at least an associate's degree or two years of college and one year of
experience in radiation measurements, including at least one year of experience in the
measurement of radon and/or radon progeny.
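The degree, credit-hour, and experience minimums above are threshold rules. The sketch below applies only the chemical-analysis case in item a); the function, its arguments, and the list of qualifying degree fields are illustrative assumptions and carry no regulatory weight.

```python
# Illustrative only: the chemical-analysis qualifications in item a) above.
QUALIFYING_FIELDS = {"chemical", "environmental", "biological sciences",
                     "physical sciences", "engineering"}

def meets_chemistry_director_minimums(degree_field: str,
                                      degree_level: str,
                                      chemistry_credit_hours: int,
                                      years_experience: float) -> bool:
    """Bachelor's degree (or higher) in a listed discipline, at least 24 college
    semester credit hours in chemistry, and at least two years of relevant
    experience; a master's or doctoral degree substitutes for one year."""
    if degree_level.lower() not in ("bachelor", "master", "doctoral"):
        return False
    if degree_field.lower() not in QUALIFYING_FIELDS:
        return False
    if chemistry_credit_hours < 24:
        return False
    required_years = 2.0
    if degree_level.lower() in ("master", "doctoral"):
        required_years -= 1.0   # advanced degree substitutes for one year of experience
    return years_experience >= required_years

# A master's-level chemist with 30 credit hours and 1.5 years of experience qualifies:
print(meets_chemistry_director_minimums("chemical", "master", 30, 1.5))   # True
```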
4.1.1.2 Personnel Qualification Clarifications and Exceptions
a) Notwithstanding any other provision of this section, a full-time employee of a drinking water or
sewage treatment facility who holds a valid treatment plant operator's certificate appropriate to
the nature and size of such facility shall be deemed to meet the educational and experience
requirements for serving as the director of an accredited laboratory devoted exclusively to the
examination of environmental samples taken within such facility. Such accreditation for a water
treatment facility and/or a sewage treatment facility shall be limited to the scope of that facility's
regulatory permit.
b) A full-time employee of an industrial waste treatment facility with a minimum of one year of
experience under supervision in environmental analysis shall be deemed to meet the
requirements for serving as the director of an accredited laboratory devoted exclusively to the
examination of environmental samples taken within such facility for the scope of that facility's
regulatory permit.
4.1.2 On-Site Assessments
On-site assessments are a requirement of the Accreditation Process; a summary of the process
requirements is provided below. Refer to On-Site Assessment (Chapter 3) for additional information
regarding frequency, procedures, criteria, scheduling and documentation of on-site assessments.
On-Site assessments shall be of two types: announced and unannounced. The on-site assessment
of each accredited laboratory must be performed a minimum of one time per two years. On-site
assessments may be conducted more frequently for cause or at the option of the primary accrediting
authority. Situations which might trigger more frequent on-site assessments include review of a
previously deficient on-site assessment, poor performance on a PT sample, change in other
accreditation elements, or other information concerning the capabilities or practices of the accredited
laboratory. The on-site assessment ensures that the environmental laboratory is in compliance with
NELAC standards.
Responsibility and accountability for meeting the NELAC standards rest with the primary
accrediting authority. The primary accrediting authority has the responsibility for conducting
on-site assessments for national accreditation based on the following factors:
a) Individual sites are subject to the same application process, assessments and other
requirements as environmental laboratories. Any remote laboratory sites are considered
separate sites and subject to separate on-site assessments, again provided that the analysis or
any portion of the analysis takes place at that site. A mobile laboratory owned by an accredited
fixed based laboratory which is equipped with instrumentation to address a temporary situation,
not to exceed 90 calendar days, and is performing a subset of analyses for which the parent
laboratory is accredited, is considered an extension of the parent laboratory and will not require
separate accreditation. A location that performs only sample collection is not considered an
environmental laboratory and shall not be subject to these requirements;
b) The assessment may consist of all of the fields of testing and/or methods for which the
laboratory wants to obtain accreditation;
c) The laboratory may be required to analyze PT samples during the on-site assessment under the
observation of an assessor;
d) The number of assessors conducting the on-site assessment should be appropriate for the
laboratory's scope and testing.
e) The on-site assessment should be conducted during normal working hours.
Laboratories shall be furnished with a report documenting any deficiencies found by the assessor.
This shall be known as an assessment report. All such reports are public record and any or all of
the information contained therein may be put into the National Database, except as noted in Section
3.4.6, National Security Considerations.
4.1.3 Corrective Action Reports In Response to On-Site Assessment
A corrective action report must be submitted by the laboratory to the primary accrediting authority
in response to any assessment report received by the laboratory after an on-site assessment. The
corrective action report shall include the action that the laboratory shall implement to correct each
deficiency and the time period required to accomplish the corrective action. (The 30-day deadlines
in items a) through e) below are illustrated in the sketch following this list.)
a) The primary accrediting authority shall present an assessment report to the laboratory within
30 calendar days of the on-site assessment.
b) After being notified of deficiencies, the laboratory shall have 30 calendar days from the date
of receipt of the assessment report to provide a corrective action report.
c) The primary accrediting authority shall respond to the action noted in the corrective action
report within 30 calendar days of receipt.
d) If the corrective action report (or a portion) is deemed unacceptable to remediate a deficiency,
the laboratory shall have an additional 30 calendar days to submit a revised corrective action
report.
e) If the corrective action report is not acceptable to the primary accrediting authority after the
second submittal, the laboratory shall have accreditation revoked pursuant to Section 4.4.3
for all or any portion of its scope of accreditation for any or all of a field of testing, a method,
or analyte within a field of testing.
f) All information included and documented in an assessment report and the corrective action
report is considered to be public information and is to be released pursuant to Chapter 3,
section 3.7.4. Other accrediting authorities participating in the NELAP would have access to
this information through a national database.
g) If the laboratory fails to implement the corrective actions as stated in their corrective action
report, accreditation for fields of testing, specific methods, or analytes within those fields of
testing shall be revoked. All such deficiency and corrective action reports are public record and
any or all of the information contained therein may be put into the national database. Proprietary
data and Confidential Business Information and classified national security information will be
excluded from all public records.
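As noted above, the 30-calendar-day deadlines in items a) through e) are straightforward date arithmetic. The following sketch computes the first two deadlines; the function name, argument names, and example dates are assumptions for illustration only.

```python
from datetime import date, timedelta

def corrective_action_deadlines(assessment_date: date,
                                report_received_by_lab: date) -> dict:
    """Items a) and b): the assessment report is due to the laboratory within 30
    calendar days of the on-site assessment, and the laboratory's corrective action
    report is due within 30 calendar days of its receipt of the assessment report.
    (Items c) through e) run from later receipt dates and follow the same pattern.)"""
    return {
        "assessment_report_due": assessment_date + timedelta(days=30),
        "corrective_action_report_due": report_received_by_lab + timedelta(days=30),
    }

deadlines = corrective_action_deadlines(date(1998, 7, 2), date(1998, 7, 20))
print(deadlines["assessment_report_due"])         # 1998-08-01
print(deadlines["corrective_action_report_due"])  # 1998-08-19
```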
4.1.4 Proficiency Testing Samples
A critical component of laboratory assessments is the analysis of proficiency testing (PT) samples.
Refer to Proficiency Testing (Chapter 2) for additional information. PT samples are used and
evaluated in the accreditation process as follows:
a) Each laboratory seeking accreditation must receive and analyze initial PT samples from a
NELAP approved PT study provider for each field of testing (program-method-analyte) in which
they are requesting accreditation.
b) Unless otherwise specified by the proficiency testing standard, each laboratory seeking or
maintaining accreditation shall be required to perform analyses on one PT sample twice per year
in each field of testing (program-method-analyte) for which they have applied for accreditation
or for which they are currently accredited.
c) The laboratory shall be informed of its score on the PT samples by the primary accrediting
authority or the NELAP approved PT provider within 21 calendar days from the closing date of
submission. The results of all of the PT sample tests including "pass" or "fail" shall be part of
the public record. The result of passing or failing a PT sample shall apply to all accredited
methods that the laboratory employs for an analyte within that matrix.
d) When a laboratory initially requests accreditation, it must successfully analyze two sets of PT
samples, the analyses to be performed 30 calendar days apart. Each set shall contain one
sample for each requested field of testing (program-method-analyte). Once a laboratory has
been granted accreditation status, it must maintain a history of at least two passing results out
of the most recent three for each field of testing (program-method-analyte), as illustrated in the
sketch following this list.
e) The results of the PT sample analyses shall be considered by the primary accrediting authority
in determining whether accreditation should be granted, denied, revoked, or suspended pursuant
to this Chapter, for a field of testing (program-method-analyte) or an analyte within a field of
testing (program-method-analyte).
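The ongoing history requirement in item d) reduces to a count over the most recent results for each field of testing. A minimal sketch follows, assuming a chronological list of pass/fail outcomes; the function name and data representation are illustrative, not part of the standard.

```python
from typing import Sequence

def pt_history_acceptable(results: Sequence[bool]) -> bool:
    """True if at least two of the most recent three PT results for a field of
    testing (program-method-analyte) are passes (True = pass, False = fail)."""
    recent = list(results)[-3:]
    return sum(recent) >= 2

print(pt_history_acceptable([True, True, False, True]))   # True  (last three: pass, fail, pass)
print(pt_history_acceptable([True, False, False]))        # False (only one pass in the last three)
```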
4.1.5 Accountability for Analytical Standards
Elements in NELAP that shall ensure consistency and promote the use of quality assurance/quality
control procedures to generate quality data for regulatory purposes are:
a) In accordance with Chapter 5, each laboratory seeking NELAP accreditation shall have a
named quality assurance officer or a person designated as accountable for data quality.
b) NELAC requires that each laboratory seeking NELAP accreditation have a developed and
maintained Quality Assurance Manual on-site, as required in Chapter 5. The primary accrediting
authority may request the manual prior to the on-site assessment.
c) The primary accrediting authority shall consider that the accountability for negligence or the
falsification of data shall rest upon the analyst, the laboratory management and the company.
4.1.6 Fee Process for National Accreditation
Refer to Policy and Structure, Chapter 1, specifically funding of this program (Section 1.5.2.3.3).
Where required and if applicable, the level and timing of fee payments shall be established by the
primary accrediting authority to which the laboratory is applying for accreditation. Additional fees
on the laboratory may be levied by other secondary accrediting authorities with which the laboratory
chooses to do business.
4.1.7 Application
The NELAP encompasses a standardized set of elements in each application for accreditation that
shall be reported to and recorded in the national database. The application package includes any
specific state regulatory requirements that are essential for accreditation within an individual state.
An accrediting authority participating in NELAC shall include in its application form the following (an illustrative completeness check follows the list):
a) Legal name of laboratory
b) Laboratory mailing address
c) Billing address (if different from b)
d) Name of owner
e) Address of owner
f) Location (full address) of laboratory
g) Name and phone number of technical director(s), however named, and the lead technical
director (if applicable)
h) Name and phone number of Quality Assurance Officer
i) Name and phone number of laboratory contact person
j) Laboratory hours of operation
k) Primary Accrediting Authority
l) Fields of Testing for which the laboratory is requesting accreditation
m) Methods employed including analytes
n) Description of laboratory type (for example):
- Commercial
- Federal
- Hospital or health care
- State
- Academic Institutes
- Public water system
- Public wastewater system
- Industrial (an industry with discharge permits)
- Mobile
- Other (Describe)
o) Certification of compliance by laboratory management
(see Section 4.1.9 below)
p) Fee enclosed (if applicable)
q) Description of geographical location
r) FAX number
s) Lab identification number (for renewal)
t) Quality Manual
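Because failure to submit a completed application is itself grounds for denial (Section 4.4.1), an accrediting authority might screen submissions against the elements above. The sketch below is a hypothetical completeness check; the dictionary keys and helper function are assumptions, and conditional items such as the billing address, fee, and renewal identification number are intentionally omitted.

```python
# Hypothetical completeness screen over the application elements listed above.
REQUIRED_ELEMENTS = [
    "legal_name", "mailing_address", "owner_name", "owner_address",
    "laboratory_location", "technical_directors", "qa_officer",
    "contact_person", "hours_of_operation", "primary_accrediting_authority",
    "fields_of_testing", "methods_and_analytes", "laboratory_type",
    "certification_of_compliance", "geographical_description",
    "fax_number", "quality_manual",
]

def missing_application_elements(application: dict) -> list:
    """Return the required elements that are absent or empty; failure to submit a
    completed application is a reason for denial under Section 4.4.1 a) 1)."""
    return [item for item in REQUIRED_ELEMENTS if not application.get(item)]

application = {"legal_name": "Example Environmental Labs", "quality_manual": "QM Rev. 3"}
print(missing_application_elements(application))   # everything still outstanding
```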
A laboratory seeking renewal of accreditation shall follow the process outlined by the accrediting
authority by which it is currently accredited.
4.1.8 Change of Ownership and/or Location of Laboratory
Accreditation may be transferred when the legal status or ownership of an accredited laboratory
changes without affecting its staff, equipment, and organization. The primary accrediting authority
may charge a transfer fee and may conduct an on-site assessment to verify the effects of such
changes on laboratory performance.
The following conditions apply to the change in ownership and/or the change in location of a
laboratory that has national accreditation.
a) Any change in ownership and/or location of an accredited laboratory must be reported in writing
to the primary accrediting authority and entered into the national database by the primary
accrediting authority.
b) Such a change in ownership and/or location shall not necessarily require reaccreditation or
reapplication in any or all of the categories in which the laboratory is currently accredited.
c) Change in ownership and/or location may require an on-site assessment with the elements of
the assessment being determined by the assessor.
d) Any change in ownership must assure historical traceability of the laboratory accreditation
number(s).
e) For a change in ownership, the following conditions must be in effect:
1. The previous (transferring) owner must agree in writing, before the transfer of ownership
takes place, to be accountable and liable for any analyses, data and reports generated up
to the time of legal transfer of ownership; and
2. The buyer (transferee) must agree in writing to be accountable and liable for any analyses,
data and reports generated after the legal transfer of ownership occurs.
3. All records and analyses performed pertaining to accreditation must be kept for a minimum
of 10 years and are subject to inspection by the accrediting authorities during this period
without prior notification to the laboratory. This stipulation is applicable regardless of
change in ownership, accountability or liability.
4. If ownership is transferred, the transferee may not be responsible for payment of fees to the
accrediting authorities during the remainder of the yearly period, provided that the previous
owner has fully paid the required fees to the accrediting authorities.
4.1.9 "Certification of Compliance" Statement
The following "Certification of Compliance" statement must accompany the application for laboratory
accreditation. It must be signed and dated by both the laboratory management and the quality
assurance officer, or other designated person, for that laboratory.
CERTIFICATION BY APPLICANT
The applicant understands and acknowledges that the laboratory is required to be continually in
compliance with the National Environmental Laboratory Accreditation Conference (NELAC)
standards and shall be subject to the penalty provisions provided therein.
The applicant understands and acknowledges that accreditation is specifically subject to
unannounced assessments.
Authorized representatives of any primary accrediting authority may make an announced or
unannounced assessment, search, or examination of an accredited or interim accredited laboratory
whenever the primary accrediting authority, at its discretion, considers such an assessment, search
or examination necessary to determine the extent of the laboratory's compliance with the NELAC
standards. Additionally, the applicant authorizes the primary accrediting authority assessor to; 1)
make copies of any analyses or records relevant to the accreditation process, and 2) remove any
or all such copies from the laboratory for purposes of assessment or compliance with the NELAC
standards. Any refusal to allow entry to the primary accrediting authority's representatives during
normal business hours or to allow copies of records relevant to laboratory accreditation to be made
shall constitute a violation of a condition of accreditation and grounds for denial, suspension, or
revocation of accreditation.
The applicant hereby certifies that all accredited environmental analyses performed are done in
accordance with the NELAC standards.
I hereby certify that I am authorized to sign this application on behalf of the applicant/owner and that
there are no misrepresentations in my answer to the questions on this application.
Signature Quality Assurance Officer Name of Quality Assurance Officer
or other designated individual
Print Name of Applicant Laboratory Date
(Legal Name)
Signature Name
Technical Director(s) Technical Director(s)
4.2 PERIOD OF ACCREDITATION
For a laboratory in good standing, the period of accreditation within fields of testing for methods
or analytes shall be 12 months and will be considered to be ongoing once a laboratory has been
accredited for that field of testing, method, or analyte within a field of testing. To maintain
accreditation the laboratory shall meet the requirements of Section 4.3, Maintaining Accreditation.
Failure to meet the requirements delineated in Section 4.3 shall constitute grounds for suspension
or revocation of accreditation as specified in Section 4.4. Additionally, failure to pay the required
fees as determined by the accrediting authority within the stipulated deadlines or by the stipulated
dates shall result in revocation of accreditation. This information may be entered into the national
database in a timely and effective manner. The NELAP recognizes that different accrediting
authorities operate the yearly period with different start times. The individual laboratory being
accredited is responsible for tracking an accrediting authority's period of accreditation and is
responsible for paying the necessary fees (if applicable) to those accrediting authorities to maintain
accreditation.
4.3 MAINTAINING ACCREDITATION
Accreditation remains in effect until revoked by the accrediting authority, withdrawn at the written
request of the accredited laboratory, or until expiration of the accreditation period. To maintain
accreditation, the accredited laboratory shall complete or comply with elements 4.3.1 to 4.3.3.
Failure to complete or comply with these elements shall be cause for suspending or revoking
accreditation as specified in section 4.4 of this chapter.
4.3.1 Quality Systems
Laboratories seeking accreditation under NELAP must assure consistency and promote the use of
quality assurance/quality control procedures. Chapter 5, Quality Systems provides the details
concerning quality assurance and quality control requirements for the evaluation of laboratories.
The quality assurance policies, which establish essential quality control procedures, are applicable
to all environmental laboratories regardless of size, volume of business and fields of testing. Failure
to maintain, revise, or replace any of these key components may be cause for suspending or
revoking a laboratory's accreditation status, as specified in section 4.4 of this chapter.
4.3.2 Notification and Reporting Requirements
The accredited laboratory shall notify the accrediting authority of any changes in key accreditation
criteria within 30 calendar days of the change. This written notification requirement includes, but is
not limited to, changes in laboratory ownership, location, key personnel, and major instrumentation. All such
updates are public record and any or all of the information contained therein may be put into the
national database.
4.3.3 Record Keeping and Retention
All laboratory records associated with accreditation parameters shall meet the requirements of
Chapter 5, Section 5.12 and shall be maintained for a minimum of five years unless otherwise
designated for a longer period in another regulation or authority. In the case of data used in litigation,
the laboratory is required to store such records for a longer period upon written notification from the
accrediting authority.
4.4 DENIAL, SUSPENSION, AND REVOCATION OF ACCREDITATION
4.4.1 Denial
Denial - shall mean to refuse to accredit in total or in part a laboratory applying for initial
accreditation or resubmission of initial application.
a) Reasons to deny an initial application shall include:
1) Failure to submit a completed application.
2) Failure of laboratory staff to meet the personnel qualifications as required by the NELAC
standards. These qualifications shall include education, training and experience
requirements.
3) Failure to successfully analyze and report proficiency testing samples as required by the
NELAC standards, Chapter 2.
4) Failure to respond to an assessment report from the on-site assessment with a corrective
action report within the required 30 calendar days after receipt of the assessment report.
5) Failure to implement the corrective actions detailed in the corrective action report within the
time frame as specified by the primary accrediting authority.
6) Failure to pay required fees.
7) Failure to pass required on-site assessment(s) as specified in the NELAC standards,
Chapter 3.
8) Misrepresentation of any fact pertinent to receiving or maintaining accreditation.
9) Denial of entry during normal business hours for an on-site assessment as required by the
NELAC standards, Chapter 3.
b) If the laboratory is not successful in correcting the deficiencies as required by the NELAC
standards, the laboratory must wait six months before again reapplying for accreditation.
c) Upon reapplication, the laboratory may again be responsible for all or part of the fees as
applicable incurred as part of the initial application for accreditation.
d) No laboratory's accreditation shall be denied without the right to due process.
4.4.2 Suspension
Suspension - shall mean the temporary removal of a laboratory's accreditation for a defined period
of time which shall not exceed six months. The purpose of suspension is to allow a laboratory time
to correct deficiencies or areas of non-compliance with the NELAC standards.
a) A laboratory's accreditation shall be suspended in total or in part. The laboratory shall retain
accreditation for the fields of testing, methods and analytes where it continues to meet the
requirements of the NELAC standards.
b) Reasons for suspension shall include:
1) If the primary accrediting authority finds during the on-site assessment that the public
interest, safety or welfare imperatively requires emergency action;
2) Failure to complete proficiency testing studies and maintain a history of at least two
successful proficiency testing studies for each affected accredited field of testing out of the
three most recent proficiency testing studies as defined in NELAC, Chapter 2;
3) Failure to notify the primary accrediting authority of any changes in key accreditation
criteria, as set forth in Section 4.3.2 of this Chapter.
c) A suspended laboratory cannot continue to analyze samples for the affected fields of testing for
which it holds accreditation.
d) The laboratory's suspended accreditation status will change to accredited when the laboratory
demonstrates to the primary accrediting authority that the laboratory complies with the NELAC
standards.
e) A suspended laboratory would not have to reapply for accreditation if the cause/causes for
suspension are corrected within six months.
f) If the laboratory fails to correct the causes of suspension within six months after the effective
date of the suspension, the primary accrediting authority shall revoke in total or part the
laboratory's accreditation.
g) No laboratory's accreditation shall be suspended without the right to due process as set forth
by the primary accrediting authority.
4.4.3 Revocation
Revocation - shall mean the partial or total withdrawal of a laboratory's accreditation by the
accrediting authority.
a) The accrediting authority shall revoke a laboratory's accreditation, in part or in total, for failure
to correct the deficiencies as set forth in Section 4.1.3 e) of this Chapter and failure to correct
the reasons for being suspended. The laboratory shall retain accreditation for the fields of
testing, methods and analytes where it continues to meet the requirements of the NELAC
standards.
b) Reasons for revocation in part or in total include a laboratory's:
1) Failure to submit an acceptable corrective action report in response to an assessment
report and failure to implement corrective action(s) related to any deficiencies found during
a laboratory assessment. The laboratory may submit two corrective action reports within the
time limits specified in section 4.1.3.
2) After being suspended due to failure of proficiency testing samples, if the laboratory's
analysis of the next proficiency testing study results in three consecutively failed proficiency
testing studies, the laboratory's accreditation shall be revoked for each affected accredited field of testing
as defined in NELAC Chapter 2.
c) Reasons for total revocation include a laboratory's:
1) Failure to respond with a corrective action report within the required 30 calendar days.
2) Failure to participate in the proficiency testing program as required by the NELAC
standards, Chapter 2.
3) Submittal of proficiency test sample results generated by another laboratory as its own.
4) Misrepresentation of any material fact pertinent to receiving initial approval.
5) Denial of entry during normal business hours for an on-site assessment as required by the
NELAC standards, Chapter 3.
6) Conviction of charges relating to the falsification of any report relating to a laboratory
analysis.
7) Failure to remit the accreditation fees, if applicable, within the time limit as established by
the accrediting authority shall be grounds for immediate revocation.
d) After correcting the reason/cause for total revocation, the laboratory may reapply for
accreditation no sooner than 6 months from the official date of revocation.
e) No laboratory's accreditation shall be revoked without the right to due process.
4.4.4 Voluntary Withdrawal
If an environmental laboratory wishes to withdraw from NELAP, in total or in part, it must notify the
primary accrediting authority no later than 30 calendar days before the end of the accreditation year.
4.5 INTERIM ACCREDITATION
4.5.1 Interim Accreditation
If a laboratory completes all of the requirements for accreditation except that of an on-site
assessment because the accrediting authority is unable to schedule the assessment in a timely
manner, the accrediting authority may issue an interim accreditation. Interim accreditation shall
allow a laboratory to perform analyses and report results with the same status as an accredited
laboratory until the on-site assessment requirements have been completed. Interim accreditation
status shall not exceed twelve months. The interim accreditation status is a matter of public record
and shall be entered into the National Database.
4.5.2 Revocation of Interim Accreditation
Revocation of interim accreditation may be initiated for due cause as described in Section 4.4.3 by order
of the primary accrediting authority.
4.6 AWARDING OF ACCREDITATION
When a participating laboratory has met the requirements specified for receiving accreditation, the
laboratory shall receive a certificate awarded on behalf of the accrediting authority. The certificate
shall provide the following information: the name of the laboratory, address of the laboratory, the
specifications of the accreditation action (for example, the laboratory may be accredited for analysis
of water or for use of a specific analytical methodology, etc.). Addenda or attachments to the
certificate are allowed and shall be considered to be official documents. Information on the
addenda or attachments may include scope, methods, analytes, etc. The laboratory must have a
certificate for each state or Federal Department/Agency in which it is accredited. Even though a
parent laboratory is accredited, the subfacilities (laboratories operating under the same parent
organization, analytical procedures, and quality assurance system) are inspected or processed
separately and shall be issued their own Certificate of Accreditation. Any subfacilities or remote
laboratory sites are considered separate sites and subject to separate announced and unannounced
assessments, again provided that the analysis or any portion of the analysis takes place at that site.
4.6.1 The Certificate of Accreditation
The certificate shall be signed by a member of the accrediting authority and shall be considered
an official document. It will be transmitted as a sealed and dated (effective date and expiration
date) document containing the NELAC Insignia. The certificate shall include the specific fields of
testing, analytes, and methods for which the laboratory or subfacility is accredited.
The certificate shall explain that continued accredited status depends on successful ongoing
participation in the program. The certificate shall urge a customer to verify the laboratory's current
accreditation standing within a particular state. The certificate must be returned to the accrediting
authority upon loss of accreditation. However, this does not require the return of a certificate which
has simply expired (reached the expiration date).
4.6.2 Use of NELAC Accreditation by Accredited Laboratories
An accredited laboratory shall not misrepresent its NELAP accredited fields of testing, methods,
analytes, or its NELAP accreditation status on any document. This includes laboratory reports,
catalogs, advertising, business solicitations, proposals, quotations or other materials (pursuant to
NELAC Chapter 6.8).
4.6.3 Changes in Fields of Testing
If an accredited laboratory changes its scope of accreditation, a new certificate shall be issued
which details the laboratory's accreditation(s).
4.7 ENFORCEMENT
Since NELAC is a standard-setting body, it cannot enforce civil or criminal penalties; rather, all
enforcement actions are taken independently by the accrediting authorities.
The enforcement component of the accrediting authorities should be based on explicit values, or
principles, with which all participants concur. The proposed basic principles are:
a) The program should be equitable to all participants;
b) The rules should be well publicized;
c) The program needs of the participating agencies must be upheld; and
d) The due process rights of participating laboratories must be protected.
NELAC
Quality Systems
Revision 9
July 2, 1998
Page i of iii
TABLE OF CONTENTS
QUALITY SYSTEMS
5.0 QUALITY SYSTEMS 1
5.1 SCOPE 1
5.2 REFERENCES 1
5.3 DEFINITIONS 2
5.4 ORGANIZATION AND MANAGEMENT 2
5.4.1 Legal Definition of Laboratory 2
5.4.2 Organization 2
5.5 QUALITY SYSTEM - ESTABLISHMENT, AUDITS, ESSENTIAL QUALITY CONTROLS AND
DATA VERIFICATION 3
5.5.1 Establishment 3
5.5.2 Quality Manual 4
5.5.3 Audits 5
5.5.3.1 Internal Audits 5
5.5.3.2 Managerial Review 6
5.5.3.3 Audit Review 6
5.5.3.4 Performance Audits 6
5.5.3.5 Corrective Actions 6
5.5.4 Essential Quality Control Procedures 7
5.6 PERSONNEL 8
5.6.1 General Requirements for Laboratory Staff 8
5.6.2 Laboratory Management Responsibilities 8
5.6.3 Records 9
5.7 PHYSICAL FACILITIES - ACCOMMODATION AND ENVIRONMENT 9
5.7.1 Environment 9
5.7.2 Work Areas 10
5.8 EQUIPMENT AND REFERENCE MATERIALS 10
5.9 MEASUREMENT TRACEABILITY AND CALIBRATION 11
5.9.1 General Requirements 11
5.9.2 Traceability of Calibration 11
5.9.3 Reference Standards 11
5.9.4 Calibration 12
5.9.4.1 General Requirements 12
5.9.4.2 Acceptance Criteria for Support Equipment 12
5.9.4.3 Instrument Calibrations 13
5.9.4.4 Calibration Verification 14
5.10 TEST METHODS AND STANDARD OPERATING PROCEDURES 15
5.10.1 Methods Documentation 15
5.10.1.1 Standard Operating Procedures (SOPs) 15
5.10.1.2 Laboratory Method Manual(s) 15
5.10.2 Test Methods 16
5.10.2.1 Method Validation/Initial Demonstration of Capability 16
5.10.3 Sample Aliquots 17
5.10.4 Data Verification 17
5.10.5 Documentation and Labeling of Standards and Reagents 17
5.10.6 Computers and Electronic Data Related Requirements 18
5.11 SAMPLE HANDLING, SAMPLE ACCEPTANCE POLICY AND SAMPLE RECEIPT 18
5.11.1 Sample Tracking 18
5.11.2 Sample Acceptance Policy 19
5.11.3 Sample Receipt Protocols 19
5.11.4 Storage Conditions 21
5.11.5 Sample Disposal 21
5.12 RECORDS 21
5.12.1 Record Keeping System and Design 22
5.12.2 Records Management and Storage 22
5.12.3 Laboratory Sample Tracking 23
5.12.3.1 Sample Handling 23
5.12.3.2 Laboratory Support Activities 24
5.12.3.3 Analytical Records 24
5.12.3.4 Administrative Records 24
5.12.4 Legal or Evidentiary Custody 25
5.12.4.1 Basic Requirements 25
5.12.4.2 Required Information in Custody Records 26
5.12.4.3 Controlled Access to Samples 26
5.12.4.4 Transfer of Samples to Another Party 26
5.12.4.5 Sample Disposal 26
5.13 LABORATORY REPORT FORMAT AND CONTENTS 27
5.14 SUBCONTRACTING ANALYTICAL SAMPLES 29
5.15 OUTSIDE SUPPORT SERVICES AND SUPPLIES 29
5.16 COMPLAINTS 29
Appendix A - REFERENCES 1
Appendix B - DEFINITIONS FOR QUALITY SYSTEMS 1
Appendix C - INITIAL DEMONSTRATION OF CAPABILITY 1
C.1 PROCEDURE FOR INITIAL DEMONSTRATION OF CAPABILITY 1
C.2 CERTIFICATION STATEMENT 2
Appendix D - ESSENTIAL QUALITY CONTROL REQUIREMENTS 1
D.1 CHEMICAL TESTING 1
D.1.1 Positive and Negative Controls 1
D.1.2 Analytical Variability/Reproducibility 2
D.1.3 Method Evaluation 2
D.1.4 Method Detection Limits 2
D.1.5 Data Reduction 3
D.1.6 Quality of Standards and Reagents 3
D.1.7 Selectivity 3
D.1.8 Constant and Consistent Test Conditions 4
D.2 WHOLE EFFLUENT TOXICITY 4
D.2.1 Positive and Negative Controls 4
D.2.2 Variability and/or Reproducibility 5
D.2.3 Accuracy 5
D.2.4 Test Sensitivity 5
D.2.5 Selection of Appropriate Statistical Analysis Methods 5
D.2.6 Selection and Use of Reagents and Standards 6
D.2.7 Selectivity 6
D.2.8 Constant and Consistent Test Conditions 6
D.3 MICROBIOLOGY 7
D.3.1 Positive and Negative Controls 8
D.3.2 Test Variability/Reproducibility 8
D.3.3 Method Evaluation 8
D.3.4 Test Performance 9
D.3.5 Data Reduction 9
D.3.6 Quality of Standards, Reagents and Media 9
D.3.7 Selectivity 10
D.3.8 Constant and Consistent Test Conditions 10
Figure D-1. USE OF REFERENCE CULTURES (BACTERIA) 12
D.4 RADIOCHEMICAL ANALYSIS 13
D.4.1 Negative Controls 13
D.4.2 Positive Controls 13
D.4.3 Test Variability/Reproducibility 14
D.4.4 Other Quality Control Measures 14
D.4.5 Method Evaluation 15
D.4.6 Radiation Measurement System Calibration 15
D.4.7 Method Detection Limits 16
D.4.8 Data Reduction 16
D.4.9 Quality of Standards and Reagents 17
D.4.10 Constant and Consistent Test Conditions 17
D.5 AIR TESTING 18
Appendix E - PERFORMANCE BASED MEASUREMENT SYSTEM 1
E.1 CHECKLIST OVERVIEW 1
5.0 QUALITY SYSTEMS
INTRODUCTION
Quality Systems include all quality assurance (QA) policies and quality control (QC) procedures,
which shall be delineated in a Quality Manual and followed to ensure and document the quality of
the analytical data. Laboratories seeking accreditation under NELAP must assure implementation
of all QA policies and the essential applicable QC procedures specified in this chapter. The QA
policies, which establish essential QC procedures, are applicable to environmental laboratories
regardless of size and complexity.
The intent of this Chapter is to provide sufficient detail concerning quality management requirements
so that all accrediting authorities evaluate laboratories consistently and uniformly.
NELAC is committed to the use of Performance Based Measurement Systems (PBMS) in
environmental testing and provides the foundation for PBMS implementation in these standards.
While this standard may not currently satisfy all the anticipated needs of PBMS, NELAC will address
future needs within the context of state statutory and regulatory requirements and the finalized EPA
implementation plans for PBMS.
Chapter 5 is organized according to the structure of ISO/IEC Guide 25, 1990. Where deemed
necessary, specific areas within this Chapter may contain more information than specified by
ISO/IEC Guide 25.
All items identified in this chapter shall be available for on-site inspection or data audit.
5.1 SCOPE
a) This Standard sets out the general requirements that a laboratory must demonstrate it meets in its operations if it is to be recognized as competent to carry out specific environmental tests.
b) This standard includes additional requirements and information for assessing competence or for determining compliance by the organization or accrediting authority granting the recognition (or approval).
If more stringent standards or requirements are included in a mandated test method or by
regulation, the laboratory shall demonstrate that such requirements are met. (See the
supplemental accreditation requirements in Section 1.8.2.)
c) This Standard is for use by environmental testing laboratories in the development and implementation of their quality systems. It shall also be used by accrediting authorities in assessing the competence of environmental laboratories.
5.2 REFERENCES
See Appendix A
5.3 DEFINITIONS
The relevant definitions from ISO/IEC Guide 2, ISO 8402, ANSI/ASQC E-4, 1994, the EPA "Glossary of Quality Assurance Terms and Acronyms", and the International Vocabulary of Basic and General Terms in Metrology (VIM) are applicable, the most relevant being quoted in Appendix B together with further definitions applicable for the purposes of this Standard.
See Appendix B
5.4 ORGANIZATION AND MANAGEMENT
5.4.1 Legal Definition of Laboratory
The laboratory shall be legally identifiable. It shall be organized and shall operate in such a way that its permanent, temporary and mobile facilities meet the requirements of this Standard.
5.4.2 Organization
The laboratory shall:
a) have managerial staff with the authority and resources needed to discharge their duties;
b) have processes to ensure that its personnel are free from any commercial, financial and other
undue pressures which might adversely affect the quality of their work;
c) be organized in such a way that confidence in its independence of judgment and integrity is
maintained at all times;
d) specify and document the responsibility, authority, and interrelationship of all personnel who
manage, perform or verify work affecting the quality of calibrations and tests;
Such documentation shall include:
1) a clear description of the lines of responsibility in the laboratory, proportioned such that adequate supervision is ensured; and
2) job descriptions for all positions.
e) provide supervision by persons familiar with the calibration or test methods and procedures, the
objective of the calibration or test and the assessment of the results. The ratio of supervisory
to non-supervisory personnel shall be such as to ensure adequate supervision;
f) have a technical director(s) (however named) who has overall responsibility for the technical
operation of the environmental testing laboratory;
The technical director(s) shall certify that personnel with appropriate educational and/or
technical background perform all tests for which the laboratory is accredited. Such certification
shall be documented.
The technical director(s) shall meet the requirements specified in the Accreditation Process (see 4.1.1.1).
g) have a quality assurance officer (however named) who has responsibility for the quality system
and its implementation. The quality assurance officer shall have direct access to the highest
level of management at which decisions are taken on laboratory policy or resources, and to the
technical director. Where staffing is limited, the quality assurance officer may also be the
technical director or deputy technical director;
The quality assurance officer (and/or his/her designees) shall:
1) serve as the focal point for QA/QC and be responsible for the oversight and/or review of
quality control data;
2) have functions independent from laboratory operations for which they have quality
assurance oversight;
3) be able to evaluate data objectively and perform assessments without outside (e.g.,
managerial) influence;
4) have documented training and/or experience in QA/QC procedures and be knowledgeable
in the quality system as defined under NELAC;
5) have a general knowledge of the analytical test methods for which data review is performed;
6) arrange for or conduct internal audits on the entire technical operation annually; and
7) notify laboratory management of deficiencies in the quality system and monitor corrective
action.
h) nominate deputies in case of absence of the technical director(s) and/or quality assurance
officer;
i) have documented policy and procedures to ensure the protection of clients' confidential
information and proprietary rights (this may not apply to in-house laboratories);
j) when available, participate in inter-laboratory comparisons and proficiency testing programs.
For purposes of qualifying for and maintaining accreditation, each laboratory shall participate
in a proficiency test program as outlined in Chapter 2.0.
5.5 QUALITY SYSTEM - ESTABLISHMENT, AUDITS, ESSENTIAL QUALITY CONTROLS AND
DATA VERIFICATION
5.5.1 Establishment
The laboratory shall establish and maintain a quality system based on the required elements
contained in this chapter and appropriate to the type, range and volume of environmental testing
activities it undertakes.
a) The elements of this quality system shall be documented in the organization's quality manual.
b) The quality documentation shall be available for use by the laboratory personnel.
c) The laboratory shall define and document its policies and objectives for, and its commitment to
accepted laboratory practices and quality of testing services.
d) The laboratory management shall ensure that these policies and objectives are documented in
a quality manual and communicated to, understood, and implemented by all laboratory
personnel concerned.
e) The quality manual shall be maintained current under the responsibility of the quality assurance
officer.
5.5.2 Quality Manual
The quality manual, and related quality documentation, shall state the laboratory's policies and
operational procedures established in order to meet the requirements of this Standard.
The Quality Manual shall list on the title page: a document title; the laboratory's full name and
address; the name, address (if different from above), and telephone number of individual(s)
responsible for the laboratory; the name of the quality assurance officer (however named); the
identification of all major organizational units which are to be covered by this quality manual; and the effective date of the version.
The quality manual and related quality documentation shall also contain:
a) a quality policy statement, including objectives and commitments, by top management;
b) the organization and management structure of the laboratory, its place in any parent
organization and relevant organizational charts;
c) the relationship between management, technical operations, support services and the quality
system;
d) procedures to ensure that all records required under this Chapter are retained, as well as
procedures for control and maintenance of documentation through a document control system
which ensures that all standard operating procedures, manuals, or documents clearly indicate
the time period during which the procedure or document was in force;
e) job descriptions of key staff and reference to the job descriptions of other staff;
f) identification of the laboratory's approved signatories; at a minimum, the title page of the Quality Manual must have the signed concurrence (with appropriate titles) of all responsible parties including the QA officer, technical director, and the agent who is in charge of all laboratory activities such as the laboratory director or laboratory manager;
g) the laboratory's procedures for achieving traceability of measurements;
h) a list of all test methods under which the laboratory performs its accredited testing;
i) mechanisms for ensuring that the laboratory reviews all new work to ensure that it has the
appropriate facilities and resources before commencing such work;
j) reference to the calibration and/or verification test procedures used;
k) procedures for handling submitted samples;
l) reference to the major equipment and reference measurement standards used as well as the facilities and services used by the laboratory in conducting tests;
m) reference to procedures for calibration, verification and maintenance of equipment;
n) reference to verification practices including interlaboratory comparisons, proficiency testing
programs, use of reference materials and internal quality control schemes;
o) procedures to be followed for feedback and corrective action whenever testing discrepancies
are detected, or departures from documented policies and procedures occur;
p) the laboratory management arrangements for exceptionally permitting departures from
documented policies and procedures or from standard specifications;
q) procedures for dealing with complaints;
r) procedures for protecting confidentiality (including national security concerns), and proprietary
rights;
s) procedures for audits and data review;
t) processes/procedures for establishing that personnel are adequately experienced in the duties
they are expected to carry out and/or receive any needed training;
u) reference to procedures for reporting analytical results; and
v) a Table of Contents, and applicable lists of references and glossaries, and appendices.
5.5.3 Audits
5.5.3.1 Internal Audits
The laboratory shall arrange for annual internal audits to verify that its operations continue to
comply with the requirements of the laboratory's quality system. Such audits shall be carried out
by the quality assurance officer or designee(s) who are trained and qualified as auditors, and who
are, wherever possible, independent of the activity to be audited. Where the audit findings cast
doubt on the correctness or validity of the laboratory's calibrations or test results, the laboratory shall
take immediate corrective action and shall immediately notify, in writing, any client whose work may
have been affected.
5.5.3.2 Managerial Review
At least once per year, the laboratory management shall conduct a review of its quality system and
its testing and calibration activities to ensure its continuing suitability and effectiveness and to
introduce any necessary changes or improvements in the quality system and laboratory operations.
The review shall take account of reports from managerial and supervisory personnel, the outcome
of recent internal audits, assessments by external bodies, the results of interlaboratory comparisons
or proficiency tests, any changes in the volume and type of work undertaken, feedback from clients,
corrective actions and other relevant factors. The laboratory shall have a procedure for review by
management and maintain records of review findings and actions.
5.5.3.3 Audit Review
All audit and review findings and any corrective actions that arise from them shall be documented.
The laboratory management shall ensure that these actions are discharged within the agreed time
frame.
5.5.3.4 Performance Audits
In addition to periodic audits, the laboratory shall ensure the quality of results provided to clients by
implementing checks to monitor the quality of the laboratory's analytical activities. Examples of
such checks are:
a) internal quality control procedures using, whenever possible, statistical techniques (see 5.5.4 below);
b) participation in proficiency testing or other interlaboratory comparisons (See Chapter 2.0);
c) use of certified reference materials and/or in-house quality control using secondary reference
materials as specified in Section 5.5.4;
d) replicate testings using the same or different test methods;
e) re-testing of retained samples;
f) correlation of results for different parameters of a sample (for example, total phosphorus should
be greater than or equal to orthophosphate).
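The parameter-correlation check in item f) lends itself to simple automation. The following is a minimal illustrative sketch in Python and is not part of this Standard; the rule set and result keys are hypothetical and would be defined in each laboratory's own procedures.

# Illustrative only: automated cross-parameter consistency checks (see 5.5.3.4.f).
# The rules and result keys below are hypothetical examples.
def check_result_consistency(results):
    """Return a list of flags for results that violate expected relationships."""
    flags = []
    # Total phosphorus should be greater than or equal to orthophosphate.
    if "total_phosphorus" in results and "orthophosphate" in results:
        if results["total_phosphorus"] < results["orthophosphate"]:
            flags.append("total phosphorus is less than orthophosphate")
    # Total solids should be greater than or equal to total dissolved solids.
    if "total_solids" in results and "total_dissolved_solids" in results:
        if results["total_solids"] < results["total_dissolved_solids"]:
            flags.append("total solids is less than total dissolved solids")
    return flags

# Example: this sample would be flagged for review.
print(check_result_consistency({"total_phosphorus": 0.8, "orthophosphate": 1.2}))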
5.5.3.5 Corrective Actions
a) In addition to providing acceptance criteria and specific protocols for corrective actions in the
Method Standard Operating Procedures (see 5.10.1.1), the laboratory shall implement general
procedures to be followed to determine when departures from documented policies, procedures
and quality control have occurred. These procedures shall include but are not limited to the
following:
1) identify the individual(s) responsible for assessing each QC data type;
2) identify the individual(s) responsible for initiating and/or recommending corrective actions;
3) define how the analyst should treat a data set if the associated QC measurements are
unacceptable;
4) specify how out-of-control situations and subsequent corrective actions are to be
documented; and
5) specify procedures for management (including the QA officer) to review corrective action
reports.
b) To the extent possible, samples shall be reported only if all quality control measures are
acceptable. If a quality control measure is found to be out of control, and the data is to be
reported, all samples associated with the failed quality control measure shall be reported with the appropriate data qualifier(s).
5.5.4 Essential Quality Control Procedures
The following general quality control principles shall apply, where applicable, to all testing
laboratories. The manner in which they are implemented is dependent on the types of tests
performed by the laboratory (e.g., chemical, microbiological, radiological) and is further described
in Appendix D. The standards for any given test type shall assure that the applicable principles are
addressed:
a) All laboratories shall have protocols (as required in Section 5.10.1.1) in place to monitor the
following quality controls:
1) Adequate positive and negative controls to monitor tests such as blanks, spikes, reference
toxicants;
2) Adequate tests to define the variability and/or repeatability of the laboratory results such
as replicates;
3) Measures to assure the accuracy of the test method including sufficient calibration and/or
continuing calibrations, use of certified reference materials, proficiency test samples, or
other measures;
4) Measures to evaluate test method capability, such as method detection limits and quantitation limits or range of applicability such as linearity;
5) Selection of appropriate formulae to reduce raw data to final results such as regression
analysis, comparison to internal/external standard calculations, and statistical analyses;
6) Selection and use of reagents and standards of appropriate quality;
7) Measures to assure the selectivity of the test for its intended purpose; and
8) Measures to assure constant and consistent test conditions (both instrumental and
environmental) where required by the test method such as temperature, humidity, light, or
specific instrument conditions.
b) All quality control measures shall be assessed and evaluated on an on-going basis, and quality
control acceptance criteria shall be used to determine the usability of the data (See
Appendix D).
c) The laboratory shall have procedures for the development of acceptance/rejection criteria where
no method or regulatory criteria exist. (See 5.11.2, Sample Acceptance Policy.)
d) The quality control protocols specified by the laboratory's method manual (5.10.1.2) shall be
followed. The laboratory shall ensure that the essential standards outlined in Appendix D are incorporated into its method manuals.
The essential quality control measures for testing are found in Appendix D of this chapter.
5.6 PERSONNEL
5.6.1 General Requirements for Laboratory Staff
The laboratory shall have sufficient personnel with the necessary education, training, technical knowledge and experience for their assigned functions.
All personnel shall be responsible for complying with all quality assurance/quality control
requirements that pertain to their organizational/technical function. Each technical staff member
must have a combination of experience and education to adequately demonstrate a specific
knowledge of their particular function and a general knowledge of laboratory operations, analytical
test methods, quality assurance/quality control procedures and records management.
5.6.2 Laboratory Management Responsibilities
In addition to 5.4.2.d, the laboratory management shall be responsible for:
a) Defining the minimal level of qualification, experience and skills necessary for all positions in
the laboratory. In addition to education and/or experience, basic laboratory skills such as using
a balance, colony counting, aseptic or quantitative techniques shall be considered;
b) Ensuring that all technical laboratory staff have demonstrated initial and ongoing proficiency in
the activities for which they are responsible. Such demonstration shall be documented;
c) Ensuring that the training of its personnel is kept up-to-date by the following:
1) Evidence must be on file that demonstrates that each employee has read, understood, and
is using the latest version of the laboratory's in-house quality documentation, which relates
to his/her job responsibilities.
2) Training courses or workshops on specific equipment, analytical techniques or laboratory
procedures shall all be documented.
3) Analyst training shall be considered up to date if an employee file contains a certification
that technical personnel have read, understood and agreed to perform the most recent
version of the test method (the approved method or standard operating procedure) and
documentation of continued proficiency by at least one of the following once per year:
i. Acceptable performance of a blind sample (single blind to the analyst);
ii. Another initial demonstration of method performance;
iii. Successful analysis of a blind performance sample on a similar test method using
the same technology (e.g., GC/MS volatiles by purge and trap for 524.2, 624 or
5035/8260) would only require documentation for one of the test methods;
iv. At least four consecutive laboratory control samples with acceptable levels of
precision and accuracy;
v. If i-iv cannot be performed, analysis of authentic samples that have been analyzed
by another trained analyst with statistically identical results.
d) Documenting all analytical and operational activities of the laboratory;
e) Supervising all personnel employed by the laboratory;
f) Ensuring that all sample acceptance criteria (Section 5.11) are verified and that samples are
logged into the sample tracking system and properly labeled and stored; and
g) Documenting the quality of all data reported by the laboratory.
5.6.3 Records
Records on the relevant qualifications, training, skills and experience of the technical personnel shall
be maintained by the laboratory [see 5.6.2.c)], including records on demonstrated proficiency for
each laboratory test method, such as the criteria outlined in 5.10.2.1 for chemical testing.
5.7 PHYSICAL FACILITIES - ACCOMMODATION AND ENVIRONMENT
5.7.1 Environment
a) Laboratory accommodation, test areas, energy sources, lighting, heating and ventilation shall
be such as to facilitate proper performance of tests.
b) The environment in which these activities are undertaken shall not invalidate the results or
adversely affect the required accuracy of measurement. Particular care shall be taken when
such activities are undertaken at sites other than the permanent laboratory premises.
c) The laboratory shall provide for the effective monitoring, control and recording of environmental
conditions as appropriate. Such environmental conditions may include biological sterility, dust,
electromagnetic interference, humidity, mains voltage, temperature, and sound and vibration
levels.
d) In instances where monitoring or control of any of the above mentioned items are specified in
a test method or by regulation, the laboratory shall meet and document adherence to the
laboratory facility requirements.
NOTE - It is the laboratory's responsibility to comply with the relevant health and safety
requirements. This aspect, however, is outside the scope of this Standard.
5.7.2 Work Areas
a) There shall be effective separation between neighboring areas when the activities therein are incompatible, for example culture handling or incubation areas and volatile organic chemical handling areas.
b) Access to and use of all areas affecting the quality of these activities shall be defined and
controlled.
c) Adequate measures shall be taken to ensure good housekeeping in the laboratory and to ensure
that any contamination does not adversely affect data quality.
d) Work spaces must be available to ensure an unencumbered work area. Work areas include:
1) access and entryways to the laboratory;
2) sample receipt area(s);
3) sample storage area(s);
4) chemical and waste storage area(s); and
5) data handling and storage area(s).
5.8 EQUIPMENT AND REFERENCE MATERIALS
a) The laboratory shall be furnished with all items of equipment (including reference materials)
required for the correct performance of tests for which accreditation is sought. In those cases
where the laboratory needs to use equipment outside its permanent control it shall ensure that
the relevant requirements of this Standard are met.
b) All equipment shall be properly maintained, inspected and cleaned. Maintenance procedures
shall be documented.
c) Any item of equipment which has been subjected to overloading or mishandling, or which
gives suspect results, or has been shown by verification or otherwise to be defective, shall be
taken out of service, clearly identified and wherever possible stored at a specified place until it
has been repaired and shown by calibration, verification or test to perform satisfactorily. The
laboratory shall examine the effect of this defect on previous calibrations or tests.
d) Each item of equipment including reference materials shall, when appropriate, be labeled,
marked or otherwise identified to indicate its calibration status.
e) Records shall be maintained of each major item of equipment and all reference materials
significant to the tests performed. These records shall include documentation on all routine and
non-routine maintenance activities and reference material verifications.
The records shall include:
1) the name of the item of equipment;
2) the manufacturer's name, type identification, and serial number or other unique
identification;
3) date received and date placed in service (if available);
4) current location, where appropriate;
5) if available, condition when received (e.g. new, used, reconditioned);
6) copy of the manufacturer's instructions, where available;
7) dates and results of calibrations and/or verifications and date of the next calibration and/or
verification;
8) details of maintenance carried out to date and planned for the future; and
9) history of any damage, malfunction, modification or repair.
5.9 MEASUREMENT TRACEABILITY AND CALIBRATION
5.9.1 General Requirements
All measuring operations and testing equipment having an effect on the accuracy or validity of tests
shall be calibrated and/or verified before being put into service and on a continuing basis. The
laboratory shall have an established program for the calibration and verification of its measuring and
test equipment. This includes balances, thermometers and control standards.
5.9.2 Traceability of Calibration
a) The overall program of calibration and/or verification and validation of equipment shall be
designed and operated so as to ensure that, wherever applicable, measurements made by the
laboratory are traceable to national standards of measurement where available.
b) Calibration certificates shall when available indicate the traceability to national standards of
measurement and shall provide the measurement results and associated uncertainty of
measurement and/or a statement of compliance with an identified metrological specification.
The laboratory shall maintain records of all such certifications.
c) Where traceability to national standards of measurement is not applicable, the laboratory shall
provide satisfactory evidence of correlation of results, for example by participation in a suitable
program of interlaboratory comparisons, proficiency testing, or independent analysis.
5.9.3 Reference Standards
a) Reference standards of measurement held by the laboratory (such as Class S or equivalent
weights or traceable thermometers) shall be used for calibration only and for no other purpose,
unless it can be demonstrated that their performance as reference standards have not been
invalidated. Reference standards of measurement shall be calibrated by a body that can
provide, where possible, traceability to a national standard of measurement.
b) There shall be a program of calibration and verification for reference standards.
c) Where relevant, reference standards and measuring and testing equipment shall be subjected
to in-service checks between calibrations and verifications. Reference materials shall, where
possible, be traceable to national or international standards of measurement, or to national or
international standard reference materials.
5.9.4 Calibration
5.9.4.1 General Requirements
a) Each calibration shall be dated and labeled with or traceable to the test method, instrument,
analysis date, and each analyte name, concentration and response (or response factor).
b) Sufficient information shall be recorded to permit reconstruction of the calibration.
c) Criteria for the acceptance of a calibration procedure, such as calibration curves and concentration (titer) determinations of titrants, shall be established. If applicable, the method-specified criteria shall be met.
5.9.4.2 Acceptance Criteria for Support Equipment
5.9.4.2.1 Analytical Support Equipment
These standards apply to all devices that may not be the actual test instrument, but are necessary
to support laboratory operations. These include but are not limited to: balances, ovens,
refrigerators, freezers, incubators, water baths, temperature measuring devices (including
thermometers and thermistors) and volumetric dispensing devices (such as Eppendorf®, or
automatic dilutor/dispensing devices) if quantitative results are dependent on their accuracy, as in
standard preparation and dispensing or dilution into a specified volume. All support equipment shall
be:
a) maintained in proper working order. The records of all activities including service calls shall be
kept.
b) calibrated or verified at least annually, using NIST traceable references when available, over
the entire range of use. The results of such calibration shall be within the specifications required of the application for which the equipment is used, or:
1) The equipment shall be removed from service until repaired; or
2) The laboratory shall prepare a deviation curve and correct all measurements for the
deviation. All measurements shall be recorded and maintained.
c) Prior to use on each working day, balances, ovens, refrigerators, freezers, incubators and water
baths shall be checked with NIST traceable references (where possible) in the expected use
range. Additional monitoring as prescribed by the test method shall be performed for any device
that is used in a critical test (such as incubators or water baths). The acceptability for use or
continued use shall be according to the needs of the analysis or application for which the equipment is being used.
d) Mechanical volumetric dispensing devices (except Class A glassware) shall be checked for
accuracy on a monthly use basis.
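Where item b)2) above permits a deviation curve in lieu of removing equipment from service, the documented corrections are applied to every subsequent reading. The following is a minimal illustrative sketch in Python and is not part of this Standard; the verification points shown are hypothetical values from an annual NIST-traceable check.

# Illustrative only: applying a deviation (correction) curve to support-equipment
# readings, per 5.9.4.2.1.b.2. The verification points below are hypothetical.
from bisect import bisect_left

# (indicated reading, correction to add) pairs from the annual verification
DEVIATION_CURVE = [(0.0, 0.3), (10.0, 0.2), (25.0, 0.0), (40.0, -0.2)]

def corrected_reading(indicated):
    """Linearly interpolate the correction and apply it to the indicated value."""
    xs = [x for x, _ in DEVIATION_CURVE]
    cs = [c for _, c in DEVIATION_CURVE]
    if indicated <= xs[0]:
        return indicated + cs[0]
    if indicated >= xs[-1]:
        return indicated + cs[-1]
    i = bisect_left(xs, indicated)
    frac = (indicated - xs[i - 1]) / (xs[i] - xs[i - 1])
    return indicated + cs[i - 1] + frac * (cs[i] - cs[i - 1])

print(round(corrected_reading(20.0), 2))  # indicated 20.0 plus the interpolated correction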
5.9.4.2.2 Autoclaves
The sterilization temperature and pressure of each run must be documented by the use of
appropriate chemical or biological sterilization indicators. Autoclave tape may be used to indicate
by color change that a load has been processed, but not to demonstrate completion of an acceptable
sterilization cycle. Demonstration of sterilization may be provided by a continuous temperature
recording or with the use of spore strips.
5.9.4.3 Instrument Calibrations
a) When available, all initial calibrations shall be verified with a standard obtained from a second
or different source. This verification standard shall be analyzed with each initial calibration and
shall be within 15% of the true value unless the laboratory can demonstrate through historical
data that wider limits are applicable.
b) Calibration curves shall be prepared as specified in the test method. If a test method does not
provide guidance in the preparation of a calibration curve, the laboratory shall establish the
appropriate number of standards for use in the initial calibration using the following:
1) Determine the percent relative standard deviation (%RSD) by:
i. Taking at least seven replicate measurements of a standard with a concentration
approaching the lowest quantitation level or;
ii. Performing a calibration linearity test (such as response factor or calibration factor) on
at least 3 standards having concentrations that cover the expected calibration range.
2) The minimum number of standards to be used in the initial calibration is dependent on the
resulting %RSD:
%RSD Number of Calibration Points
0-<2 1**
2-<10 3
10-<25 5
>25 7
** Assumes linearity through the origin (0.0). For analytes for which there is no origin
(such as pH), a two point calibration curve shall be used.
3) If the resulting curve is non-linear, additional standards shall be used.
4) The number of standards as determined from the above table and a blank shall be used for
the initial calibration of the test method.
c) In addition to the verification by second-source standards [see a) above], the calibration curve
shall be subjected to a calibration linearity test, such as a linear regression or percent RSD of
response factors (internal standard calibration) or calibration factors (external standard
calibration).
1) If, over the calibration range, the RSD of response factors is less than 15 percent, or the
RSD of calibration factors is less than 30 percent, linearity through the origin can be
assumed and an average relative response factor may be used; otherwise, the complete
calibration curve shall be used.
2) If a linear regression is used, the correlation coefficient (R) shall be no less than 0.995
unless the laboratory can demonstrate that a lowered correlation coefficient consistently
produces accurate results.
d) For results to be reported as quantitative [i.e., those greater than 3.18 times the Method Detection Limit (MDL)], they must be bracketed by calibration or calibration verification standards. All other results must be reported as having a lower confidence level.
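As an aid to interpreting items b) and c) above, the following is a minimal illustrative sketch in Python and is not part of this Standard; the replicate and calibration data are hypothetical, while the %RSD table, the 15 percent response-factor limit, and the 0.995 correlation coefficient limit are taken from this section.

# Illustrative only: selecting the minimum number of calibration standards from the
# %RSD table in 5.9.4.3.b and applying the linearity checks in 5.9.4.3.c.
from statistics import mean, stdev

def percent_rsd(values):
    """%RSD = 100 * (sample standard deviation / mean)."""
    return 100.0 * stdev(values) / mean(values)

def minimum_calibration_points(rsd):
    """Minimum number of standards per the table (a blank is used in addition)."""
    if rsd < 2:
        return 1   # assumes linearity through the origin; two points if no origin (e.g., pH)
    if rsd < 10:
        return 3
    if rsd < 25:
        return 5
    return 7

def correlation_coefficient(x, y):
    """Correlation coefficient (R) of the calibration points."""
    mx, my = mean(x), mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Seven replicate measurements near the lowest quantitation level (hypothetical data).
replicates = [1.02, 0.98, 1.05, 0.97, 1.01, 1.03, 0.99]
print(minimum_calibration_points(percent_rsd(replicates)))

# Hypothetical initial calibration: standard concentrations and instrument responses.
conc = [1.0, 5.0, 10.0, 25.0, 50.0]
resp = [105.0, 515.0, 1020.0, 2540.0, 5080.0]
response_factors = [r / c for c, r in zip(conc, resp)]

if percent_rsd(response_factors) < 15.0:        # internal standard calibration limit
    print("average response factor may be used")
elif correlation_coefficient(conc, resp) >= 0.995:
    print("linear regression is acceptable")
else:
    print("use the complete calibration curve or recalibrate")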
5.9.4.4 Calibration Verification
When not included in the analytical test method, the value of the analyte(s) in the following
calibration verification standards shall be within 15% of the true value unless the laboratory can
demonstrate through historical data that wider limits are applicable.
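This acceptance test is a simple percent-difference comparison. The following is a minimal illustrative sketch in Python and is not part of this Standard; the 15% default limit comes from this section, and a wider, historically justified limit would replace it where documented.

# Illustrative only: calibration verification acceptance per 5.9.4.4.
def verification_acceptable(measured, true_value, limit_percent=15.0):
    """True if the measured value is within limit_percent of the true value."""
    percent_difference = 100.0 * abs(measured - true_value) / true_value
    return percent_difference <= limit_percent

print(verification_acceptable(measured=9.2, true_value=10.0))   # True  (8% difference)
print(verification_acceptable(measured=11.8, true_value=10.0))  # False (18% difference)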
5.9.4.4.1 Initial Calibration Verification
a) When an initial calibration curve is not established on the day of analysis, the integrity of the
initial calibration curve shall be verified on each day of use (or 24 hour period) by initially
analyzing a blank and a standard at the method defined concentration or a mid-level
concentration if not included in the test method.
b) If the initial calibration verification fails, the analysis procedure shall be stopped and evaluated.
For example, a second standard may be analyzed and evaluated or a new initial calibration
curve may be established and verified. In all cases, the initial calibration verification must be
acceptable before analyzing any samples.
5.9.4.4.2 Continuing Calibration Verification
Additional standards shall be analyzed after the initial calibration curve or the integrity of the initial
calibration curve (see 5.9.4.3.a or 5.9.4.4.1 above) has been accepted.
a) These standards shall be analyzed at a frequency of 5% or every 12 hours whichever is more
frequent and may be the standards used in the original calibration curve or standards from
another source. The frequency shall be increased if the instrument consistently drifts outside
acceptance criteria before the next calibration.
b) The concentration of these standards shall be determined by the anticipated or known
concentration of the samples and/or method specified levels. At least one standard shall be at
a low level concentration. To the extent possible, the samples in each interval (i.e. every 20
samples or every 12 hours) should be bracketed with standard concentrations closely
representing the lower and upper range of reported sample concentrations. If this is not
possible, the standard calibration checks should vary in concentration throughout the range of
the data being acquired.
c) If a calibration check standard fails, and routine corrective action procedures fail to produce a
second consecutive calibration check within acceptance criteria, a new initial calibration curve
shall be constructed. When the continuing calibration [check] acceptance criteria are exceeded
high (i.e., high bias), and there are non-detects for the corresponding analyte in all
environmental samples associated with the continuing calibration check, then those non-detects may be reported; otherwise, the samples affected by the unacceptable check shall be reanalyzed
after a new calibration curve has been established, evaluated and accepted. Additional sample
analysis shall not occur until a new calibration curve is established and verified.
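The reporting decision in item c) above reduces to a small amount of logic. The following is a minimal illustrative sketch in Python and is not part of this Standard; how bias direction and detect status are determined would depend on the laboratory's own data system.

# Illustrative only: disposition of samples after a failed continuing calibration
# check, per 5.9.4.4.2.c.
def disposition_after_failed_check(high_bias, all_associated_results_nondetect):
    """Return the required action for samples associated with the failed check."""
    if high_bias and all_associated_results_nondetect:
        # Non-detects may be reported when the check fails high.
        return "report non-detects"
    return "reanalyze after a new calibration curve is established, evaluated and accepted"

print(disposition_after_failed_check(high_bias=True, all_associated_results_nondetect=True))
print(disposition_after_failed_check(high_bias=False, all_associated_results_nondetect=True))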
5.10 TEST METHODS AND STANDARD OPERATING PROCEDURES
5.10.1 Methods Documentation
a) The laboratory shall have documented instructions on the use and operation of all relevant
equipment, on the handling and preparation of samples and for calibration and/or testing, where
the absence of such instructions could jeopardize the calibrations or tests.
b) All instructions, standards, manuals and reference data relevant to the work of the laboratory
shall be maintained up-to-date and be readily available to the staff.
5.10.1.1 Standard Operating Procedures (SOPs)
Laboratories shall maintain standard operating procedures that accurately reflect all phases of
current laboratory activities such as assessing data integrity, corrective actions, handling customer
complaints, and all test methods.
a) These documents, for example, may be equipment manuals provided by the manufacturer, or
internally written documents.
b) The test methods may be copies of published methods as long as any changes in the methods
are documented and included in the methods manual (see 5.10.1.2).
c) Copies of all SOPs shall be accessible to all personnel.
d) The SOPs shall be organized.
e) Each SOP shall clearly indicate the effective date of the document, the revision number and the
signature(s) of the approving authority.
5.10.1.2 Laboratory Method Manual(s)
a) The laboratory shall have and maintain an in-house methods manual(s) for each accredited
analyte or test method.
b) This manual may consist of copies of published or referenced test methods or standard
operating procedures that have been written by the laboratory. In cases where modifications to
the published method have been made by the laboratory or where the referenced test method
is ambiguous or provides insufficient detail, these changes or clarifications shall be clearly
described. Each test method shall include or reference where applicable:
1) identification of the test method;
2) applicable matrix or matrices;
3) method detection limit;
4) scope and application, including components to be analyzed;
5) summary of the test method;
6) definitions;
7) interferences;
8) safety;
9) equipment and supplies;
10) reagents and standards;
11) sample collection, preservation, shipment and storage;
12) quality control;
13) calibration and standardization;
14) procedure;
15) calculations;
16) method performance;
17) pollution prevention;
18) data assessment and acceptance criteria for quality control measures;
19) corrective actions for out-of-control data;
20) contingencies for handling out-of-control or unacceptable data;
21) waste management;
22) references; and
23) any tables, diagrams, flowcharts and validation data.
5.10.2 Test Methods
a) The laboratory shall use appropriate test methods and procedures for all tests and related
activities within its responsibility (including sample collection, sample handling, transport and
storage, sample preparation and sample analysis). The methods and procedures shall be consistent with the accuracy required, and with any standard specifications relevant to the
calibrations or tests concerned.
1) When the use of specific test methods for a sample analysis is mandated or requested, only those methods shall be used.
2) Where test methods are employed that are not required, as in the Performance Based
Measurement System approach, the methods shall be fully documented and validated (see
5.10.2.1), and be available to the client and other recipients of the relevant reports.
5.10.2.1 Method Validation/Initial Demonstration of Capability
a) Prior to acceptance and institution of any test method, satisfactory initial demonstration of
method performance is required.
1) The laboratory's use of mandated test methods [see 5.10.2.a)1] or EPA reference test
methods, shall follow the protocols outlined in Appendix C of this document.
2) All other test methods (including Performance Based Measurements Systems) shall follow
the protocols outlined in Appendix E of this document.
3) Exceptions to these requirements are microbiology and tests for which spiking solutions are
not available, for example, total suspended solids, total dissolved solids, total volatile solids,
total solids, pH, color, odor, temperature, dissolved oxygen or turbidity.
b) Thereafter, continuing demonstration of method performance (such as laboratory control
samples) is required.
c) In all cases, the appropriate forms such as the Certification Statement (Appendix C) or standard
performance checklists (see Appendix E) must be completed and retained by the laboratory to
be made available upon request. All associated supporting data necessary to reproduce the
analytical results summarized in the checklists must be retained by the laboratory.
d) Initial demonstration of method performance must be completed each time there is a significant
change in instrument type, personnel, or test method.
5.10.3 Sample Aliquots
Where sampling (as in obtaining sample aliquots from a submitted sample) is carried out as part of
the test method, the laboratory shall use documented procedures and appropriate techniques to
obtain representative subsamples.
5.10.4 Data Verification
Calculations and data transfers shall be subject to appropriate checks.
a) The laboratory shall establish Standard Operating Procedures to ensure that the reported data are free from transcription and calculation errors.
b) The laboratory shall establish Standard Operating Procedures to ensure that all quality control measures are reviewed and evaluated before data are reported.
5.10.5 Documentation and Labeling of Standards and Reagents
Documented procedures shall exist for the purchase, reception and storage of consumable materials
used for the technical operations of the laboratory.
a) The laboratory shall retain records for all standards including the manufacturer/vendor, the
manufacturer's Certificate of Analysis or purity (if supplied), the date of receipt, recommended
storage conditions, and an expiration date after which the material shall not be used unless it
is verified by the laboratory.
b) Original containers (such as provided by the manufacturer or vendor) shall be labeled with an
expiration date.
c) Detailed records shall be maintained on reagent and standard preparation. These records shall
indicate traceability to purchased stocks or neat compounds, reference to the method of
preparation, date of preparation, expiration date and preparer's initials.
d) All containers of prepared reagents and standards must bear a unique identifier and expiration
date and be linked to the documentation requirements in 5.10.5.c) above.
5.10.6 Computers and Electronic Data Related Requirements
Where computers or automated equipment are used for the capture, processing, manipulation,
recording, reporting, storage or retrieval of test data, the laboratory shall ensure that:
a) all requirements of this Standard (i.e., Chapter 5) are complied with. Sections 8.1 through 8.11 of the EPA document "2185 - Good Automated Laboratory Practices" (1995) shall be adopted as the standard for all laboratories employing microprocessors and computers.
b) computer software is documented and adequate for use;
c) procedures are established and implemented for protecting the integrity of data; such
procedures shall include, but not be limited to, integrity of data entry or capture, data storage,
data transmission and data processing;
d) computer and automated equipment are maintained to ensure proper functioning and provided
with the environmental and operating conditions necessary to maintain the integrity of calibration
and test data;
e) it establishes and implements appropriate procedures for the maintenance of security of data
including the prevention of unauthorized access to, and the unauthorized amendment of,
computer records.
5.11 SAMPLE HANDLING, SAMPLE ACCEPTANCE POLICY AND SAMPLE RECEIPT
While the laboratory may not have control of field sampling activities, the following are essential to
ensure the validity of the laboratory's data.
5.11.1 Sample Tracking
a) The laboratory shall have a documented system for uniquely identifying the items to be tested,
to ensure that there can be no confusion regarding the identity of such items at any time. This
system shall include identification for all samples, subsamples and subsequent extracts and/or
digestates. The laboratory shall assign a unique identification (ID) code to each sample
container received in the laboratory. The use of container shape, size or other physical
characteristic, such as amber glass or purple top, is not an acceptable means of identifying the
sample.
b) This laboratory code shall maintain an unequivocal link with the unique field ID code assigned
each container.
c) The laboratory ID code shall be placed on the sample container as a durable label.
d) The laboratory ID code shall be entered into the laboratory records (see 5.11.3.d) and shall be
the link that associates the sample with related laboratory activities such as sample preparation
or calibration.
e) In cases where the sample collector and analyst are the same individual or the laboratory
preassigns numbers to sample containers, the laboratory ID code may be the same as the field
ID code.
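One way to realize the required unequivocal link between the laboratory ID code and the field ID code is a simple keyed record. The following is a minimal illustrative sketch in Python and is not part of this Standard; the fields shown are drawn from this section and from the receipt log requirements in 5.11.3.d, and the values are hypothetical.

# Illustrative only: a minimal sample-tracking record keyed by the laboratory ID code.
from dataclasses import dataclass

@dataclass
class SampleRecord:
    lab_id: str              # unique laboratory ID code on the container label
    field_id: str            # unique field ID code assigned to the container
    client_project: str
    collected: str           # date and time of sample collection
    received: str            # date and time of laboratory receipt
    requested_analyses: tuple
    logged_by: str           # signature or initials of the person making the entry

# The laboratory ID code is the key that links the sample to all related activities.
receipt_log = {}
record = SampleRecord("L98-00123-01", "FLD-0456", "Example Project",
                      "1998-07-02 09:15", "1998-07-02 14:40",
                      ("EPA 524.2",), "ABC")
receipt_log[record.lab_id] = record
print(receipt_log["L98-00123-01"].field_id)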
5.11.2 Sample Acceptance Policy
The laboratory shall have a written sample acceptance policy that clearly outlines the circumstances
under which samples will be accepted. Data from any samples which do not meet the following
criteria must be flagged in an unambiguous manner clearly defining the nature and substance of the
variation. This sample acceptance policy shall be made available to sample collection personnel
and shall include, but is not limited to, the following areas of concern:
a) Proper, full, and complete documentation, which shall include sample identification, the location,
date and time of collection, collector's name, preservation type, sample type and any special
remarks concerning the sample;
b) Proper sample labeling to include unique identification and a labeling system for the samples
with requirements concerning the durability of the labels (water resistant) and the use of indelible
ink;
c) Use of appropriate sample containers;
d) Adherence to specified holding times;
e) Adequate sample volume. Sufficient sample volume must be available to perform the
necessary tests; and
f) Procedures to be used for samples which show signs of damage or contamination.
5.11.3 Sample Receipt Protocols
a) Upon receipt, the condition of the sample, including any abnormalities or departures from
standard condition as prescribed in the relevant test method, shall be recorded. All items
specified in 5.11.2 above shall be checked.
1) All samples which require thermal preservation shall be considered acceptable if the arrival temperature is either within +/- 2 °C of the required temperature or within the method-specified range. For samples with a specified temperature of 4 °C, samples with a temperature ranging from just above the freezing temperature of water to 6 °C shall be acceptable. Samples that are hand delivered to the laboratory immediately after collection may not meet this criterion. In these cases, the samples shall be considered acceptable if there is evidence that the chilling process has begun, such as arrival on ice.
2) The laboratory shall implement procedures for checking chemical preservation using readily
available techniques, such as pH, free chlorine or temperature, prior to or during sample
preparation or analysis.
b) The results of all checks shall be recorded.
c) Where there is any doubt as to the item's suitability for testing, where the sample does not
conform to the description provided, or where the test required is not fully specified, the
laboratory should consult the client for further instruction before proceeding. The laboratory
shall establish whether the sample has received all necessary preparation, or whether the client
requires preparation to be undertaken or arranged by the laboratory. If the sample does not
meet the sample receipt acceptance criteria listed in 5.11.3.a, 5.11.3.b or 5.11.3.c, the
laboratory shall either:
1) Retain correspondence and/or records of conversations concerning the final disposition of
rejected samples; or
2) Fully document any decision to proceed with the analysis of samples not meeting
acceptance criteria.
i. The condition of these samples shall, at a minimum, be noted on the chain of custody
or transmittal form and laboratory receipt documents.
ii. The analysis data shall be appropriately "qualified" on the final report.
d) The laboratory shall utilize a permanent chronological record such as a log book or electronic
database to document receipt of all sample containers.
1) This sample receipt log shall record the following:
i. Client/Project Name
ii. Date and time of laboratory receipt
iii. Unique laboratory ID code (see 5.11.1)
iv. Signature or initials of person making the entries.
2) During the log in process, the following information must be unequivocally linked to the log
record or included as a part of the log. If such information is recorded/documented
elsewhere, the records shall be part of the laboratory's permanent records, easily retrievable
upon request and readily available to individuals who will process the sample. Note: the
placement of the laboratory ID number on the sample container is not considered a
permanent record.
i. The field ID code which identifies each container must be linked to the laboratory ID
code in the sample receipt log.
ii. The date and time of sample collection must be linked to the sample container and to
the date and time of receipt in the laboratory.
iii. The requested analyses (including applicable approved test method numbers) must be
linked to the laboratory ID code.
iv. Any comments resulting from inspection for sample rejection shall be linked to the
laboratory ID code.
e) All documentation, such as memos or transmittal forms, that is transmitted to the laboratory by
the sample transmitter shall be retained.
f) A complete chain of custody record (Section 5.12.4), if utilized, shall be maintained.
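The thermal-preservation criterion in item a)1) of this section can be expressed as a simple acceptance test. The following is a minimal illustrative sketch in Python and is not part of this Standard; the +/- 2 °C window, the just-above-freezing to 6 °C range for a 4 °C specification, and the evidence-of-chilling allowance for hand-delivered samples are taken from 5.11.3.a.1.

# Illustrative only: sample receipt thermal-preservation check per 5.11.3.a.1.
def thermal_preservation_acceptable(arrival_temp_c, required_temp_c,
                                    hand_delivered_on_ice=False):
    """Return True if the arrival temperature meets the acceptance criteria."""
    if required_temp_c == 4.0:
        # For a 4 degree C specification: above freezing up to 6 degrees C.
        if 0.0 < arrival_temp_c <= 6.0:
            return True
    elif abs(arrival_temp_c - required_temp_c) <= 2.0:
        return True
    # Hand-delivered samples are acceptable if there is evidence that the chilling
    # process has begun, such as arrival on ice.
    return hand_delivered_on_ice

print(thermal_preservation_acceptable(5.5, 4.0))    # True
print(thermal_preservation_acceptable(12.0, 4.0))   # False
print(thermal_preservation_acceptable(15.0, 4.0, hand_delivered_on_ice=True))  # True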
5.11.4 Storage Conditions
The laboratory shall have documented procedures and appropriate facilities to avoid deterioration,
contamination, or damage to the sample during storage, handling, preparation, and testing; any
relevant instructions provided with the item shall be followed. Where items have to be stored or
conditioned under specific environmental conditions, these conditions shall be maintained,
monitored and recorded where necessary.
a) Samples shall be stored according to the conditions specified by preservation protocols:
1) Samples which require thermal preservation shall be stored under refrigeration that is within +/- 2 °C of the specified preservation temperature unless method-specific criteria exist. For samples with a specified storage temperature of 4 °C, storage at a temperature above the freezing point of water to 6 °C shall be acceptable.
2) Samples shall be stored away from all standards, reagents, food and other potentially
contaminating sources. Samples shall be stored in such a manner to prevent cross
contamination.
b) Sample fractions, extracts, leachates and other sample preparation products shall be stored
according to 5.11.4.a above or according to specifications in the test method.
c) Where a sample or portion of the sample is to be held secure (for example, for reasons of
record, safety or value, or to enable check calibrations or tests to be performed later), the
laboratory shall have storage and security arrangements that protect the condition and integrity
of the secured items or portions concerned.
5.11.5 Sample Disposal
The laboratory shall have standard operating procedures for the disposal of samples, digestates,
leachates and extracts or other sample preparation products.
5.12 RECORDS
The laboratory shall maintain a record system to suit its particular circumstances and comply with
any applicable regulations. The system shall produce unequivocal, accurate records which
document all laboratory activities. The laboratory shall retain on record all original observations,
calculations and derived data, calibration records and a copy of the test report for an appropriate
period.
There are two levels of record keeping: 1) sample custody or tracking and 2) legal or evidentiary
chain of custody. All essential requirements for sample custody are outlined in Sections 5.12.1,
5.12.2 and 5.12.3. The basic requirements for legal chain of custody (if required or implemented)
are specified in Section 5.12.4.
5.12.1 Record Keeping System and Design
The record keeping system must allow historical reconstruction of all laboratory activities that
produced the resultant sample analytical data. The history of the sample must be readily understood
through the documentation. This shall include interlaboratory transfers of samples and/or extracts.
a) The records shall include the identity of personnel involved in sampling, preparation, calibration
or testing.
b) All information relating to the laboratory facilities, equipment, analytical test methods, and related
laboratory activities, such as sample receipt, sample preparation, or data verification, shall be
documented.
c) The record keeping system shall facilitate the retrieval of all working files and archived records
for inspection and verification purposes.
d) All documentation entries shall be signed or initialed by responsible staff. The reason for the
signature or initials shall be clearly indicated in the records (such as "sampled by", "prepared by",
or "reviewed by").
e) All generated data, except those generated by automated data collection systems, shall
be recorded directly, promptly and legibly in permanent ink.
f) Entries in records shall not be obliterated by methods such as erasures, overwritten files or
markings. All corrections to record-keeping errors shall be made by one line marked through
the error. The individual making the correction shall sign (or initial) and date the correction.
These criteria also shall apply to electronically maintained records.
g) Refer to 5.10.6 for Computer and Electronic Data.
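For electronically maintained records, one way (not prescribed by this Standard) to honor 5.12.1.f
without obliterating the original entry is to record corrections as append-only entries that preserve
the original value; the structure below is a hypothetical sketch.

    from dataclasses import dataclass
    from datetime import date

    @dataclass(frozen=True)
    class RecordCorrection:
        """Append-only correction entry: the electronic analogue of a single line
        drawn through the error, initialed and dated (5.12.1.f)."""
        record_id: str        # identifies the record being corrected
        original_value: str   # preserved, never overwritten or deleted
        corrected_value: str
        corrected_by: str     # initials or electronic signature
        corrected_on: date
        reason: str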
5.12.2 Records Management and Storage
a) All records (including those pertaining to calibration and test equipment), certificates and reports
shall be safely stored, held secure and in confidence to the client. NELAP-related records shall
be available to the accrediting authority.
b) All records, including those specified in 5.12.3 and 5.12.4, shall be retained for a minimum of
five years. All information necessary for the historical reconstruction of data must be maintained
by the laboratory. Records which are stored only on electronic media must be supported by the
hardware and software necessary for their retrieval.
c) Records that are stored or generated by computers or personal computers (PCs) shall have
hard copy or write-protected backup copies.
d) The laboratory shall establish a record management system for control of laboratory notebooks;
instrument logbooks; standards logbooks; and records for data reduction, validation, storage and
reporting.
e) Access to archived information shall be documented with an access log. These records shall
be protected against fire, theft, loss, environmental deterioration, vermin and, in the case of
electronic records, electronic or magnetic sources.
f) The laboratory shall have a plan to ensure that the records are maintained or transferred
according to the clients' instructions (see 4.1.8.e) in the event that a laboratory transfers
ownership or goes out of business.
5.12.3 Laboratory Sample Tracking
5.12.3.1 Sample Handling
A record of all procedures to which a sample is subjected while in the possession of the laboratory
shall be maintained. These shall include but are not limited to all records pertaining to:
a) Sample preservation including appropriateness of sample container and compliance with holding
time requirement;
b) Sample identification, receipt, acceptance or rejection and log-in;
c) Sample storage and tracking including shipping receipts, transmittal forms, and internal routing
and assignment records;
d) Sample preparation including cleanup and separation protocols, ID codes, volumes, weights,
instrument printouts, meter readings, calculations, reagents;
e) Sample analysis;
f) Standard and reagent origin, receipt, preparation, and use;
g) Equipment receipt, use, specification, operating conditions and preventative maintenance;
h) Calibration criteria, frequency and acceptance criteria;
i) Data and statistical calculations, review, confirmation, interpretation, assessment and reporting
conventions;
j) Method performance criteria including expected quality control requirements;
k) Quality control protocols and assessment;
l) Electronic data security, software documentation and verification, software and hardware audits,
backups, and records of any changes to automated data entries;
m) All automated sample handling systems; and
n) Disposal of hazardous samples including the date of sample or subsample disposal and name
of the responsible person.
5.12.3.2 Laboratory Support Activities
In addition to documenting all the above-mentioned activities, the following shall be retained:
a) All original raw data, whether hard copy or electronic, for calibrations, samples and quality
control measures, including analysts' worksheets and data output records (chromatograms, strip
charts, and other instrument response readout records);
b) A written description or reference to the specific test method used which includes a description
of the specific computational steps used to translate parametric observations into a reportable
analytical value;
c) Copies of final reports;
d) Archived standard operating procedures;
e) Correspondence relating to laboratory activities for a specific project;
f) All corrective action reports, audits and audit responses;
g) Proficiency test results and raw data; and
h) Data review and cross checking.
5.12.3.3 Analytical Records
The essential information to be associated with analysis, such as strip charts, tabular printouts,
computer data files, analytical notebooks, and run logs, shall include:
a) Laboratory sample ID code;
b) Date of analysis;
c) Instrumentation identification and instrument operating conditions/parameters (or reference to
such data);
d) Analysis type;
e) All manual calculations; and
f) Analyst's or operator's initials/signature.
5.12.3.4 Administrative Records
The following shall be maintained:
a) Personnel qualifications, experience and training records;
b) Initial and continuing demonstration of proficiency for each analyst; and
c) A log of names, initials and signatures for all individuals who are responsible for signing or
initialing any laboratory record.
5.12.4 Legal or Evidentiary Custody
The use of legal chain of custody (COC) protocols is strongly recommended and may be required
by some state or federal programs. In addition to the records listed in 5.12.3 and the performance
standards outlined in 5.12.1 and 5.12.2, the following protocols shall be incorporated if legal COC
is implemented by the organization.
5.12.4.1 Basic Requirements
The legal chain of custody records shall establish an intact, continuous record of the physical
possession, storage and disposal of sample containers, collected samples, sample aliquots, and
sample extracts or digestates. For ease of discussion, the above-mentioned items shall be referred
to as samples:
a) A sample is in someone's custody if:
1) It is in one's actual physical possession;
2) It is in one's view, after being in one's physical possession;
3) It is in one's physical possession and then locked up so that no one can tamper with it;
4) It is kept in a secured area, restricted to authorized personnel only.
b) The COC records shall account for all time periods associated with the samples.
c) The COC records shall identify all individuals who physically handled individual samples.
d) In order to simplify record-keeping, the number of people who physically handle the sample
should be minimized. A designated sample custodian, who is responsible for receiving, storing
and distributing samples is recommended.
e) The COC records are not limited to a single form or document. However, organizations should
attempt to limit the number of documents that would be required to establish COC.
f) Legal chain of custody shall begin at the point established by the federal or state oversight
program. This may begin at the point that cleaned sample containers are provided by the
laboratory or the time sample collection occurs.
g) The COC forms shall remain with the samples during transport or shipment.
h) If shipping containers and/or individual sample containers are submitted with sample custody
seals, and any seals are not intact, the lab shall note this on the chain of custody.
i) Mailed packages should be registered with return receipt requested. If packages are sent by
common carrier, receipts should be retained as part of the permanent chain-of-custody
documentation.
j) Once received by the laboratory, laboratory personnel are responsible for the care and custody
of the sample and must be prepared to testify that the sample was in their possession and view
or secured in the laboratory at all times from the moment it was received from the custodian
until the time that the analyses are completed or the sample is disposed.
5.12.4.2 Required Information in Custody Records
In addition to the information specified in 5.11.1.a and 5.11.1.b, tracking records shall include, by
direct entry or linkage to other records:
a) Time of day and calendar date of each transfer or handling procedure;
b) Signatures of all personnel who physically handle the sample(s);
c) All information necessary to produce unequivocal, accurate records that document the
laboratory activities associated with sample receipt, preparation, analysis and reporting; and
d) Common carrier documents.
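As an illustration of 5.12.4.1.b and 5.12.4.2 (a sketch under assumed names; none of the identifiers
below come from this Standard), custody transfers can be recorded as time-stamped entries and checked
for continuity:

    from dataclasses import dataclass
    from datetime import datetime
    from typing import Optional

    @dataclass
    class CustodyTransfer:
        """One chain-of-custody transfer entry (see 5.12.4.2 for required content)."""
        transferred_at: datetime                 # time of day and calendar date (a)
        relinquished_by: str                     # person handing off the sample (b)
        received_by: str                         # person taking custody (b)
        carrier_document: Optional[str] = None   # common carrier receipt, if shipped (d)

    def custody_is_continuous(transfers: list[CustodyTransfer]) -> bool:
        """Rough check for 5.12.4.1.b: in time order, the person who receives the
        sample at one transfer is the one who relinquishes it at the next."""
        ordered = sorted(transfers, key=lambda t: t.transferred_at)
        return all(a.received_by == b.relinquished_by for a, b in zip(ordered, ordered[1:]))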
5.12.4.3 Controlled Access to Samples
Access to all legal samples and subsamples shall be controlled and documented.
a) A clean, dry, isolated room, building, and/or refrigerated space that can be securely locked from
the outside must be designated as a custody room.
b) Where possible, distribution of samples to the analyst performing the analysis must be made
by the custodian(s).
c) The laboratory area must be maintained as a secured area, restricted to authorized personnel
only.
d) Once the sample analyses are completed, the unused portion of the sample, together with all
identifying labels, must be returned to the custodian. The returned tagged sample must be
retained in the custody room until permission to destroy the sample is received by the custodian
or other authority.
5.12.4.4 Transfer of Samples to Another Party
Transfer of samples, subsamples, digestates or extracts to another party is subject to all of the
requirements for legal chain of custody.
5.12.4.5 Sample Disposal
a) If the sample is part of litigation, disposal of the physical sample shall occur only with the
concurrence of the affected legal authority, sample data user and/or submitter of the sample.
b) All conditions of disposal and all correspondence between all parties concerning the final
disposition of the physical sample shall be recorded and retained.
c) Records shall indicate the date of disposal, the nature of disposal (such as sample depleted,
sample disposed in hazardous waste facility, or sample returned to client), and the name of the
individual who performed the task.
5.13 LABORATORY REPORT FORMAT AND CONTENTS
The results of each test, or series of tests, carried out by the laboratory shall be reported accurately,
clearly, unambiguously and objectively. The results shall normally be reported in a test report and
shall include all the information necessary for the interpretation of the test results and all information
required by the method used. Some regulatory reporting requirements or formats, such as monthly
operating reports, may not require all items listed below; however, the laboratory shall provide all
the required information to its client for use in preparing such regulatory reports. (An illustrative
checklist of the mandatory report elements appears at the end of this section.)
a) Except as discussed in 5.13.b), each report to an outside client shall include at least the
following information (those prefaced with "where relevant" are not mandatory):
1) a title, e.g., "Test Report", or "Test Certificate", "Certificate of Results" or "Laboratory
Results";
2) name and address of laboratory, and location where the test was carried out if different from
the address of the laboratory and phone number with name of contact person for questions;
3) unique identification of the certificate or report (such as serial number) and of each page,
and the total number of pages;
This requirement may be presented in several ways:
i. The total number of pages may be listed on the first page of the report as long as the
subsequent pages are identified by the unique report identification and consecutive
numbers, or
ii. Each page is identified with the unique report identification, the pages are identified as
a number of the total report pages (example: 3 of 10, or 1 of 20).
Other methods of identifying the pages in the report may be acceptable as long as it is clear
to the reader that discrete pages are associated with a specific report, and that the report
contains a specified number of pages.
4) name and address of client, where appropriate and project name if applicable;
5) description and unambiguous identification of the tested sample including the client
identification code;
6) identification of test results derived from any sample that did not meet NELAC sample
acceptance requirements such as improper container, holding time, or temperature;
7) date of receipt of sample, date and time of sample collection, date(s) of performance test,
and time of sample preparation and/or analysis if the required holding time for either activity
is less than or equal to 48 hours;
8) identification of the test method used, or unambiguous description of any non-standard
method used;
9) if the laboratory collected the sample, reference to sampling procedure;
10) any deviations from (such as failed quality control), additions to or exclusions from the test
method (such as environmental conditions), and any non-standard conditions that may have
affected the quality of results, and including the use and definitions of data qualifiers.
11) measurements, examinations and derived results, supported by tables, graphs, sketches
and photographs as appropriate, and any failures identified; identify whether data are
calculated on a dry weight or wet weight basis; identify the reporting units such as µg/L or
mg/kg; and for Whole Effluent Toxicity, identify the statistical package used to provide data.
12) when required, a statement of the estimated uncertainty of the test result;
13) a signature and title, or an equivalent electronic identification of the person(s) accepting
responsibility for the content of the certificate or report (however produced), and date of
issue;
14) at the laboratory's discretion, a statement to the effect that the results relate only to the
items tested or to the sample as received by the laboratory;
15) at the laboratory's discretion, a statement that the certificate or report shall not be
reproduced except in full, without the written approval of the laboratory;
16) clear identification of all test data provided by outside sources, such as subcontracted
laboratories, clients, etc; and
17) clear identification of numerical results with values below 3.18 times the MDL (10 standard
deviations as determined by the method detection limit study).
b) Laboratories that are operated by a facility and whose sole function is to provide data to the
facility management for compliance purposes (in-house or captive laboratories) shall have all
applicable information specified in 1 through 17 above readily available for review by the
accrediting authority. However, formal reports detailing the information are not required if:
1) The in-house laboratory is itself responsible for preparing the regulatory reports; or
2) The laboratory provides information to another individual within the organization for
preparation of regulatory reports. The facility management must ensure that the
appropriate report items are in the report to the regulatory authority if such information is
required.
c) Where the certificate or report contains results of tests performed by sub-contractors, these
results shall be clearly identified by subcontractor name or applicable accreditation number.
d) After issuance of the report, the laboratory report shall remain unchanged. Material
amendments to a calibration certificate, test report or test certificate after issue shall be made
only in the form of a further document, or data transfer including the statement "Supplement to
Test Report or Test Certificate, serial number... [or as otherwise identified]", or equivalent form
of wording. Such amendments shall meet all the relevant requirements of this Standard.
e) The laboratory shall notify clients promptly, in writing, of any event such as the identification of
defective measuring or test equipment that casts doubt on the validity of results given in any
calibration certificate, test report or test certificate or amendment to a report or certificate.
f) The laboratory shall ensure that, where clients require transmission of test results by telephone,
telex, facsimile or other electronic or electromagnetic means, staff will follow documented
procedures that ensure that the requirements of this Standard are met and that confidentiality
is preserved.
g) Laboratories accredited to be in compliance with these standards shall certify that the test results
meet all requirements of NELAC or provide reasons and/or justification if they do not.
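The mandatory elements of 5.13.a lend themselves to a simple completeness check before a report is
released. The sketch below covers only a subset of the items (conditional and discretionary items such
as 9, 12, 14 and 15 are omitted), and the field names are illustrative rather than taken from this
Standard.

    # Hypothetical mapping of report fields to the corresponding items of 5.13.a.
    REQUIRED_REPORT_FIELDS = (
        "title",                         # item 1
        "laboratory_name_and_address",   # item 2
        "report_id_and_page_numbering",  # item 3
        "client_name_and_address",       # item 4
        "sample_identification",         # item 5
        "acceptance_exceptions",         # item 6
        "receipt_and_collection_dates",  # item 7
        "test_method",                   # item 8
        "results_and_units",             # item 11
        "signature_title_and_date",      # item 13
    )

    def missing_report_fields(report: dict) -> list[str]:
        """Return the mandatory elements that are absent or empty in a report record."""
        return [name for name in REQUIRED_REPORT_FIELDS if not report.get(name)]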
5.14 SUBCONTRACTING ANALYTICAL SAMPLES
a) The laboratory shall advise the client in writing of its intention to sub-contract any portion of the
testing to another party.
b) Where a laboratory sub-contracts any part of the testing covered under NELAP, this work shall
be placed with a laboratory accredited under NELAP for the tests to be performed.
c) The laboratory shall retain records demonstrating that the above requirements have been met.
5.15 OUTSIDE SUPPORT SERVICES AND SUPPLIES
a) Where the laboratory procures outside services and supplies, other than those referred to in this
Standard, in support of tests, the laboratory shall use only those outside support services and
supplies that are of adequate quality to sustain confidence in the laboratory's tests.
b) Where no independent assurance of the quality of outside support services or supplies is
available, the laboratory shall have procedures to ensure that purchased equipment, materials
and services comply with specified requirements. The laboratory should, wherever possible,
ensure that purchased equipment and consumable materials are not used until they have been
inspected, calibrated or otherwise verified as complying with any standard specifications
relevant to the calibrations or tests concerned.
c) The laboratory shall maintain records of all suppliers from whom it obtains support services or
supplies required for tests.
5.16 COMPLAINTS
The laboratory shall have documented policy and procedures for the resolution of complaints
received from clients or other parties about the laboratory's activities. Where a complaint, or any
other circumstance, raises doubt concerning the laboratory's compliance with the laboratory's
policies or procedures, or with the requirements of this Standard or otherwise concerning the quality
of the laboratory's calibrations or tests, the laboratory shall ensure that those areas of activity and
responsibility involved are promptly audited in accordance with Section 5.5.3.1. Records of the
complaint and subsequent actions shall be maintained.
Appendix A - REFERENCES
40 CFR Part 136, Appendix A, paragraphs 8.1.1 and 8.2
American Association for Laboratory Accreditation, April 1996. General Requirements for
Accreditation
"American National Standards Specification and Guidelines for Quality Systems for
Environmental Data Collection and Environmental Technology Programs (ANSI/ASQC E-4)",
1994
Catalog of Bacteria, American Type Culture Collection, Rockville, MD
EPA 2185 - Good Automated Laboratory Practices, 1995 available at
www.epa.gov/docs/etsdwe1/irm_galp/
"Glossary of Quality Assurance Terms and Acronyms", Quality Assurance Division, Office of
Research and Development, USEPA
"Guidance on the Evaluation of Safe Drinking Water Act Compliance Monitoring Results from
Performance Based Methods", September 30, 1994, Second draft.
International vocabulary of basic and general terms in metrology (VIM): 1984. Issued by BIPM,
IEC, ISO, and OIML
ISO Guide 3534-1: "Statistics, vocabulary and symbols - Part 1: Probability and general
statistical terms"
ISO Guide 7218: Microbiology - General Guidance for Microbiological Examinations
ISO Guide 8402: 1986. Quality - Vocabulary
ISO Guide 9000: 1994 Quality management and quality assurance standards - Guidelines for
selection and use
ISO Guide 9001: 1994 Quality Systems - Model for quality assurance in design/development,
production, installation and servicing
ISO Guide 9002: 1994 Quality systems - Model for quality assurance in production and
installation
ISO/IEC Guide 2: 1986. General terms and their definitions concerning standardization and
related activities
ISO/IEC Guide 25: 1990. General requirements for the competence of calibration and testing
laboratories
"Laboratory Biosafety Manual", World Health Organization, Geneva, 1983
Manual for the Certification of Laboratories Analyzing Drinking Water, Revision 4, EPA 815-B-97-001
Manual of Methods for General Bacteriology, Philipp Gerhardt et al., American Society for
Microbiology, Washington, 1981
Performance Based Measurement System, EPA EMMC Method Panel, PBM workgroup, 1996
Appendix B - DEFINITIONS FOR QUALITY SYSTEMS
The following definitions are used in the text of Quality Systems. In writing this document, the
following hierarchy of definition references was used: ISO 8402, ANSI/ASQC E-4, EPA's Quality
Assurance Division Glossary of Terms, and finally definitions developed by NELAC and/or the
Quality Assurance Standing Committee. The source of each definition is noted.
Acceptance Criteria: specified limits placed on characteristics of an item, process, or service
defined in requirement documents. (ASQC)
Accreditation: the process by which an agency or organization evaluates and recognizes a
program of study or an institution as meeting certain predetermined qualifications or standards,
thereby accrediting the laboratory. In the context of the National Environmental Laboratory
Accreditation Program (NELAP), this process is a voluntary one. (NELAC)
Accrediting Authority: the agency having responsibility and accountability for environmental
laboratory accreditation and who grants accreditation. For the purposes of NELAC, this is EPA,
other federal agencies, or the state. (NELAC)
Accuracy: the degree of agreement between an observed value and an accepted reference value.
Accuracy includes a combination of random error (precision) and systematic error (bias) components
which are due to sampling and analytical operations; a data quality indicator. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Analytical Detection Limit: the smallest amount of an analyte that can be distinguished in a
sample by a given measurement procedure throughout a given (e.g., 0.95) confidence interval.
(Applicable only to radiochemistry)
Analytical Reagent (AR) Grade: designation for the high purity of certain chemical reagents and
solvents given by the American Chemical Society. (Quality Systems)
Assessor Body: the organization that actually executes the accreditation process, i.e., receives
and reviews accreditation applications, reviews QA documents, reviews proficiency testing results,
surveys the site, etc., whether EPA, the state, or contracted private party. (NELAP)
Batch: environmental samples which are prepared and/or analyzed together with the same process
and personnel, using the same lot(s) of reagents. A preparation batch is composed of one to 20
environmental samples of the same NELAC-defined matrix, meeting the above mentioned criteria
and with a maximum time between the start of processing of the first and last sample in the batch
to be 24 hours. An analytical batch is composed of prepared environmental samples (extracts,
digestates or concentrates) which are analyzed together as a group using the same calibration curve
or factor. An analytical batch can include prepared samples originating from various environmental
matrices and can exceed 20 samples. (Quality Systems)
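A minimal sketch of the preparation batch constraints in this definition (the same-process,
same-personnel and same-reagent-lot conditions are assumed rather than checked, and the function and
argument names are illustrative):

    from datetime import datetime, timedelta

    def valid_preparation_batch(matrices: list[str], start_times: list[datetime]) -> bool:
        """Check that a candidate preparation batch has 1 to 20 samples of a single
        NELAC-defined matrix and no more than 24 hours between the start of processing
        of the first and last sample (lists are parallel, one entry per sample)."""
        if not 1 <= len(matrices) <= 20:
            return False
        if len(set(matrices)) != 1:
            return False
        return max(start_times) - min(start_times) <= timedelta(hours=24)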
Blank: a sample that has not been exposed to the analyzed sample stream in order to monitor
contamination during sampling, transport, storage or analysis. The blank is subjected to the usual
analytical and measurement process to establish a zero baseline or background value and is
sometimes used to adjust or correct routine analytical results. (ASQC, Definitions of Environmental
Quality Assurance Terms, 1996)
Blind Sample: a subsample for analysis with a composition known to the submitter. The
analyst/laboratory may know the identity of the sample but not its composition. It is used to test the
analyst's or laboratory's proficiency in the execution of the measurement process.
Calibrate: to determine, by measurement or comparison with a standard, the correct value of each
scale reading on a meter or other device, or the correct value for each setting of a control knob. The
levels of the applied calibration standard should bracket the range of planned or expected sample
measurements.
Calibration: the set of operations which establish, under specified conditions, the relationship
between values indicated by a measuring instrument or measuring system, or values represented
by a material measure, and the corresponding known values of a measurand. (VIM - 6.13)
Calibration Curve: the graphical relationship between the known values, such as concentrations,
of a series of calibration standards and their analytical response.
Calibration Standard: a solution prepared from the primary dilution standard solution or stock
standard solutions and the internal standards and surrogate analytes. The Calibration solutions are
used to calibrate the instrument response with respect to analyte concentration. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Certified Reference Material (CRM): a reference material one or more of whose property values
are certified by a technically valid procedure, accompanied by or traceable to a certificate or other
documentation which is issued by a certifying body. (ISO Guide 30 - 2.2)
Chain of Custody: an unbroken trail of accountability that documents the physical security of
samples, data and records.
Confirmation: verification of the presence of a component through the use of an analytical
technique that differs from the original test method. These may include:
Second column confirmation
Alternate wavelength
Derivatization
Mass spectral interpretation
Alternative detectors or
Additional cleanup procedures.
Corrective Action: action taken to eliminate the causes of an existing nonconformity, defect or
other undesirable situation in order to prevent recurrence. (ISO 8402)
Data Audit: a qualitative and quantitative evaluation of the documentation and procedures
associated with environmental measurements to verify that the resulting data are of acceptable
quality (i.e., that they meet specified acceptance criteria).
Data Reduction: the process of transforming raw data by arithmetic or statistical calculations,
standard curves, concentration factors, etc., and collation into a more useful form.
Detection Limit: the lowest concentration or amount of the target analyte that can be determined
to be different from zero by a single measurement at a stated degree of confidence. See Method
Detection Limit.
Document Control: the act of ensuring that documents (and revisions thereto) are proposed,
reviewed for accuracy, approved for release by authorized personnel, distributed properly and
controlled to ensure use of the correct version at the location where the prescribed activity is
performed. (ASQC, Definitions of Environmental Quality Assurance Terms, 1996)
Duplicate Analyses: the analyses or measurements of the variable of interest performed identically
on two subsamples of the same sample. The results from duplicate analyses are used to evaluate
analytical or measurement precision but not the precision of sampling, preservation or storage
internal to the laboratory.
Environmental Detection Limit (EDL): the smallest level at which a radionuclide in an
environmental medium can be unambiguously distinguished for a given confidence interval using
a particular combination of sampling and measurement procedures, sample size, analytical detection
limit, and processing procedure. The EDL shall be specified for the 0.95 or greater confidence
interval. The EDL shall be established initially and verified annually for each test method and
sample matrix. (Radioanalysis Subcommittee)
Holding Times (Maximum Allowable Holding Times): the maximum times that samples may be
held prior to analysis and still be considered valid. (40 CFR Part 136).
Initial Demonstration of Capability: procedure to establish the ability of the laboratory to generate
acceptable accuracy and precision which is included in many of the EPA's analytical test methods.
In general the procedure includes the addition of a specified concentration of each analyte (using
a QC check sample) in each of four separate aliquots of laboratory pure water. These are carried
through the entire analytical procedure and the percentage recovery and the standard deviation are
determined and compared to specified limits. (40 CFR Part 136).
Internal Standard: a known amount of standard added to a test portion of a sample and carried
through the entire measurement process as a reference for evaluating and controlling the precision
and bias of the applied analytical test method.
Laboratory: Body that calibrates and/or tests.
NOTES:
1. In cases where a laboratory forms part of an organization that carries out other activities
besides calibration and testing, the term "laboratory" refers only to those parts of that
organization that are involved in the calibration and testing process.
2. As used herein, the term "laboratory" refers to a body that carries out calibration or testing
- at or from a permanent location,
- at or from a temporary facility, or
- in or from a mobile facility. (ISO 25)
Laboratory Control Sample (however named, such as laboratory fortified blank or spiked
blank): a sample matrix, free from the analytes of interest, spiked with verified known amounts
of analytes from a source independent of the calibration standards or a material containing known
and verified amounts of analytes. It is generally used to establish intra-laboratory or analyst specific
precision and bias or to assess the performance of all or a portion of the measurement system.
(NELAC).
Laboratory Duplicate: Aliquots of a sample taken from the same container under laboratory
conditions and processed and analyzed independently.
Legal Chain of Custody (COC): an unbroken trail of accountability that ensures the physical
security of samples, data and records. (Glossary of Quality Assurance Terms, QAMS, 8/31/92).
Limit of Detection (LOD): the lowest concentration level that can be determined by a single
analysis and with a defined level of confidence to be statistically different from a blank. (Analytical
Chemistry, 55, p.2217, December 1983, modified) See also Method Detection Limit.
Manager (however named): the individual designated as being responsible for the overall operation,
all personnel, and the physical plant of the environmental laboratory. A supervisor may report to
the manager. In some cases, the supervisor and the manager may be the same individual.
Matrix: The component or substrate which contains the analyte of interest. For purposes of batch
determination, the following matrix types shall be used:
- Aqueous: Any aqueous sample excluded from the definition of a drinking water matrix or
Saline/Estuarine source. Includes surface water, groundwater and effluents.
- Drinking water: Any aqueous sample that has been designated a potable or potential potable
water source.
- Saline/Estuarine: Any aqueous sample from an ocean or estuary, or other salt water source
such as the Great Salt Lake.
- Non-aqueous liquid: Any organic liquid with <15% settleable solids.
- Biological Tissue: Any sample of a biological origin such as fish tissue, shellfish, or plant
material. Such samples shall be grouped according to origin.
- Solids: Includes soils, sediments, sludges and other matrices with >15% settleable solids.
- Chemical Waste: A product or by-product of an industrial process that results in a matrix not
previously defined.
- Air Samples: Media used to retain the analyte of interest from an air sample such as sorbent
tubes or summa canisters. Each medium shall be considered as a distinct matrix. (Quality
Systems)
Matrix Spike (spiked sample, fortified sample): prepared by adding a known mass of target
analyte to a specified amount of matrix sample for which an independent estimate of target analyte
concentration is available. Matrix spikes are used, for example, to determine the effect of the matrix
on a method's recovery efficiency. (Glossary of Quality Assurance Terms, QAMS, 8/31/92).
Matrix Spike Duplicate (spiked sample/fortified sample duplicate): a second replicate matrix
spike is prepared in the laboratory and analyzed to obtain a measure of the precision of the recovery
for each analyte. (Glossary of Quality Assurance Terms, QAMS, 8/31/92).
May: permitted, but not required (TRADE)
Method Blank: a sample of a matrix similar to the batch of associated samples (when available)
that is free from the analytes of interest and is processed simultaneously with and under the same
conditions as samples containing an analyte of interest through all steps of the analytical procedures.
(NELAC).
Method Detection Limit: the minimum concentration of a substance (an analyte) that can be
measured and reported with 99% confidence that the analyte concentration is greater than zero and
is determined from analysis of a sample in a given matrix containing the analyte. (40 CFR Part 136
Appendix B).
Must: denotes a requirement that must be met. (Random House College Dictionary)
Negative Control: measures taken to ensure that a test, its components, or the environment do
not cause undesired effects, or produce incorrect test results.
NELAC: National Environmental Laboratory Accreditation Conference. A voluntary organization
of state and federal environmental officials and interest groups whose primary purpose is to establish
mutually acceptable standards for accrediting environmental laboratories. A subset of NELAP.
(NELAC)
NELAP: the overall National Environmental Laboratory Accreditation Program of which NELAC is
a part. (NELAC)
Performance Audit: the routine comparison of independently obtained quantitative measurement
system data with routinely obtained data in order to evaluate the proficiency of an analyst or
laboratory.
Performance Based Measurement System (PBMS): a set of processes wherein the data quality
needs, mandates or limitations of a program or project are specified and serve as criteria for
selecting appropriate test methods to meet those needs in a cost-effective manner.
Positive Control: measures taken to ensure that a test and/or its components are working properly
and producing correct or expected results from positive test subjects.
Precision: the degree to which a set of observations or measurements of the same property,
obtained under similar conditions, conform to themselves; a data quality indicator. Precision is
usually expressed as standard deviation, variance or range, in either absolute or relative terms.
(NELAC).
Preservation: refrigeration and/or reagents added at the time of sample collection to maintain the
chemical and/or biological integrity of the sample.
Proficiency Test Sample (PT): a sample, the composition of which is unknown to the analyst and
is provided to test whether the analyst/laboratory can produce analytical results within specified
acceptance criteria. (Glossary of Quality Assurance Terms, QAMS, 8/31/92).
Proficiency Testing: Determination of the laboratory calibration or testing performance by means
of interlaboratory comparisons. (ISO/IEC Guide 2 - 12.6, amended)
Proficiency Testing Program: the aggregate of providing rigorously controlled and standardized
environmental samples to a laboratory for analysis, reporting of results, statistical evaluation of the
results in comparison to peer laboratories and the collective demographics and results summary of
all participating laboratories.
Protocol: a detailed written procedure for field and/or laboratory operation (e.g., sampling, analysis)
which must be strictly followed.
Pure Reagent Water: shall be water in which no target analytes or interferences are present at a
concentration which would impact the results when using a particular analytical test method.
Quality Assurance: an integrated system of activities involving planning, quality control, quality
assessment, reporting and quality improvement to ensure that a product or service meets defined
standards of quality with a stated level of confidence. (Glossary of Quality Assurance Terms,
QAMS, 8/31/92).
Quality Control: the overall system of technical activities whose purpose is to measure and control
the quality of a product or service so that it meets the needs of users. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Quality Manual: A document stating the quality policy, quality system and quality practices of an
organization. This may be also called a Quality Assurance Plan or a Quality Plan.
NOTE - The quality manual may call up other documentation relating to the laboratory's quality
arrangements.
Quality System: a structured and documented management system describing the policies,
objectives, principles, organizational authority, responsibilities, accountability, and implementation
plan of an organization for ensuring quality in its work processes, products (items), and services.
The quality system provides the framework for planning, implementing, and assessing work
performed by the organization and for carrying out required QA and QC. (ANSI/ASQC E-4, 1994)
Quantitation Limits: the maximum or minimum levels, concentrations, or quantities of a target
variable (e.g., target analyte) that can be quantified with the confidence level required by the data
user. Quantitation limit, for the purposes of NELAC, is defined as 3.18 times the MDL, by
convention.
Range: the difference between the minimum and the maximum of a set of values.
Raw Data: any original factual information from a measurement activity or study recorded in a
laboratory notebook, worksheets, records, memoranda, notes, or exact copies thereof that are
necessary for the reconstruction and evaluation of the report of the activity or study. Raw data may
include photography, microfilm or microfiche copies, computer printouts, magnetic media, including
dictated observations, and recorded data from automated instruments. If exact copies of raw data
have been prepared (e.g., tapes which have been transcribed verbatim, dated and verified accurate
by signature), the exact copy or exact transcript may be submitted.
Reagent Blank (method reagent blank): a sample consisting of reagent(s), without the target
analyte or sample matrix, introduced into the analytical procedure at the appropriate point and
carried through all subsequent steps to determine the contribution of the reagents and of the
involved analytical steps. (Glossary of Quality Assurance Terms, QAMS, 8/31/92).
Reference Material: a material or substance one or more properties of which are sufficiently well
established to be used for the calibration of an apparatus, the assessment of a measurement
method, or for assigning values to materials. (ISO Guide 30-2.1)
Reference Standard: a standard, generally of the highest metrological quality available at a given
location, from which measurements made at that location are derived. (VIM - 6.08)
Requirement: a translation of the needs into a set of individual quantified or descriptive
specifications for the characteristics of an entity in order to enable its realization and examination.
Reference Toxicant: see D.2.1.a
Selectivity: (Analytical chemistry) the capability of a test method or instrument to respond to a
target substance or constituent in the presence of nontarget substances.
Sensitivity: the capability of a test method or instrument to discriminate between measurement
responses representing different levels (e.g., concentrations) of a variable of interest.
Shall: denotes a requirement that is mandatory whenever the criterion for conformance with the
specification requires that there be no deviation. This does not prohibit the use of alternative
approaches or methods for implementing the specification so long as the requirement is fulfilled.
(Style Manual for Preparation of Proposed American National Standards, American National
Standards Institute, eighth edition, March 1991).
Should: denotes a guideline or recommendation whenever noncompliance with the specification
is permissible. (Style Manual for Preparation of Proposed American National Standards, American
National Standards Institute, eighth edition, March 1991).
Standard Operating Procedures (SOPs): a written document which details the method of an
operation, analysis or action whose techniques and procedures are thoroughly prescribed and which
is accepted as the method for performing certain routine or repetitive tasks. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Spike: a known mass of target analyte added to a blank sample or subsample; used to determine
recovery efficiency or for other quality control purposes.
Standard Reference Material (SRM): a certified reference material produced by the U.S. National
Institute of Standards and Technology and characterized for absolute content, independent of
analytical test method.
Supervisor (however named): the individual(s) designated as being responsible for a particular
area or category of scientific analysis. This responsibility includes direct day-to-day supervision of
technical employees, supply and instrument adequacy and upkeep, quality assurance/quality control
duties and ascertaining that technical employees have the required balance of education, training
and experience to perform the required analyses.
Surrogate: a substance with properties that mimic the analyte of interest. It is unlikely to be found
in environmental samples and is added to them for quality control purposes. (Glossary of Quality
Assurance Terms, QAMS, 8/31/92).
Technical Director: Definition needs to be developed
Test: a technical operation that consists of the determination of one or more characteristics or
performance of a given product, material, equipment, organism, physical phenomenon, process or
service according to a specified procedure.
NOTE - The result of a test is normally recorded in a document sometimes called a test report
or a test certificate. (ISO/IEC Guide 2 -12.1, amended)
Test Method: defined technical procedure for performing a test.
Testing Laboratory: laboratory that performs tests. (ISO/IEC Guide 2 -12.4)
Test Sensitivity/Power: D.2.4.a
Tolerance Chart: A chart in which the plotted quality control data are assessed against a tolerance
level (e.g., +/- 10% of a mean) based on the precision level judged acceptable to meet overall
quality/data use requirements, instead of statistical acceptance criteria (e.g., +/- 3 sigma). (ANSI
N42.23-1995, Measurement and Associated Instrument Quality Assurance for Radioassay Laboratories)
Traceability: the property of a result of a measurement whereby it can be related to appropriate
standards, generally international or national standards, through an unbroken chain of comparisons.
(VIM-6.12)
Verification: confirmation by examination and provision of evidence that specified requirements
have been met.
NOTE - In connection with the management of measuring equipment, verification provides a
means for checking that the deviations between values indicated by a measuring instrument and
corresponding known values of a measured quantity are consistently smaller than the maximum
allowable error defined in a standard, regulation or specification peculiar to the management of
the measuring equipment.
The result of verification leads to a decision either to restore in service, to perform adjustments,
or to repair, or to downgrade, or to declare obsolete. In all cases it is required that a written
trace of the verification performed shall be kept on the measuring instrument's individual record.
Validation: the process of substantiating specified performance criteria.
Appendix C - INITIAL DEMONSTRATION OF CAPABILITY
C.1 PROCEDURE FOR INITIAL DEMONSTRATION OF CAPABILITY
An initial demonstration of method performance must be made prior to using any test method, and
at any time there is a significant change in instrument type, personnel or test method (see
5.10.2.1).
All initial demonstrations, continuing demonstrations and method certification shall be documented
through the use of the forms in this appendix.
The following steps, which are adapted from the EPA test methods published in 40 CFR Part 136,
Appendix A, shall be performed:
a) A quality control sample shall be obtained from an outside source. If not available, the QC
check sample may be prepared by the laboratory using stock standards that are prepared
independently from those used in instrument calibration.
b) The concentrate shall be diluted in a volume of clean matrix sufficient to prepare four aliquots
at the required method volume to a concentration approximately 10 times the method-stated or
laboratory-calculated method detection limit.
c) The four aliquots shall be prepared and analyzed according to the test method either
concurrently or over a period of days.
d) Using the four results, calculate the average recovery (x̄) in the appropriate reporting units (such
as µg/L) and the sample standard deviation (s, calculated with n-1 in the denominator) in the
same units for each parameter of interest.
e) For each parameter, compare s and x̄ to the corresponding acceptance criteria for precision and
accuracy in the test method (if applicable) or in laboratory-generated acceptance criteria (if a
non-standard method). If s and x̄ for all parameters meet the acceptance criteria, the analysis
of actual samples may begin. If any one of the parameters exceeds the acceptance range, the
performance is unacceptable for that parameter.
f) When one or more of the tested parameters fail at least one of the acceptance criteria, the
analyst must proceed according to 1) or 2) below.
1) Locate and correct the source of the problem and repeat the test for all parameters of
interest beginning with c) above.
2) Beginning with c) above, repeat the test for all parameters that failed to meet criteria.
Repeated failure, however, will confirm a general problem with the measurement system.
If this occurs, locate and correct the source of the problem and repeat the test for all
compounds of interest beginning with c).
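The calculation and comparison in steps d) and e) can be sketched as below. The acceptance criteria
are supplied by the test method or generated by the laboratory, not by this appendix; here they are
assumed to be expressed as a percent-recovery window and a maximum relative standard deviation, which
is one common form.

    from statistics import mean, stdev

    def idoc_acceptable(results: list[float], spike_conc: float,
                        recovery_limits: tuple[float, float],
                        max_rsd_percent: float) -> bool:
        """Evaluate the four aliquot results for one parameter: compute the mean
        (x-bar) and the n-1 standard deviation (s), then compare them to the
        assumed accuracy and precision criteria."""
        assert len(results) == 4, "the initial demonstration uses four aliquots"
        x_bar = mean(results)     # average result, step d)
        s = stdev(results)        # sample standard deviation (n-1), step d)
        recovery_pct = 100.0 * x_bar / spike_conc
        rsd_pct = 100.0 * s / x_bar if x_bar else float("inf")
        low, high = recovery_limits
        return low <= recovery_pct <= high and rsd_pct <= max_rsd_percent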
C.2 CERTIFICATION STATEMENT
The following certification statement shall be used to document the completion of each initial
demonstration of capability. A copy of the certification statement shall be retained in the personnel
records of each affected employee (see 5.6.3 and 5.12.3.4.b).
Initial Demonstration of Capability
Certification Statement
Date: ______________                              Page ____ of ____
Laboratory Name:
Laboratory Address:
Analyst(s) Name(s):
Matrix:
(examples: laboratory pure water, soil, air, waste solid, leachate, sludge, other)
Method number, and Analyte, or Class of Analytes or Measured Parameters
(examples: barium by 200.7, trace metals by 6010, benzene by 8021, etc.)
We, the undersigned, CERTIFY that:
1. The analysts identified above, using the cited test method, which is in use at this
facility for the analyses of samples under the National Environmental Laboratory
Accreditation Program, have met the Initial Demonstration of Capability.
2. The test method was performed by the analyst(s) identified on this certification.
3. A copy of the test method and the laboratory-specific SOPs are available for all
personnel on-site.
4. The data associated with the initial demonstration of capability are true, accurate,
complete and self-explanatory (1).
5. All raw data (including a copy of this certification form) necessary to reconstruct
and validate these analyses have been retained at the facility, and that the associated
information is well organized and available for review by authorized inspectors.
Technical Director's Name and Title Signature Date
Quality Assurance Officer's Name Signature Date
This certification form must be completed each time an initial demonstration of capability study is
completed.
(1) True: Consistent with supporting data.
Accurate: Based on good laboratory practices consistent with sound scientific principles/practices.
Complete: Includes the results of all supporting performance testing.
Self-Explanatory: Data properly labeled and stored so that the results are clear and require no
additional explanation.
Appendix D - ESSENTIAL QUALITY CONTROL REQUIREMENTS
The quality control protocols specified by the laboratory's method manual (5.10.1.2) shall be
followed. The laboratory shall ensure that the essential standards outlined in Appendix D are
incorporated into its method manuals.
All quality control measures shall be assessed and evaluated on an on-going basis and quality
control acceptance criteria shall be used to determine the validity of the data. The laboratory shall
have procedures for the development of acceptance/rejection criteria where no method or regulatory
criteria exist.
D.1 CHEMICAL TESTING
D.1.1 Positive and Negative Controls
a) Negative Controls
1) Method Blanks - Shall be performed at a frequency of one per batch of samples per matrix
type per sample extraction or preparation method. The results of this analysis shall be one
of the QC measures to be used to assess batch acceptance. The source of contamination
must be investigated and measures taken to correct, minimize or eliminate the problem if
i) the blank contamination exceeds a concentration greater than 1/10 of the measured
concentration of any sample in the associated sample batch and
ii) the blank contamination exceeds the concentration present in the samples and is
greater than 1/10 of the specified regulatory limit.
Each sample in the affected batch must be assessed against the above criteria to determine
if the sample datum is acceptable. Any sample associated with the contaminated blank
shall be reprocessed for analysis or the results reported with appropriate data qualifying
codes.
b) Positive Controls
1) Laboratory Control Sample - (QC Check Samples) Shall be analyzed at a minimum of 1
per batch of 20 or less samples per matrix type per sample extraction or preparation method
except for analytes for which spiking solutions are not available such as total suspended
solids, total dissolved solids, total volatile solids, total solids, pH, color, odor, temperature,
dissolved oxygen or turbidity. The results of these samples shall be used to determine
batch acceptance. NOTE: the Matrix spike (see 2 below) may be used as a control as long
as the acceptance criteria are as stringent as the LCS.
2) Matrix Spikes (MS) - Shall be performed at a frequency of one in 20 samples per matrix
type per sample extraction or preparation method except for analytes for which spiking
solutions are not available such as, total suspended solids, total dissolved solids, total
volatile solids, total solids, pH, color, odor, temperature, dissolved oxygen or turbidity. The
selected sample(s) shall be rotated among client samples so that various matrix problems
may be noted and/or addressed. Poor performance in a matrix spike may indicate a
problem with the sample composition and shall be reported to the client whose sample was
used for the spike.
3) Surrogates - Surrogate compounds must be added to all samples, standards, and blanks,
for all organic chromatography methods except when the matrix precludes its use or when
a surrogate is not available. Poor surrogate recovery may indicate a problem with the
sample composition and shall be reported to the client whose sample produced the poor
recovery.
4) If the test method does not specify the spiking compounds, the laboratory shall spike all
reportable components in the Laboratory Control Sample and Matrix Spike. However, in
cases where the components interfere with accurate assessment (such as simultaneously
spiking chlordane, toxaphene and PCBs in Method 608), the test method has an extremely
long list of components or components are incompatible, a representative number (10%)
of the listed components may be used to control the test method. The selected components
of each spiking mix shall represent all chemistries, elution patterns and masses and shall
include permit specified analytes and other client requested components. The laboratory
shall ensure, however, that all reported components are used in the spike mixture within a
two-year time period, and that no one component or components dominate the spike
mixture.
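The two blank contamination criteria in D.1.1.a.1 can be evaluated against each batch. The sketch
below reports the criteria separately and leaves their combination, and the reading of "the
concentration present in the samples," to the laboratory's documented procedure; all names are
illustrative.

    from typing import NamedTuple

    class BlankAssessment(NamedTuple):
        exceeds_tenth_of_a_sample: bool            # criterion i of D.1.1.a.1
        exceeds_samples_and_tenth_of_limit: bool   # criterion ii of D.1.1.a.1

    def assess_method_blank(blank_conc: float, sample_concs: list[float],
                            regulatory_limit: float) -> BlankAssessment:
        """Evaluate the method blank against its associated batch (all concentrations
        in the same units). Affected samples are reprocessed or reported with
        appropriate data qualifying codes, as described in D.1.1.a.1."""
        crit_i = any(blank_conc > conc / 10.0 for conc in sample_concs)
        crit_ii = (any(blank_conc > conc for conc in sample_concs)
                   and blank_conc > regulatory_limit / 10.0)
        return BlankAssessment(crit_i, crit_ii)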
D.1.2 Analytical Variability/Reproducibility
Matrix Spike Duplicates (MSDs) or Laboratory Duplicates - Shall be analyzed at a minimum of 1 in
20 samples per matrix type per sample extraction or preparation method. The laboratory shall
document its procedure for selecting the appropriate type of duplicate. The selected sample(s)
shall be rotated among client samples so that various matrix problems may be noted and/or
addressed. Poor performance in the duplicates may indicate a problem with the sample composition
and shall be reported to the client whose sample was used for the duplicate.
D.1.3 Method Evaluation
In order to ensure the accuracy of the reported result, the following procedures shall be in place:
a) Initial Demonstration of Analytical Capability - (Section 5.10.2.1) shall be performed initially
(prior to the analysis of any samples) and with a significant change in instrument type,
personnel, matrix or test method.
b) Calibration - Calibration protocols specified in Section 5.9.4 shall be followed.
c) Proficiency Test Samples - The results of such analyses (5.4.2.J or 5.5.3.4) shall be used by the laboratory to evaluate its ability to produce accurate data.
D.1.4 Method Detection Limits
Method detection limits (MDL) shall be determined by 40 CFR Part 136, Appendix B unless included in a test method or program.
a) An MDL study is not required for any component for which spiking solutions are not available, such as total suspended solids, total dissolved solids, total volatile solids, total solids, pH, color, odor, temperature, dissolved oxygen or turbidity.
b) The method detection limit shall be initially determined for the compounds of interest in
each test method in a clean matrix appropriate to the test method (such as laboratory pure
reagent water or Ottawa sand) or the matrix of interest (see definition of matrix).
c) All quantitatively reported results (i.e., those greater than 3.18 times the MDL) shall be
bracketed by calibration or calibration verification standards.
d) The MDL shall be verified annually by the preparation and analysis of at least one clean
matrix sample spiked at the current reported MDL. If the selected components cannot be
detected, the MDL study must be repeated.
e) All procedures used must be documented including the matrix type.
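For reference, the 40 CFR Part 136, Appendix B procedure cited in D.1.4 computes the MDL as the standard deviation of at least seven replicate low-level spike results multiplied by the one-sided Student's t value at the 99% confidence level with n-1 degrees of freedom. A minimal sketch, illustrative only and not a substitute for the regulation:

    import statistics

    # One-sided Student's t values at the 99% confidence level, keyed by
    # degrees of freedom (n - 1); values taken from standard t tables.
    T99 = {6: 3.143, 7: 2.998, 8: 2.896, 9: 2.821, 10: 2.764}

    def method_detection_limit(replicate_results):
        """MDL = t(n-1, 0.99) * s, following the 40 CFR 136 Appendix B
        procedure (sketch only; the regulation is the requirement)."""
        n = len(replicate_results)
        if n - 1 not in T99:
            raise ValueError("add the t value for %d degrees of freedom" % (n - 1))
        s = statistics.stdev(replicate_results)   # sample standard deviation
        return T99[n - 1] * s

    # Example: seven replicate spikes near the expected detection limit (ug/L)
    print(round(method_detection_limit([0.48, 0.52, 0.45, 0.55, 0.50, 0.47, 0.53]), 3))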
D.1.5 Data Reduction
The procedures for data reduction, such as use of linear regression, shall be documented.
D.1.6 Quality of Standards and Reagents
a) The source of standards shall comply with 5.9.2.
b) Reagent Quality, Water Quality and Checks:
1) Reagents - In methods where the purity of reagents is not specified, analytical reagent grade shall be used. Reagents of lesser purity than those specified by the test method shall not be used. The labels on the containers should be checked to verify that the purity of the reagents meets the requirements of the particular test method. Such information shall be documented.
2) Water - The quality of water sources shall be monitored and documented and shall meet
method specified requirements.
D.1.7 Selectivity
a) Absolute retention time and relative retention time aid in the identification of components in chromatographic analyses and in evaluating the effectiveness of a column in separating constituents. The laboratory shall develop and document acceptance criteria for retention time windows.
b) A confirmation shall be performed to verify the compound identification when positive results are detected on a sample from a location that has not been previously tested by the laboratory. Such confirmations shall be performed on organic tests such as pesticides, herbicides, or acid extractables, or when recommended by the analytical test method, except when the analysis involves the use of a mass spectrometer. Confirmation is required unless stipulated in writing by the client. All confirmations shall be documented.
c) The laboratory shall develop and document acceptance criteria for mass spectral tuning.
D.1.8 Constant and Consistent Test Conditions
a) The laboratory shall assure that the test instruments consistently operate within the specifications required of the application for which the equipment is used.
b) Glassware Cleaning - Glassware shall be cleaned to meet the sensitivity of the test method.
Any cleaning and storage procedures that are not specified by the test method shall be
documented in laboratory records and SOPs.
D.2 WHOLE EFFLUENT TOXICITY
D.2.1 Positive and Negative Controls
a) Positive Control - Reference Toxicants - Reference toxicant tests indicate the sensitivity of the
test organisms being used and demonstrate a laboratory's ability to obtain consistent results with
the test method.
1) The laboratory must demonstrate its ability to obtain consistent results with reference
toxicants before it performs toxicity tests with effluents for permit compliance purposes.
i. An intralaboratory coefficient of variation (%CV) is not established for each test method.
However, a testing laboratory shall maintain control charts for the control performance
and reference toxicant statistical endpoint (such as NOEC or ECp) and shall evaluate
the intralaboratory variability with a specific reference toxicant for each test method.
In addition, a laboratory must produce test results that meet test acceptability criteria
(such as greater than 80% survival in the control) as specified in the specific test
method.
ii. Intra-laboratory precision must be determined on an ongoing basis through the use of reference toxicant tests and plotted on quality control charts. As specified in the test methods, the control charts shall be plotted as point estimate values, such as EC25 for chronic tests and LC50 for acute tests, over time within a laboratory.
2) The frequency of reference toxicant testing shall comply with the EPA or state permitting
authority requirements.
3) The USEPA test methods EPA/600/4-91-002, EPA/600/4-91-003 and EPA/600/4-90-027F do not currently specify a particular reference toxicant and dilution series; however, if the state or permitting authority identifies a reference toxicant or dilution series for a particular test, the laboratory shall follow the specified requirements.
4) Test Acceptability Criteria (TAC) - The test acceptability criteria (for example, the chronic Ceriodaphnia test requires 80% or greater survival and an average of 15 young per female in the controls) as specified in the test method must be achieved for both the reference toxicant and effluent test. The criteria shall be calculated and shall meet the method-specified requirements for performing toxicity tests:
i. The control population of Ceriodaphnia shall contain no more than 20% males.
ii. An individual test may be conditionally acceptable if temperature, dissolved oxygen, pH
and other specified conditions fall outside specifications, depending on the degree of
the departure and the objectives of the tests (see test conditions and test acceptability
criteria specified for each test method). The acceptability of the test shall depend on
the experience and professional judgment of the technical employee and the permitting
authority.
b) Negative Control - Control, Brine Control or Dilution Water - The standards for the use, type and
frequency of testing are specified by the test methods and by permit and shall be followed.
D.2.2 Variability and/or Reproducibility
Intra-laboratory precision shall be determined on an ongoing basis through the use of further
reference toxicant tests and related control charts as described in item D.2.1.a above.
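One common way to implement the charting described above is to track the point-estimate values (e.g., EC25 or LC50) from successive reference toxicant tests and compute a percent coefficient of variation and control limits. The sketch below is illustrative; the test methods define the charting details, and the +/- 2 standard deviation limits are an assumed convention for the example.

    import statistics

    def reference_toxicant_chart(point_estimates):
        """Summarize successive reference toxicant results (e.g., LC50 values)
        as a mean, %CV and +/- 2 s control limits for a quality control chart."""
        mean = statistics.mean(point_estimates)
        s = statistics.stdev(point_estimates)
        return {"mean": mean,
                "%CV": 100.0 * s / mean,
                "lower_limit": mean - 2 * s,
                "upper_limit": mean + 2 * s}

    # Example: LC50 values (mg/L) from the last several reference toxicant tests
    print(reference_toxicant_chart([1.9, 2.1, 2.3, 2.0, 1.8, 2.2]))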
D.2.3 Accuracy
This principle is not applicable to Whole Effluent Toxicity.
D.2.4 Test Sensitivity
a) Test sensitivity (or test power) of the tests will depend in part on the number of replicates per concentration, the significance level selected (0.05), and the type of statistical analysis. If the variability remains constant, the sensitivity of the test will increase as the number of replicates is increased. Test sensitivity is the minimum significant difference (MSD) between the control and test concentration that is statistically significant. If Dunnett's procedure is used, the MSD shall be calculated according to the formula specified by the EPA test method and reported with the test results (an illustrative calculation follows item d below).
b) Estimate the MSD for non-normal distributions and/or heterogeneous variances.
c) Point estimates (LCp, ICp, or ECp) - Confidence intervals shall be reported as a measure of the precision around the point estimate value.
d) The MSD shall be calculated and reported only for chronic endpoints. In addition, the calculated endpoint is typically a lethal concentration of 50% (LC50); therefore, confidence intervals shall be reported as a measure of the precision around the point estimate value. In order to have sufficient replicates to perform a reliable MSD, such tests shall have a minimum of four replicates per treatment so that either parametric or nonparametric tests can be conducted.
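An illustrative calculation of the Dunnett MSD referenced in item a) is sketched below. The form shown, MSD = d x Sw x sqrt(1/n1 + 1/n), is the one commonly given in the EPA chronic toxicity manuals; the Dunnett critical value d is assumed to be taken from the method's tables, and the numeric inputs in the example are invented.

    from math import sqrt

    def minimum_significant_difference(dunnett_d, sw, n_control, n_treatment):
        """MSD = d * Sw * sqrt(1/n1 + 1/n): d is the Dunnett critical value
        (looked up from the method's tables), Sw is the square root of the
        ANOVA error mean square, n1 is the number of control replicates and
        n is the number of replicates per treatment."""
        return dunnett_d * sw * sqrt(1.0 / n_control + 1.0 / n_treatment)

    # Example with invented inputs: d = 2.44, Sw = 0.45, 10 replicates each
    print(round(minimum_significant_difference(2.44, 0.45, 10, 10), 3))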
D.2.5 Selection of Appropriate Statistical Analysis Methods
a) The methods of data analysis and endpoints will be specified by language in the permit or, if not
present in the permit, by the EPA methods manuals for Whole Effluent Toxicity.
b) Dose Response Curves - When required, the data shall be plotted in the form of a curve relating
the dose of the chemical to cumulative percentage of test organisms demonstrating a response
such as death.
D.2.6 Selection and Use of Reagents and Standards
a) The grade of all reagents used in Whole Effluent Toxicity tests is specified in the test method
except the reference standard. All reference standards shall be prepared from chemicals which
are analytical reagent grade or better. The preparation of all standards and reference toxicants
shall be documented.
b) All standards and reagents associated with chemical measurements, such as dissolved oxygen,
pH or specific conductance, shall comply with the standards outlined in Appendix D.1 above.
D.2.7 Selectivity
This principle is not applicable. The selectivity of the test is specified by permit.
D.2.8 Constant and Consistent Test Conditions
a) If closed refrigerator-sized incubators are used, culturing and testing of organisms shall be
separated to avoid loss of cultures due to cross-contamination.
b) The laboratory or a contracted outside expert shall positively identify test organisms to species on an annual basis. The taxonomic reference (citation and page(s)) and the name(s) of the taxonomic expert(s) must be kept on file at the laboratory.
c) Instruments used for routine measurements of chemical and physical parameters such as pH, DO, conductivity, salinity, alkalinity, hardness, chlorine, and weight shall be calibrated and/or standardized per manufacturer's instructions and Section D.1. Temperature shall be calibrated per Section 5.9.4.2.1. All measurements and calibrations shall be documented.
d) Test temperature shall be maintained as specified in the methods manuals. The average daily temperature of the test solutions must be maintained within 1 °C of the selected test temperature for the duration of the test. The minimum frequency of measurement shall be once per 24-hour period. The test temperature for continuous-flow toxicity tests shall be recorded and monitored continuously.
e) Water used for culturing and testing shall be analyzed for toxic metals and organics annually
or whenever the minimum acceptability criteria for control survival, growth or reproduction are
not met and no other cause, such as contaminated glassware or poor stock, can be identified.
The method specified analytes and concentration levels shall be followed.
f) New batches of food used for culturing and testing shall be analyzed for toxic organics and metals. If food combinations or recipes are used, analyses shall be performed on the final product upon the use of a new lot of any ingredient. If the concentration of total organic chlorine exceeds 0.15 µg/g wet weight, or the total concentration of organochlorine pesticides plus PCBs exceeds 0.30 µg/g wet weight, or toxic metals exceed 20 µg/g wet weight, the food must not be used.
g) Test chamber size and test solution volume shall be as specified in the methods manuals.
h) Test organisms shall be fed the quantity and type of food specified in the methods manuals. They shall also be fed at the intervals specified in the test methods.
i) Light intensity shall be maintained as specified in the methods manuals. Measurements shall
be made and recorded on a yearly basis. Photoperiod shall be maintained as specified in the
test methods and shall be documented at least quarterly. For algal tests, the light intensity shall
be measured and recorded at the start of each test.
j) At a minimum, during chronic testing DO and pH shall be measured daily in at least one replicate of each concentration. DO may be measured in new solutions prior to organism transfer, in old solutions after organism transfer, or both.
k) All cultures used for testing shall be maintained as specified in the methods manuals.
l) Age and the age range of the test organisms must be as specified in the manuals.
m) The maximum holding time (elapsed time from sample collection to first use in a test) shall not exceed 36 hours without the permission of the permitting authority.
n) All samples shall be chilled to 4 °C during or immediately after collection. They shall be
maintained at a temperature range from just above the freezing temperature of water to 6 °C and
the arrival temperature shall be no greater than 6 °C. Samples that are hand delivered to the
laboratory immediately after collection (i.e., within 1 hour) may not meet the laboratory
temperature acceptance criteria. In these cases, the laboratory may accept the samples if there
is evidence (such as arrival on ice) that the chilling process has begun.
o) Organisms obtained from an outside source must be from the same batch.
D.3 MICROBIOLOGY
These standards apply to laboratories undertaking the examination of materials, products and
substances involving microbiological analysis, recovery or testing. The procedures involve the
culture media, the test sample and the microbial species being isolated, tested or enumerated.
a) Microbiological testing refers to and includes the detection, isolation, enumeration and
identification of microorganisms and their metabolites, as well as sterility testing. It includes
assays using microorganisms as part of a detection system and their use for ecological testing.
b) These standards are concerned with the quality of test results and not specifically with health
and safety measures. In the performance of microbiological testing, laboratories must be aware
of and have SOPs that conform with local, state, and national regulatory policies for the safety
and health of personnel.
c) Clothing appropriate to the type of testing being performed shall be worn, and often includes
protection for hair, beard, hands and shoes. Protective clothing worn in the microbiological
laboratory shall be removed before leaving the restricted area.
D.3.1 Positive and Negative Controls
a) Negative Controls
The laboratory shall demonstrate that the cultured samples have not been contaminated through sample handling/preparation or environmental exposure. These controls shall include sterility checks of media and blanks such as filtration blanks.
1) All blanks and uninoculated controls specified by the test method shall be prepared and
analyzed at the frequency stated in the method.
2) A minimum of one uninoculated control shall be prepared and analyzed unless the same
equipment set is used to prepare multiple samples. In such cases, the laboratory shall
prepare a series of blanks using the equipment. At least one beginning and ending control
shall be prepared, with additional controls inserted after every 10 samples.
b) Positive Controls
Positive controls demonstrate that the medium can support the growth of the test organism, and
that the medium produces the specified or expected reaction to the test organism.
On a monthly basis each lot of media shall be tested with at least one pure culture of a known
positive reaction and shall be included with the sample test batch.
D.3.2 Test Variability/Reproducibility
a) Duplicates - At least 5% of the suspected positive samples shall be duplicated. In laboratories
with more than one analyst, each shall make parallel analyses on at least one positive sample
per month.
b) Where possible, participation in, or organization of, collaborative trials, proficiency testing, or interlaboratory comparisons, either formal or informal, must be done.
D.3.3 Method Evaluation
a) In order to demonstrate the suitability of a test method for its intended purpose, the laboratory
shall demonstrate and document its ability to meet acceptance criteria either specified by the
method or by the EPA or State program requirements. Acceptance criteria must meet or
exceed these requirements and must demonstrate that the test method provides
correct/expected results with respect to specified detection capabilities, selectivity, and
reproducibility.
1) Accepted (official) test methods or commercialized test kits for official test methods, or test
methods from recognized national or international standard organizations, may not require
a specific validation. Laboratories are required, however, to demonstrate proficiency with
the test method prior to first use. This can be achieved by simultaneous, side-by-side
analysis by several analysts.
2) Qualitative microbiological test methods in which the response is expressed in terms of
presence/absence, shall be validated by estimating, if possible, the specificity, and
reproducibility. The differences due to the matrices must be taken into account when testing
different sample types.
3) The validation of microbiological test methods shall be performed under the same
conditions as those for routine sample analysis. This can be achieved by using a
combination of naturally contaminated products and spiked products with results that can
be statistically analyzed to demonstrate that the test meets its intended purpose.
4) All validation data shall be recorded and stored at least as long as the test method is in
force, or if withdrawn from active use, for at least 5 years past the date of last use.
b) Laboratories shall participate in the Proficiency Test programs (interlaboratory) identified by
NELAP (5.4.2.J or 5.5.3.4).
D.3.4 Test Performance
All growth and recovery media must be checked to assure that the target organisms respond in an
acceptable and predictable manner (see D.3.1.b).
D.3.5 Data Reduction
a) The calculations, data reduction and statistical interpretations specified by each test method
shall be followed.
b) If the test method specifies colony counts, such as membrane filter or colony counting, then the
ability of individual analysts to count colonies shall be verified at least once per month, by
having two or more analysts count colonies from the same plate.
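The monthly analyst counting check in item b) can be documented by comparing each analyst's count of the same plate against a laboratory-defined limit. The sketch below is illustrative only; the 10% limit is an assumed example, not a value set by this standard.

    def count_check(counts, max_percent_difference=10.0):
        """counts: colony counts of the same plate by two or more analysts.
        Returns True if the spread of counts, relative to their mean, is
        within the laboratory-defined limit (10% is only an assumed example)."""
        mean = sum(counts) / len(counts)
        spread = 100.0 * (max(counts) - min(counts)) / mean
        return spread <= max_percent_difference

    # Example: two analysts count 52 and 55 colonies on the same membrane filter
    print(count_check([52, 55]))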
D.3.6 Quality of Standards, Reagents and Media
The laboratory shall ensure that the quality of the reagents and media used is appropriate for the
test concerned.
a) Culture media may be prepared in the laboratory from the different chemical ingredients, from
commercial dehydrated powders or may be purchased ready to use.
b) Reagents, commercial dehydrated powders and media shall be used within the shelf-life of the
product and shall be documented according to 5.10.5. The laboratory shall retain all
manufacturer supplied "quality specification statements" which may contain such information
as shelf life of the product, storage conditions, sampling regimen/rate, sterility check including
acceptability criteria, performance checks including the organism used, their culture collection
reference and acceptability criteria, date of issue of specification, or statements assuring that
the relevant product batch meets the product specifications.
c) Distilled water, deionized water or reverse osmosis produced water free from bactericidal and
inhibitory substances shall be used in the preparation of media solutions and buffers. Where
required by the test method, the quality of the water (such as pH, chlorine residual, specific
conductance or metals) shall be monitored at the specified frequency and evaluated according
to the stated standards. Records shall be maintained on all activities.
d) Media, solutions and reagents shall be prepared, used and stored according to a documented
procedure following the manufacturer's instructions or the test method.
e) All laboratory media shall be checked to ensure they support the growth of specific microbial cultures. In addition, selective media shall be checked to ensure they suppress the growth of non-target organisms. Media purchased pre-prepared from the manufacturer shall be checked monthly. In preference to the commonly used streak method, it is better to use a quantitative procedure, in which a known (often low) number of relevant organisms is inoculated into the medium under test and the recovery evaluated.
f) Each lot of laboratory detergent shall be checked, such as by an inhibitory residue test, to ensure that residues from the detergent do not inhibit or promote the growth of microorganisms.
D.3.7 Selectivity
a) All confirmation/verification tests specified by the test method shall be performed according to
method protocols.
b) In order to demonstrate traceability and selectivity, laboratories shall use reference cultures of
microorganisms obtained from a recognized national collection or an organization recognized
by the assessor body.
1) Reference cultures may be subcultured once to provide reference stocks. Appropriate purity
and biochemical checks shall be made and documented. The reference stocks shall be
preserved by a technique which maintains the desired characteristics of the strains.
Examples of such methods are freeze-drying, liquid nitrogen storage and deep-freezing
methods. Reference stocks shall be used to prepare working stocks for routine work. If
reference stocks have been thawed, they must not be re-frozen and re-used.
2) Bacterial working stocks shall not be subcultured under normal conditions. However, working stocks may be subcultured up to a defined number of subcultures when:
i. it is required by standard test methods, or
ii. laboratories can provide documentary evidence demonstrating that there has been no
loss of viability, no changes in biochemical activity and/or no change in morphology.
3) Working stocks shall not be subcultured to replace reference stocks.
4) A scheme for handling reference cultures is included in Figure D-1.
D.3.8 Constant and Consistent Test Conditions
a) The laboratory shall devise an appropriate environmental monitoring program to indicate trends in levels of contamination appropriate to the type of testing being carried out. Acceptable background counts shall be determined and there shall be documented procedures to deal with situations in which these limits are exceeded.
b) Walls, floors, ceilings and work surfaces shall be non-absorbent and easy to clean and disinfect. Wooden surfaces of fixtures and fittings shall be adequately sealed. Measures shall be taken to avoid the accumulation of dust by the provision of sufficient storage space, by having minimal paperwork in the laboratory and by prohibiting plants and personal possessions from the laboratory work area.
c) Temperature measurement devices
1) Where the accuracy of temperature measurement has a direct effect on the result of the analysis, temperature measuring devices such as liquid-in-glass thermometers, thermocouples and platinum resistance thermometers used in incubators, autoclaves and other equipment shall be of the appropriate quality to achieve the specification in the test method. The graduation of the temperature measuring devices must be appropriate for the required accuracy of measurement, and they shall be calibrated to national or international standards for temperature (see 5.9.2.1). Calibration shall be done at least annually.
2) The stability of temperature, uniformity of temperature distribution and time required to achieve equilibrium conditions in incubators, waterbaths, ovens and temperature-controlled rooms shall be established, taking into account, for example, the position, spacing and height of stacks of Petri dishes.
d) Autoclaves
1) The performance of each autoclave shall be initially evaluated by establishing its functional
properties, for example, heat distribution characteristics with respect to typical uses.
Autoclaves shall be capable of meeting specified temperature tolerances. Pressure cookers
fitted only with a pressure gauge are not recommended for sterilization of media or
decontamination of wastes.
2) Records of autoclave operations including temperature and time shall be maintained. This
shall be done for every cycle. Acceptance/rejection criteria shall be established and used
to evaluate the autoclave efficiency and effectiveness.
e) Volumetric equipment such as automatic dispensers, dispenser/diluters, mechanical hand pipettes and disposable pipettes may all be used in the microbiology laboratory. Regular checks as outlined in Section 5.9.4.2.1 shall be performed and documented.
Figure D-1. USE OF REFERENCE CULTURES (BACTERIA) - Flow Chart

Reference culture from a source recognized by NELAC
    ↓ (culture once)
Appropriate purity checks and biochemical tests
    ↓
Reference Stocks - retained under specified conditions (freeze-dried, liquid nitrogen storage, deep-frozen or other storage means, under specified conditions and storage times)
    ↓ (thaw/reconstitute)
Purity checks and biochemical tests as appropriate
    ↓
Working Stocks - maintained under specified conditions and storage times
    ↓
Regular/daily quality controls
f) Conductivity meters, oxygen meters, pH meters, hygrometers, and other similar measurement
instruments shall be calibrated according to the method specified requirements (see
Appendix D.1). Mechanical timers shall be checked regularly against electronic timing devices
to ensure accuracy.
D.4 RADIOCHEMICAL ANALYSIS
These standards apply to laboratories undertaking the examination of environmental samples by radiochemical analysis. These procedures for radiochemical analysis may involve some form of chemical separation followed by detection of the radioactive decay of the analyte (or indicative daughters) and of tracer isotopes where used. For the purpose of these standards, procedures for the determination of radioactive isotopes by mass spectrometry (e.g., ICP-MS or TIMS) or optical (e.g., KPA) techniques are not addressed herein.
D.4.1 Negative Controls
a) Method Blank - Shall be performed at a frequency of one per preparation batch. The results of
this analysis shall be one of the quality control measures to be used to assess batch acceptance.
The method blank result shall be assessed against the specific acceptance criteria [see 5.10.1.2.b)18] specified in the laboratory method manual [see 5.10.1.2]. When the specified method blank acceptance criteria are not met, the specified corrective action and contingencies [see 5.10.1.2.b)19 and 20] will be followed. The occurrence of a failed method blank acceptance criterion and the actions taken shall be noted in the laboratory report [see 5.13.a)11].
b) In the case of gamma spectrometry, where the sample matrix is simply aliquoted into a calibrated counting geometry, the method blank shall be of a similar counting geometry that is empty or filled to a similar volume with ASTM Type II water to partially simulate gamma attenuation due to a sample matrix.
c) There shall be no subtraction of the required method blank [see D.4.1 .a)] result from the sample
results in the associated preparation or analytical batch. This does not preclude the application
of any correction factor (e.g. instrument background, analyte presence in tracer, reagent
impurities, peak overlap, calibration blank, etc.) to all analyzed samples, both program/project
submitted and internal quality control samples. However, these correction factors shall not
depend on the required method blank result in the associated analytical batch.
d) The method blank acceptance criteria [see 5.10.1.2.b)18] shall address the presumed aliquot
size on which the method blank result is calculated and the manner in which the method blank
result is compared to sample results of differing aliquot size.
D.4.2 Positive Controls
a) Laboratory Control Samples - Shall be performed at a frequency of one per preparation batch.
The results of this analysis shall be one of the quality control measures to be used to assess
batch acceptance. The laboratory control sample result shall be assessed against the specific
acceptance criteria [see 5.10.1.2.b)18] specified in the laboratory method manual [see 5.10.1.2].
When the specified laboratory control sample acceptance criteria are not met, the specified corrective action and contingencies [see 5.10.1.2.b)19 and 20] will be followed. The occurrence
of a failed laboratory control sample acceptance criterion and the actions taken shall be noted in
the laboratory report [see 5.13.a)11].
b) Matrix Spike - Shall be performed at a frequency of one per preparation batch for those methods
which do not utilize an internal standard or carrier and for which there is a physical or chemical
separation process and where there is sufficient sample to do so. The results of this analysis
shall be one of the quality control measures to be used to assess batch acceptance. The matrix
spike result shall be assessed against the specific acceptance criteria [see 5.10.1.2.b)18] specified in the laboratory method manual [see 5.10.1.2]. When the specified matrix spike acceptance criteria are not met, the specified corrective action and contingencies [see 5.10.1.2.b)19 and 20] will be followed. The occurrence of a failed matrix spike acceptance criterion and the actions taken shall be noted in the laboratory report [see 5.13.a)11]. The lack
of sufficient sample aliquot size to perform a replicate analysis should be noted in the laboratory
report.
c) The activity of the laboratory control sample and matrix spike analyte(s) shall be greater than
ten times and less than one hundred times the a priori detection limit.
d) The laboratory standards used to prepare the laboratory control sample and matrix spike shall
be from a source independent of the laboratory standards used for instrument calibration.
e) Where a radiochemical method, other than gamma spectroscopy, has more than one reportable
analyte isotope (e.g. isotopic uranium: U-234, -235, and -238) only one of the analyte isotopes
need be included in the laboratory control or matrix spike sample at the indicated activity level.
However, where more than one analyte isotope is present above the specified activity level each
shall be assessed against the specified acceptance criteria.
f) Where gamma spectrometry is used to identify and quantitate more than one analyte isotope
the laboratory control sample and matrix spike shall contain isotopes that represent the low (e.g.
americium-241), medium (e.g. cesium-137) and high (e.g. cobalt-60) energy range of the
analyzed gamma spectra. As indicated by these examples the isotopes need not exactly
bracket the calibrated energy range or the range over which isotopes are identified and
quantitated.
D.4.3 Test Variability/Reproducibility
a) Replicate - Shall be performed at a frequency of one per preparation batch where there is
sufficient sample to do so. The results of this analysis shall be one of the quality control
measures to be used to assess batch acceptance. The replicate result shall be assessed against
the specific acceptance criteria [see 5.10.1.2.b)18] specified in the laboratory method manual
[see 5.10.1.2]. When the specified replicate acceptance criteria are not met, the specified corrective action and contingencies [see 5.10.1.2.b)19 and 20] will be followed. The occurrence of a failed replicate acceptance criterion and the actions taken shall be noted in the laboratory report [see 5.13.a)11].
D.4.4 Other Quality Control Measures
a) Tracer - For those methods that utilize a tracer (i.e. internal standard) each sample result will
have an associated tracer recovery calculated and reported. The tracer recovery for each sample result shall be one of the quality control measures to be used to assess the associated
sample result acceptance. The tracer recovery shall be assessed against the specific
acceptance criteria [see 5.10.1.2.b)18] specified in the laboratory method manual [see 5.10.1.2]. When the specified tracer recovery acceptance criteria are not met, the specified corrective action and contingencies [see 5.10.1.2.b)19 and 20] will be followed. The occurrence of a failed tracer recovery acceptance criterion and the actions taken shall be noted in the laboratory report [see 5.13.a)11].
b) Carrier - For those methods that utilize a carrier (i.e. internal standard) each sample will have
an associated carrier recovery calculated and reported. The carrier recovery for each sample
shall be one of the quality control measures to be used to assess the associated sample result
acceptance. The carrier recovery shall be assessed against the specific acceptance criteria [see
5.10.1.2.b)18] specified in the laboratory method manual [see 5.10.1.2]. When the specified carrier recovery acceptance criteria are not met, the specified corrective action and contingencies [see 5.10.1.2.b)19 and 20] will be followed. The occurrence of a failed carrier recovery acceptance criterion and the actions taken shall be noted in the laboratory report [see 5.13.a)11].
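The tracer or carrier recovery (chemical yield) reported with each sample result is simply the ratio of the amount recovered to the amount added. A minimal sketch, with invented example values:

    def recovery_percent(recovered, added):
        """Chemical yield for a tracer (activity units, e.g., Bq or dpm)
        or a carrier (mass units, e.g., mg): 100 * recovered / added."""
        return 100.0 * recovered / added

    # Example: 0.83 Bq of tracer recovered out of 1.00 Bq added
    print(round(recovery_percent(0.83, 1.00), 1))  # 83.0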
D.4.5 Method Evaluation
In order to ensure the accuracy of the reported result, the following procedures shall be in place:
a) Initial Demonstration of Capability - (section 5.10.2.1) shall be performed initially (prior to the
analysis of any samples) and with a significant change in instrument type, personnel or method.
b) Proficiency Test Samples - The results of such analysis (5.4.2.J or 5.5.3.4) shall be used by the
laboratory to evaluate the ability of the laboratory to produce accurate data. The providers of
such proficiency test samples should conform to the requirements of ANSI N42.22.
D.4.6 Radiation Measurement System Calibration
Due to the stability and response characteristics of modern radiation measurement instrumentation, it is not typically necessary to calibrate these systems on the day of use, as is done for some types of chemical measurement instrumentation. In addition, due to the nature of some radiation measurement instrument calibrations, it may not be practical to calibrate on a day-of-use basis. The calibration of modern radiation measurement instrumentation also differs significantly from that of chemical measurement instrumentation. This section addresses those practices that are necessary for proper calibration and those requirements of section 5.9.4.3 (Instrument Calibrations) that are not applicable to some types of radiation measurement instrumentation.
a) Calibration Curves
The requirements of 5.9.4.3.b)1 through 5.9.4.3.b)4 for the determination of the appropriate
number of standards for initial calibration are not applicable to the performance of radiochemical
methods. For those radiochemical methods that may require multiple standards for initial
calibration (e.g. gas-proportional counting and liquid scintillation counting) the required number
shall be addressed in the laboratory method manual [see 5.10.1.2.b)13] if not addressed in the method.
b) Calibration Curve Regression
The requirements of 5.9.4.3.c are not necessarily applicable for all radiochemical methods. Instead, where linear regression is used to fit standard response or calibration standard results to a calibration curve, the correlation coefficient shall be determined (an illustrative calculation follows item g below). Where non-linear regression is used to fit standard response or calibration standard results to a calibration curve, the correlation coefficient should be determined.
c) Calibration Range
The requirements of 5.9.4.3.d are not applicable to the performance of radiochemical methods
given the non-correlated event nature of decay counting instrumentation.
d) Calibration Verification
The Laboratory Control Sample may fulfill the requirements for the performance of an initial
calibration and continuing calibration verification standard as specified in section 5.9.4.4.1 and
5.9.4.4.2. The calibration verification acceptance criteria shall be the same as specified for the
Laboratory Control Sample.
e) Background Calibration - Background calibration measurements shall be made on a regular basis and monitored using control charts or tolerance charts to ensure that a laboratory maintains its capability to meet required data quality objectives. These values are subtracted from the total measured activity in the determination of the sample activity.
1) For gamma spectroscopy systems, background calibration measurements shall be
performed on at least a monthly basis.
2) For alpha spectroscopy systems, background calibration measurements shall be performed
on at least a monthly basis.
3) For gas-proportional and scintillation counters, background calibration measurements shall
be performed on a day of use basis.
f) Calibration - Instrument calibration shall be performed with reference standards as defined in
section D.4.9.a. The standards shall have the same general characteristics (i.e. geometry,
homogeneity, density, etc.) as the associated samples.
g) The frequency of calibration shall be addressed in the laboratory method manual [see
5.10.1.2.b)13] if not addressed in the method. A specific frequency (e.g., monthly), or observations from the associated control or tolerance chart, shall be specified as the basis for calibration.
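As a supplement to item b) above, the correlation coefficient for a linear calibration can be computed directly from the calibration standard values and instrument responses. The sketch below is illustrative only, with invented data.

    from math import sqrt

    def correlation_coefficient(x, y):
        """Pearson correlation coefficient r between calibration standard
        values x (e.g., activity) and instrument responses y (e.g., counts)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sxx = sum((a - mx) ** 2 for a in x)
        syy = sum((b - my) ** 2 for b in y)
        return sxy / sqrt(sxx * syy)

    # Example: four calibration standards (invented values)
    print(round(correlation_coefficient([1, 2, 5, 10], [105, 198, 510, 995]), 4))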
D.4.7 Method Detection Limits
Note: To be addressed in the next Chapter 5 revision.
D.4.8 Data Reduction
a) Refer to Section 5.10.6, "Computers and Electronic Data Related Requirements," of this document.
b) Method Uncertainties - The laboratory shall have the ability to trace all sources of method
uncertainties and their propagation to reported results. The ISO "Guide to the Expression of
Uncertainty in Measurement" and/or the NIST Technical Note 1297 on "Guidelines for
Evaluating and Expressing the Uncertainty of NIST Measurement Results" should be used in
this regard.
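As a simplified, GUM-style illustration of propagating method uncertainties to a reported result: for an uncorrelated, purely multiplicative model such as activity = net count rate / (efficiency x yield x aliquot size), relative standard uncertainties combine in quadrature. The sketch below is illustrative only and all numeric values are invented; the ISO Guide and NIST Technical Note 1297 remain the governing references.

    from math import sqrt

    def combined_relative_uncertainty(*relative_uncertainties):
        """Combine relative standard uncertainties of uncorrelated inputs of a
        purely multiplicative/divisive model by summing in quadrature (GUM)."""
        return sqrt(sum(u ** 2 for u in relative_uncertainties))

    # Example (invented values): counting statistics 3%, efficiency 2%,
    # chemical yield 4%, aliquot measurement 0.5%
    u_rel = combined_relative_uncertainty(0.03, 0.02, 0.04, 0.005)
    activity = 12.5          # Bq/L, illustrative result
    print(activity, "+/-", round(activity * u_rel, 2), "Bq/L (1 sigma)")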
D.4.9 Quality of Standards and Reagents
a) The quality control program shall establish and maintain provisions for radionuclide standards.
1) Reference standards that are used in a radiochemical laboratory shall be obtained from the
National Institute of Standards and Technology (NIST), EPA, or suppliers who participate
in supplying NIST standards or NIST-traceable radionuclides. Any reference standards purchased outside the United States shall be traceable to the issuing country's national standards laboratory. Commercial suppliers of reference standards should conform to ANSI
N42.22 to assure the quality of their products.
2) Reference standards shall be accompanied by a certificate of calibration whose content is as described in ANSI N42.22-1995, Section 8, Certificates.
3) Laboratories should consult with the supplier if the lab's verification of the activity of the
reference traceable standard indicates a noticeable deviation from the certified value. The
laboratory shall not use a value other than the decay corrected certified value.
b) All reagents used shall be analytical reagent grade or better.
D.4.10 Constant and Consistent Test Conditions
a) To prevent incorrect analysis results caused by the spread of contamination among samples,
the laboratory shall establish and adhere to written procedures to minimize the possibility of
cross-contamination between samples.
b) Instrument performance checks - Instrument performance checks using appropriate check
sources shall be performed on a regular basis and monitored with control charts or tolerance
charts to ensure that the instrument is operating properly and that the calibration has not
changed. The same check source used in the preparation of the tolerance chart or control chart
at the time of calibration shall be used in the performance checks of the instrument. The check
sources must provide adequate counting statistics for a relatively short count time and the
source should be sealed or encapsulated to prevent loss of activity and contamination of the
instrument and laboratory personnel. For alpha and gamma spectroscopy systems, the
instrument performance checks shall include checks on the counting efficiency and the
relationship between channel number and alpha or gamma ray energy.
1) For gamma spectroscopy systems, the performance checks for efficiency and energy
calibration shall be performed on a day of use basis along with performance checks on peak
resolution.
2) For alpha spectroscopy systems, the performance check for energy calibration shall be
performed on a day of use basis and the performance check for counting efficiency shall be
performed on at least a monthly basis.
3) For gas-proportional and scintillation counters, the performance checks for counting
efficiency shall be performed on a day of use basis.
D.5 AIR TESTING
Analyses for Air Toxics shall follow the essential quality controls for chemistry outlined in Appendix D.1. For air testing, the blank, laboratory control sample and a desorption efficiency check (such as for charcoal tubes) shall be used. Matrix spikes and duplicate samples shall be used when feasible.
Appendix E - PERFORMANCE BASED MEASUREMENT SYSTEM
RESERVED - The information presented here is the most recent EMMC Workgroup draft, and
is provided for information only.
E.1 CHECKLIST OVERVIEW
The Checklists present consensus among EPA's programs on performance "categories" that allow
use of the same Checklists across the Agency's various programs/projects. The Checklists may be
applied to screening and field techniques as well as traditional laboratory procedures.
Implementation of the Checklists is intended to be program-specific, and a category that does not apply within a specific EPA program or project will be indicated by NA (not applicable). Criteria for
a specific EPA program or project are to be filled in under the "Performance Criteria" column; e.g.,
an Office of Water Reference Method may specify 20% RSD or a correlation coefficient of 0.995
for the category that specifies calibration linearity, whereas an Office of Solid Waste project may
specify a Measurement Quality Objective of 12% RSD or a correlation coefficient of 0.998 for this
category.
For each EPA program or project, the checklists are to be completed for each matrix within each medium for which performance is demonstrated.
Each completed Checklist must be retained on file at the laboratory that uses the performance-
based method (PBM) or method modification and must be submitted to the appropriate regulatory
authority upon request to support analysis of those samples to which the PBM or modified method
was applied.
E.1.1 Header
Each page of the checklist contains six lines of header information, consisting of:
a) Date: enter the date that the checklist was completed and associated samples were collected.
b) Laboratory Name & Address: If the method is being employed by a commercial contract laboratory on behalf of one or more applicable clients, enter the name of the laboratory, followed if possible by a listing of the appropriate clients from which the samples were collected.
c) Discharge Point ID, where applicable.
d) Facility Name: enter the name of the water treatment facility, system, or regulated facility or
other program/project specified entity where the facility maintains an on-site analytical
laboratory.
e) EPA Program & Applicable Regulation: enter the name of the Agency program or project to which the results will be reported, or under the auspices of which the data are collected, e.g.,
"CAA" for Clean Air Act testing/monitoring and "SDWA" for analyses associated with the Safe
Drinking Water Act.
f) Medium: enter the type of environmental sample, e.g., water. NOTE: a separate checklist should be prepared for each matrix; e.g., for checklists associated with performance-based methods for SDWA, enter Drinking Water as the matrix type. As the evaluations of a performance-based method will involve matrix-specific performance measures, a separate checklist would be prepared for each matrix. The Medium is the environmental sample type to which the performance-based method applies, whereas the performance category Matrix, appearing in the body of the checklists, refers to the specific sample type within the Medium that was spiked; e.g., for the Medium hazardous waste, the checklist category Matrix may be solvent waste.
g) Analyte, Class of Analytes, or Other Measured Parameters - CAS # where available: As many methods apply to a large number of analytes, it is not practical to list every analyte in this field; as indicated on the form, the class of analytes may be listed here, e.g., volatile organics. However, if such a classification is used, a separate list of analytes and their respective Chemical Abstract Service Registry Numbers (CAS #) must be attached to the checklist.
E.1.2 EPA PBMS Checklist for Initial Demonstration of Method Performance
The Initial Demonstration of Method Performance involves multiple spikes into a defined sample
matrix (e.g., wastewater, paper plant effluent), to demonstrate that the Performance-based Method
meets the Program or Project Performance Criteria based on the performance of an established Reference Method or based on Measurement Quality Objectives (analytical portion of the Data
Quality Objectives). This exercise is patterned after the Initial Demonstration of Capability in C.1
of this appendix.
Footnote #1 indicates that a detailed narrative description of the initial demonstration procedure is
to be provided.
Footnote #2: For multi-analyte methods, enter "see attachment" and attach a list or table containing the analyte-specific performance criteria from the reference method or those needed to satisfy measurement quality objectives. Complete only one of the two columns. For multi-analyte methods it is suggested that the list also contain the information for the "Results Obtained" and "Performance Specification Achieved" columns.
Footnote #3 indicates that if a reference method is the source of the performance criteria, the
reference method should be appropriate for its intended application and the listed criteria should be
fully consistent with that reference method. The reference method name and EPA number (where
applicable) should be delineated.
There are 34 numbered entries in the body of the checklist; each program will indicate the
performance categories which do not pertain to the application/project, e.g., by listing as NA ("Not
Applicable") for the corresponding performance criteria.
#1. Written Method (addressing all elements in the EMMC format)
The details of the method used for analysis (and sampling, where applicable) should be described
in a version of the method written in EMMC format. The EMMC method format includes the
following sections: 1.0 Scope & Application; 2.0 Summary of Method; 3.0 Definitions; 4.0
Interferences; 5.0 Safety; 6.0 Equipment & Supplies; 7.0 Reagents & Standards; 8.0 Sample
Collection, Preservation & Storage; 9.0 Quality Control; 10.0 Calibration & Standardization; 11.0
Procedure; 12.0 Data Analysis & Calculations; 13.0 Method Performance; 14.0 Pollution Prevention;
15.0 Waste Management; 16.0 References; 17.0 Tables, Diagrams, Flowcharts & Validation Data.
While this format may differ from that used in standard operating procedures (SOPs) in a given laboratory, the use of a consistent format is essential for efficient and effective evaluation by inspectors and program and project managers/officers.
#2. Title, Number and date/revision of "Reference Method" if applicable.
For example, Polychlorinated Dioxins and Furans, EPA Method 1613, Revision B, October 1994.
#3. Copy of the reference method, if applicable, maintained at the facility.
A copy of the reference method should be available to all laboratory personnel; however, it need not be attached to the checklist itself.
#4. Differences between PBM and reference method attached, if applicable.
The laboratory should summarize the differences between the reference method and the
performance-based method and attach this summary to the checklist. This summary should focus
on significant differences in techniques (e.g., changes beyond the flexibility allowed in the reference
method), not minor deviations such as the glassware used.
#5. Concentrations of calibration standards.
The range of the concentrations of materials used to establish the relationship between the response
of the measurement system and analyte concentration. This range must bracket any action,
decision or regulatory limit. In addition, this range must include the concentration range for which
sample results are measured and reported.
#6. % RSD or Slope/Correlation Coefficient of Calibration Regression.
This performance category refers to quantitative measures describing the relationship between the
amount of material introduced into the measurement system and the response of the measurement
system, such as an analytical instrument. A linear response is generally expected and is typically
measured as either a linear regression (for inorganic analytes) or as the relative standard deviation
(or coefficient of variation) of the response factors or calibration factors (for organic analytes). For
example, traditional performance specifications consider any regression line with a correlation
coefficient (r) of 0.995 or greater as linear. Also, for organic analytes, a relative standard deviation
(RSD) of 15% or less is often considered linear (RCRA). The calibration relationship is not necessarily limited to a linear relationship. However, it should be remembered that if the Program/Project Office or Officer/Manager specifies another calibration relationship, e.g., a quadratic fit, more calibration standards are generally necessary to establish the calibration accurately. If applicable, a calibration curve (a graphical representation of the instrument response versus the concentration of the calibration standards) should be attached.
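For organic analytes, the linearity measure described above is typically the percent relative standard deviation of the response (or calibration) factors across the calibration standards. The sketch below is illustrative only, with invented data.

    import statistics

    def calibration_factor_rsd(concentrations, responses):
        """Percent RSD of the calibration factors (response / concentration)
        for each calibration standard; often compared to a linearity
        criterion such as the 15% RSD noted above."""
        factors = [r / c for c, r in zip(concentrations, responses)]
        return 100.0 * statistics.stdev(factors) / statistics.mean(factors)

    # Example: five-point calibration (ug/L vs. instrument area counts)
    print(round(calibration_factor_rsd([5, 10, 20, 50, 100],
                                       [5200, 10100, 19600, 51500, 99800]), 1))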
#7. Performance range tested (with units).
This range must reflect the actual range of sample concentrations that were tested and must include
the concentration units. Since the procedures may include routine sample dilution or concentration,
the performance range may be broader than the range of the concentrations of the calibration
standards.
#8. Sample(s) used in the initial demonstration have the recommended preservative, where applicable.
Sample(s) used in the initial demonstration should employ the recommended preservative, where
applicable. Answer "yes" if the preservation in the reference method was used. If "no", include a
narrative description of the testing done to support use of the alternate preservation technique.
#9. Sample(s) used in the initial demonstration must be within the recommended holding times, where applicable.
Unless holding time (the time from when a sample is collected until analysis) has been specifically evaluated, this entry should be taken directly from the reference method or standard table, where applicable. If holding time has been evaluated, include the study description and conclusions
of that evaluation here, with a reference to the specific study description. The data must be
attached.
#10. Interferences.
Enter information on any known or suspected interferences with the performance-based method.
Such interferences are difficult to predict in many cases, but may be indicated by unacceptable spike
recoveries in environmental matrices, especially when such recovery problems were not noted in
testing a clean matrix such as reagent water. The interferences associated with the reference method are to be indicated, as well as the effect of these interferences on the performance-based method.
#11. Qualitative identification criteria used.
Enter all relevant criteria used for identification, including such items as retention time, spectral wavelengths and ion abundance ratios. If the instrumental techniques for the performance-based method are similar to a reference method, use the reference method as a guide when specifying
identification criteria. If the list of criteria is lengthy, attach it on a separate sheet, and enter "see
attached" for this item.
#12. Performance Evaluation Studies performed for analytes of interest, where available (last study sponsor and title, last study number).
Several EPA programs conduct periodic performance evaluation (PE) studies. Organizations outside
of the Agency also may conduct such studies. Where available and applicable, enter the sponsor,
title, and date of the most recent study in which the performance-based method was applied to the
matrix of interest. A program/project may specify that a performance-based method be fully
successful, i.e., within the PE study QC acceptance criteria. Where applicable, provide a listing of
analytes for which the PE results were "not acceptable".
#13. Analysis of external reference material.
Enter the results of analyses on reference material from a source different from that used to prepare
calibration standards (if available). This performance category is especially important if
Performance Evaluation Studies are not available for the analytes of interest.
#14. Source of reference material.
Enter information, if applicable and available, for traceability of external reference materials used to verify the accuracy of the results, e.g., obtained from the National Institute of Standards and Technology (NIST).
#15. Surrogates used, if applicable.
Enter the names of the surrogate compounds used. Surrogates are often used in the analysis of organic
analytes. Surrogates may be added to samples prior to preparation, as a test of the entire analytical
procedure. These compounds are typically brominated, fluorinated or isotopically labeled, with
structural similarities to the analytes of interest. Target analytes of the method may be used as
surrogates, if they can be demonstrated not to be present in the samples to be analyzed.
#16. Concentrations of surrogates, if applicable.
Enter the concentration of surrogates once spiked into the sample (i.e., final concentration).
#17. Recoveries of Surrogates appropriate to the proposed use, if applicable.
Enter the summary of the surrogate recovery limits; attach a detailed listing if more space is needed.
#18. Sample Preparation.
Enter preliminary procedures, e.g., digestion, distillation and/or extraction. A detailed listing may
be attached if more space is needed.
#19. Clean-up Procedures.
Enter appropriate sample clean-up steps prior to the determinative step (instrumental analysis), e.g.,
GPC, copper, alumina treatment, etc.
#20. Method Blank Results.
A clean matrix (i.e., one that does not contain the analytes of interest) that is carried through the entire
analytical procedure, including all sample handling, preparation, extraction, digestion, cleanup and
instrumental procedures. The volume or weight of the blank should be the same as that used for
sample analyses. The method blank is used to evaluate the concentrations of analytes that may be
introduced into the samples as a result of background contamination in the laboratory. Enter the
analyte/s and concentration measured in the blank.
#21. Matrix (reagent water, drinking water, sand, waste solid, ambient air, etc.).
Refers to the specific sample type within the broader Medium that was spiked, e.g., for Medium:
Hazardous Waste an example matrix spiked as part of the initial demonstration of method
performance might be "solvent waste".
#22. Spiking System, appropriate to the method and application.
Enter the procedure by which a known amount of analyte/s ("spike") was added to the sample
matrix. This may include the solvent that is employed and the technique to be employed (e.g.,
permeation tube, or volumetric pipet delivery techniques spiked onto a soil sample and allowed to
equilibrate 1 day, etc.). Solid matrices and air are often difficult to spike and considerable detailed
narrative may be necessary to delineate the procedure. For spikes into aqueous samples, a water-miscible solvent is generally needed.
#23. Spike concentrations (w/units corresponding to final sample concentration).
Enter the amount of the analyte/s ("spike") that was added to the sample matrix in terms of the final
concentration in the sample.
#24. Source of spiking material.
Enter the organization or vendor from which the spiking material was obtained or how the spiking
material was prepared. This should include specific identification information, e.g., lot#, catalogue
number, etc.
#25. Number of Replicate Spikes.
The initial demonstration of method performance involves the analyses of replicate spikes into a
defined sample matrix (category #21). Enter the number of such replicates. For example, in the
NPDES and SDWA programs, at least 4 replicates should be prepared and analyzed independently.
#26. Precision (analyte by analyte).
Precision is a measure of agreement among individual determinations. Statistical measures of
precision include standard deviation, relative standard deviation or percent difference.
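As a minimal illustration (not part of the NELAC standard), the Python sketch below computes the standard
deviation and relative standard deviation for a set of hypothetical replicate determinations.

    import statistics

    # Hypothetical replicate determinations of a single analyte, in ug/L
    replicates = [9.8, 10.4, 10.1, 9.6]

    mean = statistics.mean(replicates)
    s = statistics.stdev(replicates)       # sample standard deviation
    rsd = 100.0 * s / mean                 # relative standard deviation, percent

    print(f"mean = {mean:.2f} ug/L, s = {s:.2f} ug/L, %RSD = {rsd:.1f}")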
#27. Bias (analyte by analyte).
Bias refers to the systematic or persistent distortion of a measurement process which causes errors
in one direction. Bias is often measured as the ratio of the measured value to the "true" value or
nominal value. Bias is often (erroneously) used interchangeably with "accuracy", despite the fact
that the two terms are complementary, that is, high "accuracy" implies low "bias" as well as good
precision. Enter the name of the bias measure (% recovery, difference from true value, etc.) and the
numeric value, with associated units, obtained for each analyte spiked in the initial
demonstration procedure.
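Continuing the hypothetical replicate example above, percent recovery (one common bias measure) can be
computed from the replicate spike results and the known spiked concentration, as in this sketch; the
numbers are illustrative only.

    # Hypothetical replicate spike results (ug/L) and the known spiked concentration
    measured = [9.8, 10.4, 10.1, 9.6]
    spiked_concentration = 10.0  # nominal ("true") value, ug/L

    recoveries = [100.0 * m / spiked_concentration for m in measured]
    mean_recovery = sum(recoveries) / len(recoveries)

    print("per-replicate %recovery:", [round(r, 1) for r in recoveries])
    print(f"mean %recovery (bias estimate) = {mean_recovery:.1f}")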
#28. Detection Limit (w/units; analyte by analyte), if applicable.
A general term for the lowest concentration at which an analyte can be detected and identified.
There are various approaches to establishing detection limits which include "Limit of Detection" and
"Method Detection Limit". Enter the approach used (e.g., MDL) and the analytical result with units
for each analyte in the matrix (see #21).
This performance category is of importance when operating at extremely low concentrations. If the
concentrations measured or the decisions to be made, e.g., action levels, are several orders of
magnitude above these concentrations, the "quantitation level" should be entered.
#29. Confirmation of Detection Limit, if applicable.
In addition to spikes into the matrix of interest (see #21) it may be beneficial to perform the detection
limit measurements in a clean matrix, e.g., laboratory pure water, air, sand, etc. Results of the
spikes in the clean matrix are frequently available in the Agency's published methods. Determining
MDLs in a clean matrix using the performance-based method will allow a comparison to the MDLs
published in the Agency methods.
This performance category is of importance when operating at extremely low concentrations. If the
concentrations measured or the decisions to be made, e.g., action levels, are several orders of
magnitude above these concentrations, the "quantitation level" should be entered.
Also, the detection limit technique may prescribe specific procedures to verify that the limit
obtained is correct, e.g., the "iterative process" detailed in the 40 CFR Part 136, Appendix B, MDL
procedure.
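For orientation, the single-pass calculation at the core of the 40 CFR Part 136, Appendix B procedure
multiplies the standard deviation of at least seven low-level replicate spikes by the one-sided Student's t
value at 99 percent confidence. The sketch below uses hypothetical results and omits the iterative
verification steps referenced above.

    import statistics

    # Hypothetical results from seven low-level replicate spikes, in ug/L
    replicates = [0.52, 0.44, 0.47, 0.55, 0.49, 0.46, 0.51]

    s = statistics.stdev(replicates)   # standard deviation of the replicates
    t_99 = 3.143                       # one-sided t value, 99% confidence, 6 degrees of freedom
    mdl = t_99 * s

    print(f"s = {s:.3f} ug/L, MDL = {mdl:.3f} ug/L")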
#30. Quantitation Limit (w/ units; analyte by analyte).
The lowest concentration at which the analyte can be reported with sufficient certainty that an
unqualified numeric value is reported. Approaches to establishing quantitation limits include the
Minimum Level (ML), Interim Minimum Level (IML), Practical Quantitation Level (PQL), and Limit
of Quantitation (LOQ). Enter the approach used to establish the quantitation limits, the value with
corresponding units for each analyte appropriate to the intended application, and a description of how
they were determined.
#31. Qualitative Confirmation.
Enter all relevant criteria used for identification, including such items as: retention time; use of
second chromatographic column; use of second (different) analytical technique; spectral
wavelengths, ion abundance ratios. If the instrumental techniques for the performance-based
method are similar to those of a reference method, use the reference method as a guide when
specifying confirmation criteria. If the list of criteria is lengthy, attach it on a separate sheet, and
enter "see attached" for this item.
#32. Frequency of performance of Initial Demonstration:
Enter the frequency with which the initial demonstration must be repeated.
#33-#34. Other Criteria.
Enter other necessary program/project specific method performance categories.
Signatures:
The printed name, signature and date of each analyst involved in the initial demonstration of method
performance are to be provided at the bottom of the checklist sheet.
E.1.3 EPA PBMS Checklist for Continuing Demonstration of Capability:
The process by which a laboratory documents that its previously established performance of an
analytical procedure continues to meet performance specifications as delineated in this checklist.
#1. Method Blank Result.
A clean matrix (i.e., does not contain the analytes of interest) that is carried through the entire
analytical procedure, including all sample handling, preparation, extraction, digestion, cleanup and
instrumental procedures. The volume or weight of the blank should be the same as that used for
sample analyses. The method blank is used to evaluate the levels of analytes that may be
introduced into the samples as a result of background contamination in the laboratory. Enter the
analyte/s and concentration measured in the blank.
#2. Concentrations of calibration standards used to verify working range, where applicable (include
units).
The range of the concentration(s) of materials used to confirm the established relationship between
the response of the measurement system and analyte concentration. This range should bracket any
action, decision or regulatory limit. In addition, this range must include the concentration range for
which sample results are measured and reported (when samples are measured after sample
dilution/concentration). Enter the concentrations of the calibration standards.
#3. Calibration Verification.
A means of confirming that the previously determined calibration relationship still holds. This
process typically involves the analyses of two standards with concentrations which bracket the
concentration(s) measured in the sample/s. Enter the procedure to be used to verify the calibration
and the results obtained for each analyte.
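As a rough sketch of how such checks might be computed (the linear model and the plus-or-minus 10
percent acceptance window are assumptions for illustration, not requirements of this standard), the
following Python example fits a calibration line to hypothetical standards and evaluates a verification
standard against it.

    import numpy as np

    # Hypothetical calibration standards: concentration (ug/L) vs. instrument response
    conc = np.array([1.0, 5.0, 10.0, 25.0, 50.0])
    resp = np.array([102.0, 498.0, 1010.0, 2485.0, 5040.0])

    slope, intercept = np.polyfit(conc, resp, 1)      # linear calibration fit
    r = np.corrcoef(conc, resp)[0, 1]                 # correlation coefficient

    # Hypothetical calibration verification standard
    cv_true, cv_resp = 20.0, 2035.0
    cv_found = (cv_resp - intercept) / slope
    drift = 100.0 * (cv_found - cv_true) / cv_true

    print(f"slope = {slope:.2f}, r = {r:.4f}")
    print(f"verification drift = {drift:+.1f}% (assumed acceptance window: +/-10%)")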
#4. Laboratory Control Sample.
An analytical standard carried through all aspects of the analytical method, e.g., digestions,
distillations and determinative steps/instrumentation. It is generally used to assess the performance
of the entire measurement system independent of the challenges of the sample matrix.
#5. External QC sample (where applicable).
Enter the results of analyses for reference material (e.g., quality control samples/ampoules) from
a source different from that used to prepare calibration standards (where applicable). Enter the
concentration, as well as the source, of this material. This performance category is of particular
importance if Performance Evaluation (PE) studies are not available for the analytes of interest.
#6. Performance Evaluation Studies performed for analytes of interest, where available (last study
sponsor and title; last study number).
Several EPA programs conduct periodic performance evaluation (PE) studies. Organizations outside
of the Agency also may conduct such studies. Where available and applicable, enter the sponsor,
title, and date of the most recent study in which the performance-based method was applied to the
matrix of interest. A program/project may specify that a performance-based method be fully
successful, i.e., that results fall within the PE study QC acceptance criteria.
#7. List of analytes for which results were "not acceptable" in the PE study, where available and
applicable.
#8. Surrogates used, if applicable.
Enter the names of the surrogate compounds used. Surrogates are often used in analysis of organic
analytes. Surrogates may be added to samples prior to preparation, as a test of the entire analytical
procedure. These compounds are typically brominated, fluorinated or isotopically labeled, with
structural similarities to the analytes of interest. Target analytes of the method may be used as
surrogates, if they can be demonstrated not to be present in the samples to be analyzed.
#9. Concentration of surrogates, if applicable.
Enter the concentration of surrogates once spiked into the sample (i.e., final concentration), with
units.
#10. Recoveries of Surrogates appropriate to the proposed use (if applicable).
Enter the summary of the surrogate recovery limits and attach a detailed listing (each surrogate
compound), if more space is needed.
#11. Matrix (reagent water, drinking water, sand, loam, clay, waste solid, ambient air, etc.).
Refers to the specific sample type within the broader "Medium" that was spiked, e.g., for Medium:
Waste an example matrix, spiked as part of the initial demonstration of method performance, might
be solvent waste.
#12. Matrix Spike Compounds.
Enter the analytes spiked. In preparing a matrix spike, a known amount of analyte is added to an
aliquot of a real-world sample matrix. This aliquot is analyzed to help evaluate the effects of the
sample matrix on the analytical procedure. Matrix spike results are typically used to calculate
recovery of analytes as a measure of bias for that matrix.
#13. Matrix Spike Concentrations (w/units corresponding to final sample concentration).
Enter the amount of the analyte/s or "spike" that was added to the sample matrix in terms of the final
concentration in the sample.
#14. Recovery of Matrix Spike (w/units).
The amount of analyte measured in the spiked sample, less any analyte measured in the unspiked
aliquot, relative to the known amount of analyte added; this value is usually expressed as a
percentage. Enter the recovery obtained for each matrix spike compound.
Note: Some programs/projects have utilized matrix spike duplicates (a separate duplicate of the
matrix spike) to help verify the matrix spike result and to provide precision data for analytes which
are not found in real-world samples, since duplicates of non-detects provide little information
concerning the precision of the method. See Item #19.
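For reference, matrix spike recovery and the relative percent difference (RPD) between a matrix spike and
its duplicate are commonly computed as in the hypothetical Python sketch below; the values and the use
of these formulas here are illustrative, not prescribed by this appendix.

    # Hypothetical results, all in ug/L
    unspiked_result = 2.0    # native analyte found in the unspiked sample aliquot
    spike_added = 20.0       # known amount of analyte spiked into each aliquot
    ms_result = 20.8         # matrix spike (MS) result
    msd_result = 19.4        # matrix spike duplicate (MSD) result

    ms_recovery = 100.0 * (ms_result - unspiked_result) / spike_added
    msd_recovery = 100.0 * (msd_result - unspiked_result) / spike_added
    rpd = 100.0 * abs(ms_result - msd_result) / ((ms_result + msd_result) / 2.0)

    print(f"MS recovery = {ms_recovery:.1f}%, MSD recovery = {msd_recovery:.1f}%, RPD = {rpd:.1f}%")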
#15. Qualitative identification criteria used.
Enter all relevant criteria used for identification, including such items as retention times, spectral
wavelengths, and ion abundance ratios. If the instrumental techniques for the performance-based
method are similar to a reference method, use the reference method as a guide when specifying
identification criteria. If the list of criteria is lengthy, attach it on a separate sheet, and enter "see
attached" for this item.
#16. Precision (analyte by analyte).
#17-18. Other category.
Enter other necessary program/project specific method performance categories.
Signatures:
The printed name, signature and date of each analyst involved in the continuing demonstration of method
performance are to be provided at the bottom of the checklist sheet.
EPA Performance-Based Measurement System
Certification Statement
Date: Page of
Laboratory Name & Address
Facility Name:
Discharge Point ID, where applicable:
EPA Program and Applicable Regulation:
Medium:
(i.e., water, soil, air, waste solid, leachate, sludge, other)
Analyte, Class of Analytes or Measured Parameters (CAS # where available)
(i.e., barium, trace metals, benzene, volatile organics, etc.)
We, the undersigned, CERTIFY that:
1. The methods in use at this facility for the analyses of samples for the programs of
the U.S. Environmental Protection Agency have met the Initial and any required Continuing
Demonstration of Method Performance Criteria specified under the Performance-Based
Measurement System.
2. A copy of the Performance-Based Method, written in EMMC format, and copies of
the reference method and laboratory-specific SOPs are available for all personnel on-site.
3. The data and checklists associated with the initial and continuing demonstration of
method performance are true, accurate, complete and self-explanatory (1).
4. All raw data (including a copy of this certification form) necessary to reconstruct and
validate these performance related analyses have been retained at the facility, and that
the associated information is well organized and available for review by authorized
inspectors.
Facility Manager's Name and Title Signature Date
Quality Assurance Officer's Name Signature Date
This certification form must be completed when the performance-based method is
originally certified, each time a continuing demonstration of method performance
is documented, and whenever a change of personnel involves the Facility Manager
or the Quality Assurance Officer.
(1) True: Consistent with supporting data.
Accurate: Based on good laboratory practices consistent with sound scientific
principles/practices.
Complete: Includes the results of all supporting performance testing.
Self-Explanatory: Data properly labeled and stored so that the results are clear
and require no additional explanation.
EPA PBMS
Checklist for Initial Demonstration of Method Performance
Provide a checklist for each matrix included in the demonstration.
Date: Page of
Laboratory Name & Address:
Facility Name:
Discharge Point ID, where applicable:
EPA Program and Applicable Regulation:
Medium:
(i.e., water, soil, air, waste solid, leachate, sludge, other)
Analyte, Class of Analytes or Other Measured Parameters (CAS #, where
available):
(i.e., barium, trace metals, benzene, volatile organics, etc.)
Initial Demonstration of Method Performance (1)
[Checklist table columns: Category | Performance Criteria (2) - Based on: Reference Method /
Measurement Quality Objective | Results Obtained | Perf. Spec. Achieved]
1. Written method (addressing all elements in the EMMC format)
attached
2. Title, number and date/rev. of "reference method", if applicable (3)
3. Copy of the reference method, if applicable, maintained at facility
4. Differences between PBM and reference method (if applicable)
attached
5. Concentrations of calibration standards
6. %RSD or slope/correlation coefficient of calibration regression
7. Performance range tested (with units)
8. Sample(s) used in initial demonstration have recommended
preservative, where applicable
9. Sample(s) used in initial demonstration met recommended
holding times, where applicable
10. Interferences
11. Qualitative identification criteria used
12. Performance Evaluation studies performed for analytes of
interest, where available:
Last study sponsor and title:
Last study number:
13. Analysis of external reference material
Last study sponsor and title:
Last study number:
List of analytes with "not acceptable" results:
14. Source of reference material
15. Surrogates used, if applicable
16. Concentrations of surrogates, if applicable
17. Recoveries of Surrogates appropriate to the proposed use, if
applicable
18. Sample preparation
19. Clean-up procedures
20. Method Blank Result
21 . Matrix (reagent water, drinking water, sand, waste solid, ambient
air, etc.)
22. Spiking system, appropriate to method and application
23. Spike concentrations (w/ units corresponding to final sample
concentration)
24. Source of spiking material
25. Number of replicate spikes
26. Precision (analyte by analyte)
27. Bias (analyte by analyte)
28. Detection Limit (w/ units; analyte by analyte)
29. Confirmation of Detection Limit, if applicable
30. Quantitation Limit (w/ units; analyte by analyte)
31. Qualitative Confirmation
32. Frequency of performance of the Initial Demonstration
33. Other criterion (specify)
34. Other criterion (specify)
Provide a detailed narrative description of the initial demonstration.
For multi-analyte methods, enter "see attachment" and attach a list or table containing
the analyte-specific performance criteria from the reference method or those needed to
satisfy measurement quality objectives.
If a reference method is the source of the performance criteria, the reference method
should be appropriate to the required application, and the listed criteria should be fully
consistent with that reference method.
Name and signature of each analyst involved in the initial demonstration of method
performance (includes all steps in the proposed method/modification):
Name Signature Date
Name Signature Date
Name Signature Date
The certification above must accompany this form each time it is submitted.
EPA PBMS
Checklist for Continuing Demonstration of Method Performance
Date: Page of
Facility Name:
Laboratory Name & Address:
Discharge Point ID, where applicable:
EPA Program and Applicable Regulation:
Medium:
(i.e., water, soil, air, waste solid, leachate, sludge, other)
Analyte, Class of Analytes or Measured Parameters (CAS # where available)
(i.e., barium, trace metals, benzene, volatile organics, etc.)
1. Method blank result (taken through all steps in the procedure)
2. Concentrations of calibration standards used to verify working
range (with units), where applicable
3. Calibration verification
4. Laboratory Control Sample
5. External QC sample (where available)
6. Performance evaluation (PE) studies, if applicable
Last study sponsor and title:
Last study number
7. List analytes for which results were "not acceptable" in PE study
8. Surrogates used, if applicable
9. Concentration of Surrogates, if applicable
10. Recovery of Surrogates (acceptance range for multianalyte
methods), if applicable
11. Matrix
12. Matrix spike compounds
13. Concentration of Matrix spike compounds
14. Recoveries of Matrix spike compounds
15. Qualitative identification criteria used
16. Precision (analyte by analyte)
17. Other category (specify)
18. Other category (specify)
Name and signature of each analyst involved in continuing
demonstration of method performance (includes all steps in the
proposed method/modification):
Name Signature Date
Name Signature Date
Name Signature Date
The certification above must accompany this form each time it is submitted.
ACCREDITING
AUTHORITY
Revision 7
July 2, 1998
TABLE OF CONTENTS
ACCREDITING AUTHORITY
6.0 ACCREDITING AUTHORITY 1
6.1 INTRODUCTION 1
6.2 GENERAL PROVISIONS 1
6.2.1 Reciprocity 2
6.2.2 Where to Apply for NELAP Accreditation 3
6.2.3 Documentation Maintained by Accrediting Authorities 4
6.3 APPLICATION FOR NELAP RECOGNITION 5
6.3.1 Written Application for NELAP Recognition 5
6.3.2 Application Completeness Review by NELAP 7
6.3.3 Application Technical Review by a NELAP Assessment Team 7
6.3.3.1 Required Technical Elements of a NELAP-Recognized Accrediting Authority's
Program 8
6.3.3.1.1 Records 10
6.3.3.1.2 Use of Contractors by an Accrediting Authority 10
6.3.3.1.3 Accrediting Authority's Quality System 10
6.3.3.2 Application Technical Review Report 11
6.3.4 Notification of Changes to An Accrediting Authority's Program 12
6.4 ON-SITE ASSESSMENT OF THE ACCREDITING AUTHORITY 13
6.4.1 Scheduling the On-Site Assessments 13
6.4.2 Conducting the On-Site Assessment 13
6.4.3 On-Site Assessment Reports 14
6.5 ACCREDITING AUTHORITY'S REQUEST FOR EXTENSION OF TIME TO COMPLY WITH
THE NELAC STANDARDS 16
6.6 NELAP ASSESSMENT TEAM RECOMMENDATIONS TO THE NELAP DIRECTOR 16
6.7 CERTIFICATE OF RECOGNITION TO THE ACCREDITING AUTHORITY 17
6.8 USE OF ACCREDITATION BY NELAP ACCREDITED LABORATORIES 18
6.9 REQUIREMENTS OF THE NELAP 18
6.9.1 NELAP Assessment Team 19
6.10 APPEALING DECISIONS TO DENY OR REVOKE NELAP RECOGNITION 20
FIGURE 1. Flow Chart for NELAP Recognition of an Accrediting Authority 21
6.0 ACCREDITING AUTHORITY
6.1 INTRODUCTION
The standards in this Chapter define the process and criteria that will be used by the National Environmental
Laboratory Accreditation Program (NELAP) to determine whether accrediting authorities applying for NELAP
recognition meet the standards required for such recognition.
Chapter six is structured so that the requirements of the International Organization for Standardization/the
International Electrotechnical Commission (ISO/IEC) Guide 58, Calibration and testing laboratory
accreditation systems - General requirements for operation and recognition, 1993, are incorporated into the
requirements for an accrediting authority to be NELAP-recognized.
Chapter six addresses most of the requirements of ISO/IEC Guide 58. All NELAP-recognized accrediting
authorities are required to administer an environmental laboratory accreditation program that meets the
requirements contained in the National Environmental Laboratory Accreditation Conference (NELAC)
standards, Chapter six. Those ISO/IEC Guide 58 requirements not addressed in Chapter six are addressed
in the NELAC standards, Chapters two through five. Since Chapter six requires an accrediting authority to
administer an environmental laboratory accreditation program that requires laboratories to meet the
standards set forth in the NELAC standards, Chapters two through six, all the requirements of ISO/IEC Guide
58 will be met by a NELAP-recognized accrediting authority. In most cases, the ISO/IEC requirements
contained in Chapter six or elsewhere in the NELAC standards are not direct quotations from the ISO/IEC
guidance document.
6.2 GENERAL PROVISIONS
a) In all cases, accrediting authorities are governmental organizations at the territory, state or federal
levels.
b) A territorial, state or federal entity shall designate the appropriate agencies or departments as its
designated NELAP-recognized accrediting authorities for the fields of testing for which NELAP
recognition is being sought.
c) A NELAP-recognized accrediting authority shall not delegate authority for granting, maintaining,
suspending or revoking a laboratory's NELAP accreditation to an outside person or body. Portions of
the accreditation process may be contracted out when the accrediting authority follows the provisions
of subsections 6.3.3.1.2 and 6.3.3.1.3 (b)(3); however, the authority to grant, maintain, suspend or
revoke NELAP accreditation must remain with the accrediting authority.
d) The procedures under which a NELAP-recognized accrediting authority operates shall be administered
in an impartial and non-discriminatory manner. The accrediting authority also shall require accredited
laboratories to maintain impartiality and integrity. An accrediting authority shall have no rules,
regulations, procedures or practices that:
1) restrict the size, large or small, of any laboratory seeking accreditation;
2) require membership or participation in any laboratory or other professional association;
3) impose any financial conditions or restrictions for participation in the accreditation program other
than the fees authorized by territorial, state or federal law; and
4) conflict with any territorial, state or federal laws governing discrimination.
e) Accrediting authorities and their contractors shall confine their requirements, assessments and decision
making processes for a NELAP accredited laboratory to those matters specifically related to the fields
of testing of the NELAP accreditation being sought by a laboratory.
f) If the NELAP insignia is used on general literature such as brochures, letterheads and business cards,
a NELAP-recognized accrediting authority shall accompany the display of the NELAP insignia with at
least the phrase "NELAP-recognized".
g) Accrediting authorities, within the scope and applicability of their prevailing rules and regulations, shall
establish one or more technical committees for assistance in interpretation of requirements and for
advising the accrediting authority on the technical matters relating to the operation of its environmental
laboratory accreditation program. When such committees are established, the accrediting authority shall
have
1) formal rules and structures for the appointment and operation of committees involved in the
accreditation process and such committees shall be free from any commercial, financial, and other
pressures that might influence decisions, or
2) a structure where committee members are chosen to provide relevant competent technical support
and impartiality through a balance of interests where no single interest predominates, and
3) a mechanism for publishing interpretations and recommendations made by these committees.
h) Unless the contrary is clearly indicated, all references in this Chapter to singular nouns include the plural
noun, and all references to plural nouns include the singular, for example, "area of responsibility" also
includes multiple "areas of responsibility."
6.2.1 Reciprocity
a) Except as noted in this subsection, NELAP-recognized secondary accrediting authorities shall grant
accreditation to laboratories accredited by any other NELAP-recognized primary accrediting authority.
Such reciprocal NELAP accreditation shall be granted on a laboratory-by-laboratory basis. The NELAP-
recognized secondary accrediting authority shall consider only the current certificate of accreditation
issued by the NELAP-recognized primary accrediting authority.
b) When granting reciprocal accreditation to a laboratory, the NELAP-recognized secondary accrediting
authority shall:
1) grant reciprocal accreditation for only the fields of testing, methods and analytes for which the
laboratory holds current primary NELAP accreditation, and
2) grant reciprocal accreditation and issue certificates, as required in NELAC, Chapter four, to an
applicant laboratory within 30 calendar days of receipt of the laboratory's application.
c) All fees shall be paid by laboratories as required by the NELAP-recognized secondary accrediting
authority.
d) LaboratoriesseekingNELAPaccreditationbyaNELAP-recognizedsecondaryaccreditingauthorityshall
not be required to meet any additional proficiency testing, quality assurance, or on-site assessment
requirements for the fields of testing for which the laboratory holds primary NELAP accreditation.
e) If a NELAP-recognized secondary accrediting authority notes any potential nonconformance with the
NELAC standards by a laboratory during the initial application process for reciprocal accreditation, or
for a laboratory that already has been granted NELAP accreditation through reciprocity, the NELAP-
recognized secondary accrediting authority shall immediately notify, in writing, the applicable NELAP-
recognized primary accrediting authority and the laboratory. However, the laboratory is to be notified
only in situations where no administrative or judicial prosecution is contemplated. The notification must
cite the applicable sections within the NELAC standards for which nonconformance by the laboratory has
been noted.
1) If the alleged nonconformance is noted during the initial application process for reciprocal NELAP
accreditation, final action on the application for reciprocal NELAP accreditation shall not be taken
until the alleged nonconformance issue has been resolved, or
2) If the alleged nonconformance is noted after reciprocal NELAP accreditation has been granted, the
laboratory shall maintain its current NELAP accreditation status until the alleged nonconformance
issue has been resolved.
f) Upon receipt of the subsection 6.2.1 (e) notification, the NELAP-recognized primary accrediting authority
shall:
1) Review and investigate the alleged nonconformance,
2) Take appropriate action on the laboratory as set forth by the NELAC standards, including the
addition of any change of accreditation status in the National Environmental Laboratory
Accreditation Database. All such actions shall be taken in accordance with the laboratory's right to
due process as set forth in the NELAC standards, Chapter four, Accreditation Process,
3) Respond to the NELAP-recognized secondary accrediting authority, in writing, with a copy to the
NELAP Director, within 20 calendar days of receipt of the subsection 6.2.1 (e) notification providing:
A) an initial report of the findings;
B) a description of the actions to be taken; and
C) a schedule for implementation of further action on the alleged nonconformance, if necessary.
g) If, in the opinion of the secondary accrediting authority, the primary accrediting authority does not take
timely and appropriate action on the complaint, the secondary accrediting authority should notify the
NELAP Director of the dispute between the two accrediting authorities regarding proper disposition of
the complaint. Within 20 calendar days of receipt of such notification, the NELAP Director shall review
the alleged nonconformance and take appropriate action according to the standards set forth in this
Chapter.
6.2.2 Where to Apply for NELAP Accreditation
a) Laboratories that are NELAP accredited by an accrediting authority that has lost NELAP recognition may
seek NELAP accreditation through any NELAP-recognized accrediting authority. The laboratory's
NELAP accreditation shall remain valid for the term of its current certificate of accreditation.
b) Except for governmental laboratories as noted in subsection 6.2.2(d) below, all laboratories seeking
NELAP accreditation or renewal of NELAP accreditation must apply for such accreditation through their
home state (the state in which the laboratory facility is located) accrediting authority.
c) Laboratories located in a territory or other state that is not NELAP-recognized may seek NELAP
accreditation through any NELAP-recognized accrediting authority.
d) Governmental laboratories that are organizational units of the same department or agency in which the
accrediting authority is located or have other institutional conflicts of interest shall:
1) demonstrate by organizational structure that the laboratory's Technical Director and the
environmental laboratory accreditation program manager do not report within the same chain-of-
command; and
2) demonstrate by policies and procedures that conflicts-of-interest, actual or potential, do not exist;
or
3) apply for NELAP accreditation through any other NELAP-recognized accrediting authority.
e) In order that all laboratory applications for NELAP accreditation are treated equally, accrediting
authorities shall initiate processing of applications for NELAP accreditation in the chronological order
in which the applications are received.
6.2.3 Documentation Maintained by Accrediting Authorities
a) The accrediting authority shall provide through publication, electronic media or other means a document
or documents describing its environmental laboratory accreditation program.
1) The document or documents shall include the following:
A) information setting forth the authority of the accrediting authority to grant laboratory
accreditations and whether such laboratory accreditation is mandatory or voluntary;
B) information setting forth the accrediting authority's requirements for an environmental laboratory
to become accredited;
C) information stating the requirements for granting, maintaining, withdrawing, suspending or
revoking laboratory accreditation;
D) information about the laboratory accreditation process;
E) information on fees charged to applicants and accredited laboratories;
F) information regarding the rights and duties of accredited laboratories; and
G) information listing its NELAP accredited laboratories describing the NELAP accreditation
granted.
2) The document or documents shall be reviewed annually. A written record of this review must be
available for inspection by the NELAP assessment team.
b) When the document or documents reviewed in subsection 6.2.3(a)(2) above reveals that the accrediting
authority's environmental laboratory accreditation program has changed or is otherwise different from
the accreditation program described in such documents, the document or documents shall be updated
within 30 calendar days of the review.
c) The document or documents described in subsection 6.2.3(a)(1) above shall be made readily available
upon request.
d) The accrediting authority shall have arrangements, consistent with NELAC, Chapter three, On-Site
Assessment, to safeguard information claimed by the laboratories as confidential.
6.3 APPLICATION FOR NELAP RECOGNITION
This section describes the process by which accrediting authorities may apply for NELAP recognition and
the procedures that NELAP will use to review the applications.
6.3.1 Written Application for NELAP Recognition
a) Each accrediting authority requesting initial NELAP recognition shall complete an application and supply
all supporting documentation. Applications can be obtained from the Office of the NELAP Director,
USEPA.
b) The application shall request information that is essential for the NELAP to evaluate an accrediting
authority's environmental laboratory accreditation program. When documentation is required, copies
of the applicable statutes, rules, regulations, policy statements, standard operating procedures, guidance
documents, etc. must be submitted along with a clear citation of where the required information is found
in the documents. The application will request the following information and documentation from the
accrediting authority:
1) the name, mailing address, telephone number, electronic mail address and telefacsimile number
of the accrediting authority;
2) the statutes and regulations establishing and governing the accrediting authority's environmental
laboratory accreditation program as required in subsection 6.3.3.1 (b) and (c);
3) the policies, guidance documents, promulgating instructions and standard operating procedures
governing the operation of the accrediting authority's environmental laboratory accreditation program
as set forth in subsection 6.3.3.1;
4) the accrediting authority's arrangements for liability insurance and workman's compensation
assurance coverage as required in subsection 6.3.3.1 (d);
5) the requirements governing how the accrediting authority restricts the use of its accreditation by
accredited laboratories as required in Section 6.8;
6) the fields of testing for which the accrediting authority is requesting NELAP recognition;
7) the name and title of the primary person responsible for the day-to-day management of the
accrediting authority's environmental laboratory accreditation program as required in subsection
6.3.3.1 (h);
8) the names, education and experience levels of the accrediting authority's environmental laboratory
accreditation program's management and technical staff as required in subsection 6.3.3.1 (f), (g) and
(h);
9) the names and contractual agreements for any external assessment bodies used by the accrediting
authority as required in subsection 6.3.3.1.2 and 6.3.3.1.3 (b)(3);
10) the names, areas of responsibility, education and experience levels of all technical and assessment
employees of any external assessment bodies used by the accrediting authority as required in
subsection 6.3.3.1.2 and 6.3.3.1.3 (b)(3);
11) RESERVED
12) a description of the accrediting authority's environmental laboratory accreditation program quality
systems (e.g., a quality systems manual or a quality assurance plan) as required in subsection
6.3.3.1.3;
13) the procedures for the selecting, training, contracting and appointing of the accrediting authority's
laboratory assessors as required in subsection 6.3.3.1 (f) and (g);
14) a description of the accrediting authority's conflict-of-interest disclosure program as required in
subsection 6.3.3.1 (i);
15) a tabular listing of all laboratories applying for accreditation in the two-year period immediately
preceding the date of the application. The table shall set forth the date on which the laboratory's
application for accreditation was received by the accrediting authority and the date on which final
action on the application was taken.
16) the policies and procedures used by the accrediting authority for establishing and maintaining
records on each accredited laboratory and procedures for record access and retention as required
in subsection 6.3.3.1.1;
17) the accrediting authority's findings, reports and corrective actions from internal audits conducted in
the last two years as required in subsection 6.3.3.1 (j) and 6.3.3.1.3 (b)(4);
18) a certification that the accrediting authority meets the provisions of Section 6.2 of this Chapter;
19) the name and job title of the individual or individuals authorized to sign accreditation certificates; and
20) the standardized checklist required by subsection 6.3.2 (c)(1) is to be completed by the applicant
accrediting authority citing the location in the application or supporting documents where the
checklist information is provided.
c) The application must be signed and dated by the highest ranking individual within the department or
agency responsible for laboratory accreditation activities for which NELAP recognition is being sought.
By signature on the application, this individual must attest to the validity of the information contained
within the application and its supporting documents.
d) The accrediting authority shall submit a renewal application to the NELAP every two years to maintain
NELAP recognition.
1) The NELAP shall send to the accrediting authority, by certified mail or some other verifiable means,
an application for renewal of NELAP recognition no later than 180 calendar days prior to the
expiration of the accrediting authority's then-current NELAP recognition.
This notification of renewal shall indicate whether an on-site assessment is due as set forth in
subsection 6.4 (a).
2) The accrediting authority must address each requirement of subsection 6.3.1 (b); however, it must
submit information and documentation only of changes from the accrediting authority's most recent
NELAP-recognized environmental laboratory accreditation program.
3) The accrediting authority must submit the completed renewal application and supporting documents
to the NELAP within 30 calendar days of receiving the renewal notification.
6.3.2 Application Completeness Review by NELAP
a) The NELAP is required to provide notices required by this Chapter only to those accrediting authorities
who have submitted an initial application for NELAP recognition or who hold NELAP recognition.
b) If the NELAP does not receive a completed renewal application as specified in subsection 6.3.1 (d)(3),
the accrediting authority shall be notified in writing. If the accrediting authority does not submit the
completed application within 20 calendar days of receipt of this notification from the NELAP, the
accrediting authority's NELAP recognition will not be renewed upon expiration of its current NELAP
recognition.
c) Following receipt of an initial or a renewal application, the NELAP must complete a review of the
application and supporting documents to determine that information and supporting documentation
required in subsection 6.3.1 (b) is included with the submittal.
1) The completeness review of the application and supporting documents shall be conducted using a
standardized checklist provided by the NELAP as part of the application. The checklist shall be
designed to assist the applicant in gathering all the information needed to complete the application
and include a place to note the date the completeness review was completed.
2) The NELAP must notify the accrediting authority in writing, within 20 calendar days of receiving the
application, of any additional information needed to complete the application.
3) The accrediting authority must provide any additional information or clarification requested in writing
within 20 calendar days of receipt of the 6.3.2(c)(2) notification.
A) The NELAP may grant extensions to the 20-day time period for up to an additional 20 calendar
days if the accrediting authority requests the extension in writing.
B) The NELAP shall notify the accrediting authority in writing when an extension is granted.
4) Written notification to the accrediting authority that an application is complete shall be furnished by
the NELAP within seven calendar days of the date of such determination.
6.3.3 Application Technical Review by a NELAP Assessment Team
a) Within 30 calendar days of the determination that the application is complete, the NELAP assessment
team as established in subsection 6.9.1 will perform a technical review of the application and its
supporting documents and respond in writing to the accrediting authority.
1) The review shall be conducted in accordance with the NELAP standard operating procedures for
application review; and
2) The review shall be performed by the same NELAP assessment team assigned to conduct the on-
site assessment.
3) In the years when no on-site assessment is required, as provided in subsection 6.4 (a)(2), the
NELAP Director shall endeavor to appoint the same NELAP assessment team that conducted the
application technical review and on-site assessment for the accrediting authority's immediately
preceding application cycle.
4) The NELAP Director shall appoint a different NELAP assessment team for each succeeding four-
year NELAP on-site assessment cycle as set forth in Section 6.4 (a) of this Chapter. New four-year
NELAP on-site assessment cycles shall start with each renewal application when an on-site
assessment of the accrediting authority is required.
b) The NELAP assessment team will review the application and supporting documents to evaluate whether
the accrediting authority's environmental laboratory accreditation program requires its accredited
laboratories to meet the standards set forth by the NELAC standards, Chapter two, Proficiency Testing,
Chapter three, On-Site Assessment, Chapter four, Accreditation Process and Chapter five, Quality
Systems.
c) Should the NELAP assessment team have questions or need additional application information to
determine the accrediting authority's compliance with this Chapter, the NELAP assessment team must
seek additional application information and documentation from the accrediting authority.
6.3.3.1 Required Technical Elements of a NELAP-Recognized Accrediting Authority's Program
a) The NELAP assessment team will review the application and supporting documentation to ensure that
the accrediting authority's environmental laboratory accreditation program meets the requirements of
subsections (b) through (n) below.
b) The accrediting authority shall be a legally identifiable governmental entity;
c) The accrediting authority shall have the authority, rights and responsibilities necessary to carry out an
environmental laboratory accreditation program;
d) The accrediting authority shall have the same arrangements to cover liabilities and workman's
compensation claims arising from its operations and activities as all other programs, units, divisions,
bureaus, etc. in the department or agency in which the accrediting authority is located;
e) The accrediting authority shall have financial stability and the physical and human resources required
for the operation of an accrediting authority's laboratory accreditation program. The accrediting authority
shall have and make available on request a description of the means by which it receives its financial
support. As a benchmark, the accrediting authority shall have the resources necessary to complete
action on a laboratory's application within nine months from the time a completed application is first
received from the laboratory. This time period applies as long as all turn-around times for responses
to application review, proficiency testing and on-site assessment issues are carried out within the
required time limits set forth in the NELAC standards.
f) The accrediting authority shall appoint and maintain records on assessors, including contractual
assessors, who meet the education, experience and training requirements set forth in the NELAC
standards, Chapter three, On-Site Assessment. Such records shall include:
1) name and address;
2) organization affiliation and position held;
3) educational qualification and professional status;
4) work experience;
5) training applicable to laboratory accreditation;
6) experience in laboratory assessment, together with field of competence; and
7) date of most recent updating of record.
g) The accrediting authority shall have a system in place to evaluate assessor performance that is
consistent with the organizational employee evaluation program and demonstrates compliance with the
NELAC standards, Chapter three, On-Site Assessment;
h) The accrediting authority shall identify one individual responsible for day-to-day management of the
accrediting authority's environmental laboratory accreditation program. This individual must:
1) be an employee of the accrediting authority, and
2) have the technical expertise necessary to:
A) plan and manage the laboratory accreditation program,
B) coordinate various facets of the laboratory accreditation program with other territory, state and
federal accrediting authorities,
C) coordinate development of environmental laboratory accreditation regulations, and
D) evaluate the technical competence and performance of contractors or employees.
i) The accrediting authority shall have arrangements to ensure that the accrediting authority's management
and technical staff are free of any commercial, financial or other pressures that influence the results of
the accreditation process and are subject to the same conflict of interest disclosure requirements
designed to identify and eliminate potential conflict-of- interest problems as all other programs, units,
divisions, bureaus etc. in the department or agency in which the accrediting authority is located;
j) The accrediting authority shall have a documented procedure in place to conduct systematic internal
audits annually of the accrediting authority's environmental laboratory accreditation program to verify
compliance with the NELAC standards. One element of the annual internal audit shall be to review the
effectiveness of the quality systems required in subsection 6.3.3.1.3. When applicable, the accrediting
authority shall use the same policies and procedures for internal audits as used by all other programs,
units, divisions, bureaus etc. in the department or agency in which the accrediting authority is located;
k) The accrediting authority shall designate the individual specified in subsection 6.3.3.1 (h) or an individual
who reports directly to the individual responsible for day-to-day management of the accrediting
authority's environmental laboratory accreditation program to take responsibility for the quality system
and maintenance of the quality documentation required in subsection 6.3.3.1.3;
l) The accrediting authority shall have established standard operating procedures for dealing with appeals,
complaints and disputes arising from denial, suspension or revocation of laboratory accreditation, received
from users of the services of NELAP accredited laboratories, or concerning any other matters;
m) The accrediting authority shall require NELAP-accredited laboratories to participate in a proficiency
testing program meeting the requirements of the NELAC standards, Chapter two, Proficiency Testing,
Appendix A; and
n) The accrediting authority or its contractors shall not offer consultancy or other services which may
compromise the objectivity or impartiality of its accreditation process and decisions.
6.3.3.1.1 Records
a) The accrediting authority shall have arrangements to establish and maintain records for each accredited
laboratory with respect to all aspects of the laboratory's accreditation process.
b) The accrediting authority shall have a policy and procedure for retaining NELAP accreditation records
for a minimum of ten years or a longer period of time if required by contractual obligations or pertinent
territorial, state or federal laws and regulations.
c) The accrediting authority shall have a policy and procedures concerning access to records as prescribed
by the territorial, state or federal entity in which the accrediting authority resides.
6.3.3.1.2 Use of Contractors by an Accrediting Authority
a) The accrediting authority shall have arrangements to ensure and require by signed contract or other
similar type of binding document that all laboratory accreditation functions performed by a contractor on
behalf of the accrediting authority are carried out in compliance with the NELAC standards.
b) When laboratory accreditation functions are contracted out, the accrediting authority shall:
1) take full responsibility for such contracted work,
2) ensure that the contractor and their employees are competent and comply with the applicable
provisions of the NELAC standards,
3) ensure that the contractor and their employees comply with the confidentiality requirements of the
accrediting authority and NELAC, and
4) ensure that the contractor and their employees are not directly involved with:
A) the laboratory seeking NELAP accreditation from the accrediting authority employing the
contractor; or
B) any other affiliations which would compromise impartiality in the NELAP laboratory accreditation
process.
6.3.3.1.3 Accrediting Authority's Quality System
a) The accrediting authority shall have a quality system appropriate to the type, range and volume of work
performed by the accrediting authority.
b) The quality system shall be documented in a quality manual and associated written quality procedures
and shall be made available for use by the staff. The quality manual shall include at least the following:
1) the quality policy statement, including objectives and commitments, signed by the manager
responsible for day-to-day management of the accrediting authority's environmental laboratory
accreditation program;
2) the organizational structure of the accrediting authority's environmental laboratory accreditation
program and the responsibilities of individual staff assigned to the structure;
3) the policies and procedures for acquiring, training, supervising and evaluating the performance of
contractors carrying out any part of the accrediting authority's laboratory accreditation program;
4) the arrangements for annual internal audits, including Quality System reviews, as required in
subsection 6.3.3.1 (j);
5) the system for providing feedback to personnel responsible for the area audited and for taking timely
and appropriate corrective actions whenever discrepancies are detected;
6) the procedures established to address conflict-of-interest questions arising from the NELAC
standards as set forth in subsection 6.2.2 (d)(2) and for the accrediting authority's management and
technical staff as set forth in subsection 6.3.3.1 (i);
7) the policies and procedures established to maintain document control for documents required by the
NELAC standards;
8) the policies and procedures to implement the accreditation process; and
9) the policies and procedures for dealing with appeals, complaints and disputes by laboratories.
6.3.3.2 Application Technical Review Report
a) The NELAP assessment team will accept an initial application and its supporting documentation for
continued processing that contains sufficient information to determine that an accrediting authority meets
the requirements of the NELAC standards for designation as a NELAP-recognized accrediting authority.
When the NELAP assessment team completes its review of an initial application and notes no
deficiencies, the NELAP assessment team will schedule the on-site assessment as set forth in
subsection 6.4.1 below.
b) The NELAP assessment team will accept a renewal application and its supporting documentation for
continued processing that contains sufficient information to determine that an accrediting authority meets
the requirements of the NELAC standards for designation as a NELAP-recognized accrediting authority.
When the NELAP assessment team completes its review of a renewal application and denotes no
deficiencies, the NELAP assessment team will recommend to the NELAP Director that NELAP
recognition be maintained.
c) Except as noted in Section 6.5, the NELAP assessment team will not accept the application for
continued processing if it notes deficiencies. The NELAP assessment team will send by certified mail
an application technical review report to the accrediting authority. The report will:
1) identify any specific deficiencies noted during the application technical review,
2) include references to the specific NELAC standards, and
3) provide suggested corrective action.
d) To proceed with the review process, the accrediting authority shall respond with written corrective actions
within 30 calendar days of receipt of the NELAP assessment team's subsection 6.3.3.2(c) notification.
The NELAP assessment team will review the corrective actions within 30 calendar days of receipt of the
accrediting authority's response. Alternately, the accrediting authority has the option to withdraw all or
part of its NELAP recognition request.
1) If the corrective actions submitted by the accrediting authority do not meet the requirements of this
Chapter, the NELAP assessment team will notify the accrediting authority that it must submit
additional corrective actions within 20 calendar days of receipt of the NELAP assessment team's
response. The NELAP assessment team will review the accrediting authority's second corrective
action response within 20 calendar days of receipt.
2) If the second corrective action response submitted by the accrediting authority does not address
satisfactorily all of the application deficiencies, the NELAP assessment team will make no further
suggestions to the accrediting authority for correction of application deficiencies.
3) If application deficiencies still remain after the assessment team's second attempt to resolve those
deficiencies, the NELAP assessment team will document those deficiencies which are not resolved
and recommend to the NELAP Director that:
A) the accrediting authority's application for initial NELAP recognition be denied; or
B) the accrediting authority's NELAP recognition be revoked.
e) If the initial application as submitted contained no deficiencies or if deficiencies were corrected as
provided in subsection 6.3.3.2(d), except those deficiencies requiring legislative or rulemaking action
as set forth in Section 6.5, the NELAP assessment team will schedule the on-site assessment as set
forth in subsection 6.4.1 below.
f) If an accrediting authority elects to appeal a denial or revocation of NELAP recognition resulting from
the Section 6.3.3 application technical review process, it must follow the procedure set forth in
Section 6.10 of this Chapter.
g) After review of the renewal NELAP-recognition application and supporting documents, the NELAP
assessment team will schedule an on-site assessment of the accrediting authority's environmental
laboratory accreditation program as set forth in Section 6.4 (a) and subsection 6.4.1 (a) below.
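The corrective-action timeline of subsection 6.3.3.2(d) can be laid out as a simple calendar calculation. The sketch below is informative only and is not part of the NELAC standards; the function and key names are illustrative assumptions, and it assumes each response arrives exactly on its due date. Only the 30- and 20-calendar-day limits come from the text above.

    from datetime import date, timedelta

    def review_deadlines(report_received: date) -> dict:
        """Calendar-day deadlines that follow receipt of the application
        technical review report (subsection 6.3.3.2(c))."""
        first_response_due = report_received + timedelta(days=30)     # 6.3.3.2(d)
        # Assumes the first corrective-action response arrives on its due date.
        first_review_due = first_response_due + timedelta(days=30)    # 6.3.3.2(d)
        second_response_due = first_review_due + timedelta(days=20)   # 6.3.3.2(d)(1)
        second_review_due = second_response_due + timedelta(days=20)  # 6.3.3.2(d)(1)
        return {
            "first corrective-action response due": first_response_due,
            "review of first response due": first_review_due,
            "second corrective-action response due": second_response_due,
            "review of second response due": second_review_due,
        }

    for step, due in review_deadlines(date(1998, 7, 2)).items():
        print(step, due.isoformat())

If the second review still leaves unresolved deficiencies, subsection 6.3.3.2(d)(3) applies and the matter moves to a recommendation to the NELAP Director rather than a further exchange of correspondence.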
6.3.4 Notification of Changes to An Accrediting Authority's Program
a) The accrediting authority shall notify the NELAP Director of changes to any of the following elements
of its environmental laboratory accreditation program:
1) the authority to accredit laboratories as stated in the statutes, regulations and promulgating
instructions establishing and governing the accrediting authority's environmental laboratory
accreditation program,
2) the organizational structure,
3) the rules, regulations, policies, guidance documents and standard operating procedures,
4) the mailing address and office location, telephone and telefacsimile numbers and electronic mail
address, and
5) the contractual arrangements, including contractor's personnel, for laboratory accreditation activities
contracted out under authority of subsection 6.2 (c).
b) The notification to the NELAP Director shall be made within 30 calendar days of the change taking place
in the accrediting authority's environmental laboratory accreditation program.
c) The NELAP Director may request further documentation or conduct on-site assessments to verify that
changes in the accrediting authority's NELAP-recognized environmental laboratory accreditation
program do not place that program in violation of the NELAC standards.
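As an informative illustration only (not part of the standards), the 30-calendar-day notification window of subsection 6.3.4(b) reduces to a single date comparison; the function and constant names below are the editor's assumptions.

    from datetime import date, timedelta

    NOTIFICATION_WINDOW_DAYS = 30  # subsection 6.3.4(b)

    def notification_is_timely(change_date: date, notification_date: date) -> bool:
        """True if the NELAP Director was notified within 30 calendar days of the
        change taking place in the accrediting authority's program."""
        return notification_date <= change_date + timedelta(days=NOTIFICATION_WINDOW_DAYS)

    print(notification_is_timely(date(1998, 8, 1), date(1998, 8, 28)))  # True
    print(notification_is_timely(date(1998, 8, 1), date(1998, 9, 15)))  # False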
6.4 ON-SITE ASSESSMENT OF THE ACCREDITING AUTHORITY
a) On-site assessments of an accrediting authority's environmental laboratory accreditation program shall
be conducted on a four-year cycle as follows:
1) An initial on-site assessment shall be conducted in conjunction with an accrediting authority's initial
application process and every four years thereafter; and
2) No on-site assessment of an accrediting authority's environmental laboratory accreditation program
is required for the two-year renewal application immediately following an application for NELAP
recognition where an on-site assessment was conducted.
b) The NELAP assessment team will arrange on-site assessments except as stated in subsection 6.4(c)
below at the mutual convenience of the parties.
c) The NELAP assessment team may make subsequent announced or unannounced on-site assessments
of an accrediting authority's environmental laboratory accreditation program whenever such an
assessment is necessary to determine the accrediting authority's compliance with the requirements of
the NELAC standards.
6.4.1 Scheduling the On-Site Assessments
a) The NELAP assessment team shall contact the accrediting authority to schedule on-site assessments
as set forth in Section 6.4 (a) above within 20 calendar days of the date the NELAP assessment team
accepts an initial or renewal application.
b) The NELAP assessment team must send to the accrediting authority written confirmation of the logistics
required to conduct the on-site assessment. The written confirmation shall include, but is not limited to:
1) on-site assessment date and agenda or schedule of activities,
2) copies of the standardized assessment checklists,
3) the names, titles, affiliations, and on-site assessment responsibilities of the NELAP assessment
team members, and
4) the names and titles of all accrediting authority staff that need to be available during the on-site
assessment.
c) All on-site assessments shall be conducted no later than 50 calendar days following approval of the
application.
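The scheduling limits of subsection 6.4.1 can likewise be expressed as dates. The sketch below is informative only; it assumes, for simplicity, that the application acceptance date of 6.4.1(a) and the application approval date of 6.4.1(c) fall on the same day, and the names used are illustrative.

    from datetime import date, timedelta

    def scheduling_deadlines(application_accepted: date) -> dict:
        return {
            # 6.4.1(a): contact the accrediting authority within 20 calendar days
            "contact accrediting authority by": application_accepted + timedelta(days=20),
            # 6.4.1(c): conduct the on-site assessment no later than 50 calendar days
            "conduct on-site assessment by": application_accepted + timedelta(days=50),
        }

    # 6.4.1(b): the written confirmation must include, at a minimum, these items.
    CONFIRMATION_CONTENTS = (
        "assessment date and agenda or schedule of activities",
        "copies of the standardized assessment checklists",
        "names, titles, affiliations and responsibilities of team members",
        "names and titles of accrediting authority staff who must be available",
    )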
6.4.2 Conducting the On-Site Assessment
a) The purpose of the on-site assessment is to verify compliance with the requirements of the NELAC
standards including, but not limited to:
1) determining the accuracy of information contained in the accrediting authority's application and
supporting documents;
2) determining whether the accrediting authority's implementation of its environmental laboratory
accreditation program conforms with the information and data contained in the application and
supporting documents; and
3) observing, upon recommendation of the NELAP assessment team and the approval of the NELAP
Director, an accrediting authority's laboratory assessor(s) conducting an on-site assessment of a
laboratory seeking initial or renewal NELAP accreditation. The NELAP assessment team members
shall not participate in the laboratory's assessment.
b) When conducting an on-site assessment, the NELAP assessment team shall, at a minimum:
1) review the accrediting authority's record keeping and documentation procedures;
2) conduct interviews with the accrediting authority's management and technical staff;
3) review selected laboratory accreditation cases;
4) review records of laboratory complaints, disputes and appeals; and
5) review quality assurance and internal audit procedures employed by the accrediting authority.
c) The NELAP assessment team shall have access to all records of the accrediting authority's
environmental laboratory accreditation program to determine compliance with the NELAC standards.
d) The NELAP assessment team shall have the opportunity to interview privately:
1) all management and technical staff of the accrediting authority's environmental laboratory
accreditation program; and
2) any NELAP-accredited laboratory receiving its accreditation from the applicant accrediting authority.
e) The NELAP assessment team must ensure that the assessment is conducted according to the schedule
as set forth in subsection 6.4.1 (b)(1) and consists of the following:
1) an opening meeting,
2) the physical assessment of the accrediting authority's environmental laboratory accreditation
program, and
3) an exit interview to discuss all noted deficiencies.
f) The NELAP assessment team shall conduct all assessments in accordance with the NELAP standard
operating procedure for conducting on-site assessments of accrediting authorities.
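For purposes of illustration only, the minimum activities of subsection 6.4.2(b) and the required meeting structure of 6.4.2(e) can be treated as a completeness check; the names below are assumptions and the list entries paraphrase the text above.

    REQUIRED_REVIEW_ACTIVITIES = {
        "record keeping and documentation procedures",
        "interviews with management and technical staff",
        "selected laboratory accreditation cases",
        "records of laboratory complaints, disputes and appeals",
        "quality assurance and internal audit procedures",
    }

    REQUIRED_MEETINGS = ("opening meeting", "physical assessment", "exit interview")

    def assessment_is_complete(activities_done: set, meetings_held: tuple) -> bool:
        """True only if every minimum 6.4.2(b) activity and every 6.4.2(e) meeting occurred."""
        return (REQUIRED_REVIEW_ACTIVITIES <= activities_done
                and all(m in meetings_held for m in REQUIRED_MEETINGS))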
6.4.3 On-Site Assessment Reports
a) The NELAP assessment team will send by certified mail to the accrediting authority an on-site
assessment report within 30 calendar days of completion of the on-site assessment. The report shall
include, but is not limited to:
1) the date(s) of assessment;
2) the name(s) of the person(s) responsible for the report;
3) the NELAP recognition fields of testing being applied for; and
4) the comments of the NELAP assessment team on the accrediting authority's compliance with the
requirements of the NELAC standards.
b) If the on-site assessment does not reveal any deficiencies, the NELAP assessment team shall
recommend to the NELAP Director that the accrediting authority be granted or maintain NELAP
recognition.
c) If deficiencies are noted during the on-site assessment, the report will:
1) identify any specific deficiencies noted during the on-site assessment,
2) include references to the specific NELAC standards, and
3) provide suggested corrective action.
d) If the on-site assessment reveals deficiencies, the accrediting authority shall submit a plan of corrective
action to the NELAP assessment team within 30 calendar days of receipt of the on-site assessment
report.
1) The plan of corrective action must detail those specific actions taken or that will be taken by the
accrediting authority to correct all deficiencies noted by the NELAP assessment team during the on-
site assessment.
2) The plan of corrective action must include the accrediting authority's projected schedule for
completing any corrective actions not yet completed at the time of its response to the on-site
assessment report.
3) Except for those deficiencies set forth in Section 6.5, the implementation of corrective actions must
take place no more than 65 calendar days from receipt of the on-site assessment report.
e) The NELAP assessment team shall recommend to the NELAP Director revocation or denial of NELAP
recognition for on-site assessment deficiencies for any accrediting authority that fails to submit a plan
of corrective action within 30 calendar days as set forth in subsection 6.4.3(d) above.
f) Within 20 calendar days of receipt of the accrediting authority's plan of corrective actions, the NELAP
assessment team shall review the plan and respond in writing to the accrediting authority.
1) If the accrediting authority corrects all deficiencies, the NELAP assessment team shall recommend
to the NELAP Director that the accrediting authority be granted or maintain NELAP recognition.
2) If the accrediting authority's plan of corrective actions does not address all deficiencies, the NELAP
assessment team will notify the accrediting authority by certified mail that it must submit another
plan of corrective actions for the remaining deficiencies not covered by Section 6.5 within 20
calendar days of the accrediting authority's receipt of this notification.
g) The NELAP assessment team shall review the corrective actions for the remaining deficiencies within
20 calendar days of receipt of a subsection 6.4.3(f)(2) response from the accrediting authority.
1) If all deficiencies are not corrected and the remaining deficiencies affect only certain fields of testing,
the NELAP assessment team shall recommend to the NELAP Director that the accrediting
authority's NELAP recognition be denied or revoked for those fields of testing for which on-site
assessment deficiencies remain.
2) If all deficiencies are not corrected and the remaining deficiencies affect the entire accrediting
authority's environmental laboratory accreditation program, the NELAP assessment team shall
recommend to the NELAP Director that the accrediting authority's NELAP recognition be denied or
revoked.
3) If the only remaining deficiencies require legislation or rulemaking as set forth in Section 6.5, the
NELAP assessment team shall recommend to the NELAP Director that the accrediting authority be
granted or maintain NELAP recognition.
4) If remaining deficiencies are corrected, the NELAP assessment team shall recommend to the
NELAP Director that the accrediting authority be granted or maintain NELAP recognition.
h) If the NELAP assessment team determines that the accrediting authority has falsified information
included in its application and supporting documents, the NELAP assessment team shall recommend
to the NELAP Director that the accrediting authority's NELAP recognition be denied or revoked.
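The outcomes of subsections 6.4.3(e) through 6.4.3(h) follow a small amount of decision logic. The sketch below is informative only and is not part of the standards; the enumeration and parameter names are assumptions, and treating a missing corrective-action plan as a program-wide recommendation is an interpretation of 6.4.3(e).

    from enum import Enum, auto

    class Recommendation(Enum):
        GRANT_OR_MAINTAIN = auto()
        DENY_OR_REVOKE_FIELDS = auto()   # only the affected fields of testing
        DENY_OR_REVOKE_PROGRAM = auto()  # the entire accreditation program

    def final_recommendation(falsified_information: bool,
                             no_plan_submitted: bool,
                             remaining_deficiencies: set,
                             only_legislative_or_rulemaking: bool,
                             program_wide: bool) -> Recommendation:
        if falsified_information:                    # 6.4.3(h)
            return Recommendation.DENY_OR_REVOKE_PROGRAM
        if no_plan_submitted:                        # 6.4.3(e); scope assumed program-wide
            return Recommendation.DENY_OR_REVOKE_PROGRAM
        if not remaining_deficiencies:               # 6.4.3(g)(4)
            return Recommendation.GRANT_OR_MAINTAIN
        if only_legislative_or_rulemaking:           # 6.4.3(g)(3), see Section 6.5
            return Recommendation.GRANT_OR_MAINTAIN
        if program_wide:                             # 6.4.3(g)(2)
            return Recommendation.DENY_OR_REVOKE_PROGRAM
        return Recommendation.DENY_OR_REVOKE_FIELDS  # 6.4.3(g)(1)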
6.5 ACCREDITING AUTHORITY'S REQUEST FOR EXTENSION OF TIME TO COMPLY WITH THE
NELAC STANDARDS
a) For all accrediting authorities applying for NELAP recognition prior to July 1, 2000, and upon written
request to the NELAP Director, through the NELAP assessment team, an extension of time, not to
exceed two years, to correct deficiencies noted in the accrediting authority's application and/or
deficiencies noted during the on-site assessment will be granted only:
1) when an applicant accrediting authority has an operating environmental laboratory accreditation
program for the fields of testing for which it is seeking or renewing NELAP recognition, and
2) when implementation of corrective actions to correct application and/or assessment deficiencies
requires the accrediting authority to promulgate new or revised regulations, or
3) when implementation of corrective actions to correct application and/or assessment deficiencies
requires the accrediting authority to seek new or revised legislation.
b) If the deficiencies continue to exist after two years from the date the extension was granted, the NELAP
recognition granted as set forth in subsection 6.4.3 (g)(3) above will not be renewed.
c) The accrediting authority shall include in its request for an extension of time to comply with the NELAC
standards a projected time table for correction of the application and/or assessment deficiencies.
d) For an accrediting authority seeking initial NELAP recognition on or after July 1, 2000, all NELAC
requirements must be met prior to being granted NELAP recognition.
e) Regardless of the date on which applications for renewal of NELAP recognition of an accrediting
authority are submitted, the extension of time provisions to correct deficiencies set forth in Section 6.5
of this Chapter shall remain in effect provided those deficiencies were caused by changes to the NELAC
standards.
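As an informative sketch only, the eligibility conditions of Section 6.5 reduce to the test below; the parameter names are assumptions, while the July 1, 2000 cutoff and the two-year limit come from subsections 6.5(a), (b) and (d).

    from datetime import date

    CUTOFF = date(2000, 7, 1)          # 6.5(a) and 6.5(d)
    MAX_EXTENSION_YEARS = 2            # 6.5(a) and 6.5(b)

    def extension_may_be_granted(application_date: date,
                                 has_operating_program: bool,
                                 needs_rulemaking: bool,
                                 needs_legislation: bool) -> bool:
        if application_date >= CUTOFF:  # 6.5(d): no extensions on or after July 1, 2000
            return False
        # 6.5(a): condition (1) must hold together with condition (2) or (3).
        return has_operating_program and (needs_rulemaking or needs_legislation)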
6.6 NELAP ASSESSMENT TEAM RECOMMENDATIONS TO THE NELAP DIRECTOR
a) All recommendations required by this Chapter from the NELAP assessment team to the NELAP Director
must be made in writing.
b) All NELAP assessment team recommendations to the NELAP Director shall include the following
documentation when applicable:
1) a recommendation to grant, maintain or revoke NELAP recognition in full or in part;
2) a summary of the reasons supporting the recommendation;
3) a copy of all application review letters sent to the accrediting authority and all corrective action
response letters submitted by the accrediting authority to the NELAP assessment team;
4) a copy of all on-site assessment review letters sent to the accrediting authority and all corrective
action response letters submitted by the accrediting authority; and
5) a copy of the accrediting authority's requests for extension of time to implement corrective actions
if legislative or additional rulemaking is required pursuant to Section 6.5.
c) A copy of any NELAP assessment team's recommendation to the NELAP Director also shall be
furnished to the accrediting authority.
d) Within 20 calendar days of receipt of the NELAP assessment team's recommendation, the NELAP
Director shall provide written notification to the accrediting authority of acceptance or rejection of the
NELAP assessment team's recommendation.
e) The accrediting authority has the option to appeal a revocation or denial decision regarding NELAP
recognition by the NELAP Director as set forth in Section 6.10 of this Chapter.
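The documentation package of subsection 6.6(b) can be pictured as a record with one field per required item. The data structure below is illustrative only; the field names are assumptions, and item (b)(5) is optional because it applies only when Section 6.5 is invoked.

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class AssessmentTeamRecommendation:
        recommendation: str                          # (b)(1): grant, maintain or revoke, in full or in part
        supporting_summary: str                      # (b)(2): reasons supporting the recommendation
        application_review_letters: List[str] = field(default_factory=list)        # (b)(3)
        application_corrective_responses: List[str] = field(default_factory=list)  # (b)(3)
        onsite_review_letters: List[str] = field(default_factory=list)             # (b)(4)
        onsite_corrective_responses: List[str] = field(default_factory=list)       # (b)(4)
        extension_requests: Optional[List[str]] = None                             # (b)(5), Section 6.5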
6.7 CERTIFICATE OF RECOGNITION TO THE ACCREDITING AUTHORITY
a) The NELAP Director will issue a certificate of NELAP recognition dated the day on which NELAP
recognition is granted.
b) The certificate of NELAP recognition shall include the following items:
1) the name and address of the accrediting authority,
2) the fields of testing for which the accrediting authority is NELAP-recognized,
3) the date of the accrediting authority's most recent on-site assessment,
4) the expiration date of the accrediting authority's NELAP recognition, which shall be no more than
two years after the most recent date on which NELAP recognition was granted,
5) the signature of the NELAP Director,
6) a statement that the accrediting authority is in compliance with the NELAC standards,
7) a statement that the accrediting authority has been granted the authority to accredit environmental
laboratories for the fields of testing for which the accrediting authority is NELAP-recognized,
8) a statement that continued NELAP recognition depends on compliance with the NELAC standards;
9) a seal incorporating the NELAP insignia; and
10) a unique designator, such as date of issuance and a serial or certificate number.
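The certificate contents of subsection 6.7(b) map naturally onto a simple record, shown below for illustration only; the field names are assumptions, and the validity check implements the two-year limit of item (b)(4).

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class CertificateOfRecognition:
        authority_name: str                  # (b)(1)
        authority_address: str               # (b)(1)
        fields_of_testing: tuple             # (b)(2)
        most_recent_onsite_assessment: date  # (b)(3)
        recognition_granted: date            # most recent date recognition was granted
        expiration: date                     # (b)(4)
        certificate_number: str              # (b)(10): a unique designator

        def expiration_is_valid(self) -> bool:
            """(b)(4): expiration no more than two years after the most recent
            date on which NELAP recognition was granted."""
            try:
                limit = self.recognition_granted.replace(year=self.recognition_granted.year + 2)
            except ValueError:               # recognition granted on February 29
                limit = self.recognition_granted.replace(month=2, day=28,
                                                         year=self.recognition_granted.year + 2)
            return self.expiration <= limit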
6.8 USE OF ACCREDITATION BY NELAP ACCREDITED LABORATORIES
a) The accrediting authority shall have requirements for controlling the ownership, use and display of the
accrediting authority's NELAP accreditation documents and for controlling the manner in which an
accredited laboratory may refer to its NELAP accreditation and/or use of the NELAC/NELAP logo.
These arrangements shall include, but are not limited to, requirements that:
1) NELAP accredited laboratories post or display their most recent NELAP accreditation certificate or
their NELAP-accredited fields of testing in a prominent place in the laboratory facility;
2) NELAP accredited laboratories make accurate statements concerning their NELAP accreditation
fields of testing and NELAP accreditation status;
3) NELAP accredited laboratories accompany the accrediting authority's name and/or the
NELAC/NELAP logo with at least the phrase "NELAP accredited" and the laboratory's accreditation
number or other identifier when the accrediting authority's name is used on general literature such
as catalogs, advertising, business solicitations, proposals, quotations, laboratory analytical reports
or other materials; and
4) NELAP accredited laboratories not use their NELAP certificate, NELAP accreditation status and/or
NELAC/NELAP logo to imply endorsement by the accrediting authority.
b) The accrediting authority shall have arrangements to ensure that any NELAP-accredited laboratory that
chooses to use the accrediting authority's name, to refer to its NELAP accreditation status and/or to
use the NELAC/NELAP logo in any catalogs, advertising, business solicitations, proposals, quotations,
laboratory analytical reports or other materials shall:
1) distinguish between the proposed testing for which the NELAP-accredited laboratory is accredited
and the proposed testing for which it is not accredited; and
2) include the NELAP-accredited laboratory's accreditation number or other identifier.
c) The accrediting authority shall have arrangements to ensure that, upon suspension, revocation or
withdrawal of their NELAP accreditation, NELAP-accredited laboratories shall:
1) discontinue use of all catalogs, advertising, business solicitations, proposals, quotations, laboratory
analytical results or other materials that contain reference to their past NELAP accreditation status
and/or display the NELAC/NELAP logo, and
2) return any certificates for NELAP accreditation to the accrediting authority.
d) The accrediting authority shall have arrangements to take suitable actions, including legal action, when
incorrect references to the accrediting authority's NELAP accreditation, misleading use of the
laboratory's NELAP accreditation status and/or unauthorized use of the NELAC/NELAP logo are found
in catalogs, advertisements, business solicitations, proposals, quotations, laboratory analytical reports
or other materials.
6.9 REQUIREMENTS OF THE NELAP
a) The NELAP assessment team shall submit all documents, letters, assessment notes, checklists, etc. to
the NELAP headquarters office within:
1) 30 calendar days of the final decision on the application by the NELAP Director, or
2) 30 calendar days after the final recommendation by the Accrediting Authority Review Board (AARB)
as set forth in Section 6.10 of this Chapter.
b) The NELAP Director shall maintain complete and accurate records of all documents relating to the
application and on-site assessment processes for each accrediting authority for a minimum of ten years
or a longer period of time if required by contractual obligations or pertinent federal laws and regulations.
c) The NELAP Director shall maintain an electronic directory to display the status of all NELAP-recognized
accrediting authorities, pending applications for NELAP recognition and currently scheduled announced
on-site assessments.
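For illustration only, the record-retention and directory requirements of subsections 6.9(b) and 6.9(c) can be expressed as follows; the names are assumptions, and the sketch covers only the ten-year minimum, not longer contractual or federal retention periods.

    from datetime import date

    MINIMUM_RETENTION_YEARS = 10  # 6.9(b)

    def earliest_disposal_date(record_date: date) -> date:
        """Earliest date on which a record may be disposed of under 6.9(b),
        absent longer contractual or federal retention requirements."""
        try:
            return record_date.replace(year=record_date.year + MINIMUM_RETENTION_YEARS)
        except ValueError:        # record dated February 29
            return record_date.replace(month=3, day=1,
                                       year=record_date.year + MINIMUM_RETENTION_YEARS)

    # 6.9(c): the electronic directory displays, at a minimum, these statuses.
    DIRECTORY_STATUSES = ("NELAP-recognized accrediting authority",
                          "application for NELAP recognition pending",
                          "announced on-site assessment scheduled")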
6.9.1 NELAP Assessment Team
a) The NELAP Director shall appoint NELAP assessment team members as set forth in Section 6.3.3 (a)(4)
and delegate the responsibilities required by this Chapter to assessment teams.
b) During the time prior to the NELAP issuing the first NELAP recognitions to accrediting authorities, the
NELAP assessment team shall consist of at least one member who is an employee of the USEPA and
at least one member who is an employee of another operating territorial, state or federal environmental
laboratory accreditation program.
c) No later than two years from the date that the first accrediting authority recognitions are announced, the
NELAP assessment team shall consist of at least one member who is an employee of the USEPA and
at least one member who is an employee of a NELAP-recognized accrediting authority.
d) Prior to conducting the on-site assessment of an accrediting authority's program, at least one member
of the NELAP assessment team shall complete the NELAP Accrediting Authority Assessor Training
Course.
e) The NELAP assessment team shall:
1) have at least one member who meets the education, experience and training requirements for
laboratory assessors specified in the NELAC standards, Chapter three,
On-Site Assessment; and
2) have at least one other member with experience that includes at least one of the following:
A) certification as a management systems lead assessor (quality or environmental) from an
internationally recognized auditor certification body;
B) one year of experience implementing federal or state laboratory accreditation rulemaking;
C) laboratory accreditation management; or
D) one year of experience developing or participating in laboratory accreditation programs.
3) All experience required by this subsection must have been acquired within the five-year period
immediately preceding appointment as a NELAP assessment team member.
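Once the first recognitions have been announced, the team-composition rules of subsections 6.9.1(c) through 6.9.1(e) amount to a set of existence checks. The sketch below is informative only; the field and function names are assumptions, and employer strings are simplified.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class TeamMember:
        employer: str                       # "USEPA" or a NELAP-recognized accrediting authority
        completed_assessor_training: bool   # 6.9.1(d)
        meets_chapter3_requirements: bool   # 6.9.1(e)(1)
        has_qualifying_experience: bool     # 6.9.1(e)(2), acquired within the last five years

    def team_is_qualified(members: List[TeamMember]) -> bool:
        return (any(m.employer == "USEPA" for m in members)              # 6.9.1(c)
                and any(m.employer != "USEPA" for m in members)          # 6.9.1(c)
                and any(m.completed_assessor_training for m in members)  # 6.9.1(d)
                and any(m.meets_chapter3_requirements for m in members)  # 6.9.1(e)(1)
                and any(m.has_qualifying_experience for m in members))   # 6.9.1(e)(2)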
6.10 APPEALING DECISIONS TO DENY OR REVOKE NELAP RECOGNITION
a) Within 20 calendar days of official notification of the NELAP action on an accrediting authority's
application for NELAP recognition, the accrediting authority shall notify the NELAP Director and the
Accrediting Authority Review Board (AARB) (as established in the NELAC standards, Chapter one,
Policy and Structure) if the accrediting authority chooses to appeal the NELAP action.
b) Any AARB member who is not free of financial connection to the appealing accrediting authority, or
who has any other relationship that would bias review of the case, shall be excluded from participating
in deliberations on that appeal.
c) The AARB shall carry out an independent review of the entire record (all application information,
checklists, review notes, on-site assessment notes, letters, reports and any other data in the NELAP and
NELAP assessment team files).
d) The AARB must conduct interviews with the accrediting authority, the NELAP assessment team
members and the NELAP Director. The AARB also may conduct interviews with other individuals
deemed appropriate by the AARB.
e) If the accrediting authority so desires, an opportunity for both the NELAP and the accrediting authority
to appear before the AARB shall be granted. Such a meeting shall be held in the state of the appealing
accrediting authority.
f) The AARB must complete its review and render a final recommendation to the NELAP Director within
90 calendar days following receipt of the notice of appeal.
g) The ultimate decision to grant, maintain, deny or revoke NELAP recognition remains with the NELAP
Director. The NELAP Director shall notify the appealing accrediting authority of his/her decision within
20 calendar days of receipt of the recommendation from the AARB.
h) Accrediting authorities shall be limited to one appeal for each application cycle.
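The appeal deadlines of Section 6.10 form a short timeline, sketched below for illustration only; the names are assumptions, each step is assumed to occur on its due date, and the 20-, 90- and 20-calendar-day limits come from subsections 6.10(a), (f) and (g).

    from datetime import date, timedelta

    def appeal_timeline(official_notification: date) -> dict:
        appeal_due = official_notification + timedelta(days=20)               # 6.10(a)
        aarb_recommendation_due = appeal_due + timedelta(days=90)             # 6.10(f)
        director_decision_due = aarb_recommendation_due + timedelta(days=20)  # 6.10(g)
        return {
            "notice of appeal due": appeal_due,
            "AARB final recommendation due": aarb_recommendation_due,
            "NELAP Director decision due": director_decision_due,
        }

    for step, due in appeal_timeline(date(1998, 7, 2)).items():
        print(step, due.isoformat())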
Figure 1: Flow Chart for NELAP Recognition of an Accrediting Authority