United States Environmental Protection Agency
Research and Development
EPA 600/4-82-018
March 1982
TECHNICAL ASSISTANCE DOCUMENT: TECHNIQUES TO DETERMINE
A COMPANY'S ABILITY TO CONDUCT A QUALITY STACK TEST
Prepared for
OFFICE OF AIR, RADIATION AND NOISE
DIVISION OF STATIONARY SOURCE ENFORCEMENT
TECHNICAL SUPPORT BRANCH
Prepared by
Environmental Monitoring Systems
Laboratory
Research Triangle Park NC 27711
-------
TECHNICAL ASSISTANCE DOCUMENT: TECHNIQUES TO DETERMINE
A COMPANY'S ABILITY
TO CONDUCT A QUALITY STACK TEST
Eva D. Estes William J. Mitchell
Systems and Measurements Division Quality Assurance Division
Research Triangle Institute U.S. EPA
Research Triangle Park, NC 27709 RTP, NC 27711
Contract No. 68-02-3431
EPA Project Officer: William J. Mitchell
Quality Assurance Division
Environmental Monitoring Systems Laboratory
Office of Research and Development
U.S. Environmental Protection Agency
ENVIRONMENTAL MONITORING SYSTEMS LABORATORY
OFFICE OF RESEARCH AND DEVELOPMENT
U.S. ENVIRONMENTAL PROTECTION AGENCY
RESEARCH TRIANGLE PARK, NORTH CAROLINA 27711
-------
ABSTRACT
Techniques are described to determine a company's ability to
conduct quality stack testing both before testing begins and during the
actual testing. The techniques described are intended for use by
industrial and governmental organizations that conduct or contract for stack
testing. The techniques suggested apply to stack testing methods in
general, that is, they are not limited to EPA Reference Test Methods
alone.
-------
ACKNOWLEDGMENTS
The authors wish to thank the following individuals who reviewed
and commented on the document during its preparation: Mike Fogel
(Georgia Department of Natural Resources); Bill DeWees, Diane Albrinck
and Wade Mason (PEDCo Environmental, Cincinnati, Ohio); Walt Smith
(Entropy Environmentalists, Inc., Raleigh, NC); Lynn Kuryla (PPG
Industries, Barberton, Ohio); Tom Lowe (Kaiser Aluminum, Pleasanton,
California); Keith Bentley (Georgia Pacific Corporation, Atlanta,
Georgia); Dan Fitzgerald and Joe Wilson (Scott Environmental
Technology, Plumsteadville, Pennsylvania); Bruce Ferguson (Harmon
Engineering and Testing, Auburn, Alabama); Warren Kelly and Roland
Hebert (Scott Environmental Technology, San Bernardino, California) and
Jim McGaughey, Denny Wagoner, and Tony Eggleston (TRW, Research
Triangle Park, North Carolina).
-------
TECHNIQUES TO DETERMINE A COMPANY'S ABILITY
TO CONDUCT A QUALITY STACK TEST
INTRODUCTION
This document has four parts. Part I identifies features
generally found in companies with a history of conducting valid stack
tests, i.e., tests that meet all the associated legal and technical
requirements including the issuance of a clear and complete final
report. Companies that conduct stack testing can use the information
in Part I to upgrade stack testing knowledge and operating procedures
of their staff. Individuals who contract for or monitor stack tests
will increase their knowledge of techniques that can be used to
validate the test results.
Part II contains a set of questions that can either be mailed to
the company being evaluated for their response or can be asked in a
phone conversation discussing the test. These questions, which are
based on the points discussed in Part I, are intended to allow a
reasonable decision to be made about the ability of a company to
conduct the stack testing required when a specific test is imminent.
If the user merely wants to classify laboratories into categories such
as "likely capable," "most likely capable," and "likely not capable,"
without a specific test in mind, these questions will not give an
accurate assessment of a company's technical abilities; a specific
source sampling situation must be presented to the company.
If extensive stack sampling is anticipated, it may be useful and
cost-effective to visit those laboratories identified as likely able to
conduct the testing. Part III suggests things to look for during this
-------
visit, e.g., the quality of the equipment, the completeness of test
reports and records, and the knowledge of the personnel about their
work.
Part IV suggests some straightforward and simple techniques to
determine if the personnel supplied for the stack test are properly
trained and their equipment is properly calibrated and in good repair.
It also suggests some techniques to determine the accuracy and adequacy
of their sample recovery and analysis procedures.
The approach described in this document is not limited to USEPA
methods and is flexible enough to balance the degree of certainty the
user requires with the resources available for the testing.
-------
PART I
APPLICABLE FEATURES FOUND IN COMPANIES WITH A HISTORY
OF CONDUCTING VALID STACK TESTING
A source test company can be organized in many ways and still meet
the requirements necessary for valid stack testing. The organizational
structure will certainly be affected by the type of testing done, the
geographical area covered and the size of the company. Some companies
find it efficient to separate field activities (equipment maintenance,
calibration, performance of the stack testing, etc.) from laboratory
activities (analysis, filter weighing, etc.). Other companies find it
more convenient to combine field and laboratory activities under one
individual.
An effective quality assurance program should allow the company's
performance to be monitored so that erroneous results are identified
and corrected before they are reported. Although many techniques are
available to ensure a viable quality assurance program, the
sophistication of the program actually installed must be determined by
balancing the degree of data quality desired with the amount of
resources (funds, personnel) available.
The cost of an adequate quality assurance program can range from
as little as 2% up to 25% of the total cost of the testing program
depending on the purpose of the program and the methods utilized. For
example, sophisticated methods generally require a higher level of
personnel training and more extensive equipment maintenance and
calibration than less sophisticated methods. A company that conducts
testing only within easy driving distance of its laboratory and uses
only one or two test methods will likely have lower quality assurance
costs than one that conducts testing over a larger geographical area
and uses sophisticated test methods.
Regardless of the organizational structure chosen or the size and
sophistication of the quality assurance program, certain character-
istics are desirable to assure that good quality, useful data are
obtained. The most desirable characteristics are listed below by
subject area.
-------
ORGANIZATION
There is a detailed organizational structure which clearly
delineates each person's duties and responsibilities so that
someone outside the organization can determine who is responsible
for each activity.
There are effective methods for communication between individuals
in different parts of the organization to ensure that all
objectives of the test are met.
One individual is assigned the overall responsibility for quality
assurance (or quality control) with direct access to management,
e.g., without having to go through the stack test project manager.
PERSONNEL EXPERIENCE/TRAINING
There is an established training program that company personnel
must complete before being qualified to perform their assigned
tasks without constant supervision. This program should be under
the direction of a specific person or persons and a written record
of training accomplishments kept for each individual. Ideally,
the proficiency of technical personnel should be periodically
reviewed in some planned manner and the results documented.
Current periodicals such as the Journal of the Air Pollution
Control Association, Pollution Engineering, Industrial Research,
Environmental Science and Technology, Analytical Chemistry, etc.,
are circulated among the staff with encouragement to read them.
Personnel are encouraged to participate in certification/audit
programs, workshops, conferences and training courses and the
results documented.
-------
GENERAL OPERATING PROCEDURES
Established step-by-step instructions are available in writing for
personnel who routinely perform the activities of sampling,
shipping, storage, and analysis.
The history of a sample (sample identification, chain-of-custody,
analysis and sampling methods used, calibration curves, quality
control/assurance procedures used, calculations, record-keeping,
etc.) is clearly documented, dated, and maintained in an organized
manner so as to be easily followed by someone who did not actually
collect or analyze the sample.
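As a sketch of the kind of traceable sample history this implies, the record might be structured as follows (the field names and class are illustrative assumptions, not a prescribed format):

```python
from dataclasses import dataclass, field

@dataclass
class SampleHistory:
    """One traceable record per sample, from collection through analysis,
    organized so a reviewer who never handled the sample can follow it."""
    sample_id: str
    collected_by: str
    collected_on: str          # date of collection
    sampling_method: str
    analysis_method: str
    custody_log: list = field(default_factory=list)  # (date, person, action)

    def transfer(self, date, person, action):
        """Append a dated chain-of-custody entry."""
        self.custody_log.append((date, person, action))

# Hypothetical example record
s = SampleHistory("P-82-014", "J. Smith", "1982-03-05",
                  "EPA Method 5", "gravimetric")
s.transfer("1982-03-05", "J. Smith", "sealed and shipped to laboratory")
s.transfer("1982-03-07", "R. Jones", "received; stored in locked refrigerator")
```

The point is only that every entry is dated and attributed, so the sample's history can be reconstructed later.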
The company has a reliable, effective communication system for use
in those stack tests that require the sampling and/or monitoring
of several process points simultaneously. (Walkie-Talkies have
greater range and location flexibility than either sound-powered-
phones or a field intercom system. FM type Walkie-Talkies are
preferred over CB types in situations where powerful mobile CB
units can prevent effective communication between low powered CB
Walkie-Talkies or where personnel will not be visible to each
other.)
A record is kept on the receipt of chemicals, reagents, and gas
cylinders with an indication of their expected shelf life and the
date of receipt recorded on the actual container if at all
possible.
SAMPLING AND ANALYSIS
Operational checks are specified to verify the proper assembly of
sampling equipment and to ensure sufficient spare equipment is
shipped to the test site to allow completion of the test in a
timely manner. Spare glassware, equipment, filters, dry gas meter
and other equipment that can break or change calibration should be
available at the test site.
-------
Calibration procedures and schedules are established for all
important sampling and analysis equipment and stored in a
permanent file.
When equipment is calibrated, a logbook entry is made that
contains the name of the person doing the calibration, the date of
the calibration, and the calibration factor determined.
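Such a logbook discipline might be sketched as follows (the record fields and the 2% drift tolerance are illustrative assumptions, not requirements of any method):

```python
calibration_log = []

def log_calibration(equipment_id, person, date, factor, reference):
    """Record who calibrated what, when, against which reference
    standard, and the calibration factor determined."""
    entry = {"equipment": equipment_id, "by": person, "date": date,
             "factor": factor, "reference": reference}
    calibration_log.append(entry)
    return entry

def factor_drift_ok(new_factor, previous_factor, tolerance=0.02):
    """Flag a calibration factor that has drifted more than the
    tolerance (here 2%, an assumed in-house criterion) since the
    previous calibration."""
    return abs(new_factor - previous_factor) / previous_factor <= tolerance

log_calibration("DGM-3", "R. Jones", "1982-03-01", 1.012, "wet test meter")
print(factor_drift_ok(1.015, 1.012))  # small drift, acceptable
print(factor_drift_ok(1.080, 1.012))  # investigate before further use
```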
Calibration and reference standards are traceable to USEPA, NBS
SRM, or other recognized standards such as ACS, wherever
possible.
Calibration records are maintained in an organized file for a
reasonable amount of time, e.g., two years.
Standardized Reference Test Methods (EPA, ASTM, ASME, etc.) are
used whenever available and other methods used are clearly written
and contain all necessary documentation to fully describe the
modified or special test method and the testing results.
All chemical solutions used in sampling are checked to assure
proper makeup and are properly stored to prevent contamination or
deterioration.
A reagent blank is routinely carried through all sampling
procedures, whenever appropriate. All glassware is properly
cleaned and stored between tests.
DATA REDUCTION AND VALIDATION
An effective quality assurance program will ensure that all
samples are subjected to data validation techniques that are
adequate for the intent of the sampling results and compatible
-------
with the type of sample (routine, nonroutine). For example, for
routine samples (similar samples collected and analyzed more often than
every 4 to 6 weeks) quality control charts such as those described in
the Quality Assurance Handbook* may be appropriate. The basic idea
of control charts is to determine the normal limits (precision)
associated with the sampling results (based on the past performance of
the laboratory) so that sample data that fall outside these limits can
be scrutinized to determine the source of the imprecision (error).
Standard samples or control samples are generally used to establish
these limits and the longer a specific control chart is used, the
easier it becomes to identify erroneous data.
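The control-chart idea can be sketched minimally as follows (the three-standard-deviation limits are one common convention, and the control-sample history is invented):

```python
import statistics

def control_limits(past_results):
    """Derive normal limits (mean +/- 3 standard deviations) from
    past control-sample results."""
    mean = statistics.mean(past_results)
    sd = statistics.stdev(past_results)
    return mean - 3 * sd, mean + 3 * sd

def out_of_control(value, limits):
    """Flag a result outside the established limits so the source
    of the imprecision can be investigated."""
    low, high = limits
    return value < low or value > high

# Invented control-sample history (e.g., mg recovered from a standard sample)
history = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 10.0, 9.7]
limits = control_limits(history)
print(out_of_control(10.1, limits))   # within normal limits
print(out_of_control(12.5, limits))   # warrants scrutiny
```

As the document notes, the longer such a history runs, the tighter and more trustworthy the limits become.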
When used in conjunction with the Method of Standard Additions,
control charts can be very effective in identifying erroneous results,
particularly those associated with the analysis of the sample. Errors
resulting from mistakes made in the field, however, may not be identi-
fied by this approach since many sources do not emit a constant level
of pollutant over long periods of time. For best results the standards
should be carried through the entire analytical procedure when utiliz-
ing this approach.
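A brief sketch of how the Method of Standard Additions extrapolates the original concentration (the readings are invented; as noted above, each spiked aliquot should in practice be carried through the entire analytical procedure):

```python
def standard_additions(spikes, responses):
    """Fit response = slope*spike + intercept by least squares, then
    extrapolate to zero response: the unknown concentration equals
    intercept/slope (the magnitude of the x-intercept)."""
    n = len(spikes)
    mx = sum(spikes) / n
    my = sum(responses) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(spikes, responses))
             / sum((x - mx) ** 2 for x in spikes))
    intercept = my - slope * mx
    return intercept / slope   # same units as the spikes

# Invented data: known amounts added (ug) vs. instrument response
added = [0.0, 1.0, 2.0, 3.0]
signal = [0.50, 0.75, 1.00, 1.25]   # linear: 0.25 per ug on top of the sample
print(standard_additions(added, signal))  # -> 2.0 ug in the original sample
```

A departure from linearity in the fitted points is itself a warning of matrix or analytical problems.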
Control charts may not be cost-effective on samples collected at
infrequent intervals and thus the use of Standard Additions or control
samples may be adequate. This latter method alone will tend to
identify incorrectly prepared standards and other laboratory errors but
will not identify errors caused by mistakes made during sample
collection.
Whenever possible, a spiked or synthetic sample should accompany
samples sent to another company for analysis to document the
performance of the contracting company. A method to validate that
sample integrity was maintained during transport should also be used.
Companies that contract out work should have an in-place, functioning
standard procedure for documenting the accuracy of the contractor's
results and an established criterion for selecting the contracting
company.
* Quality Assurance Handbook for Air Pollution Measurement Systems,
Volume III - Stationary Source Specific Methods, EPA-600/4-77-027b,
Research Triangle Park, NC (1977).
-------
Some examples of good data reduction and validation practices are:
All laboratory and field data sheets are complete, dated and
signed by the analyst/operator and checked or reviewed by manage-
ment personnel. Data sheets should be filled out in ink (and not
pencil).
Standard written procedures, forms, etc., are used to perform
necessary computations, data reductions, and validations.
The company has a specific policy for reporting data including the
number of significant figures and detection limits.
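One way such a reporting policy can be made mechanical (the three-significant-figure and "less than" conventions here are illustrative assumptions, not EPA requirements):

```python
from math import floor, log10

def report(value, detection_limit, sig_figs=3):
    """Report a measured value to a fixed number of significant
    figures, flagging results below the detection limit."""
    if value < detection_limit:
        return f"< {detection_limit:g}"
    # number of decimal places that yields sig_figs significant figures
    digits = sig_figs - 1 - floor(log10(abs(value)))
    return f"{round(value, digits):g}"

print(report(0.004, 0.01))      # below detection limit -> "< 0.01"
print(report(123.456, 0.01))    # -> "123"
print(report(0.012345, 0.01))   # -> "0.0123"
```

Applying one such rule uniformly keeps reported precision consistent across analysts and reports.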
Preventive maintenance schedules are established for each impor-
tant physical (electrical, electronic, mechanical) portion of
equipment which could affect the validity of the results or cause
delays at the test site.
There is an established, closed-loop system to ensure that
corrective actions are taken when problems or out-of-control
conditions are detected and reported.
Problems that have been discovered and corrected are recorded in a
consistent manner stating the deficiency, the corrective action
taken, and the date of the action.
Independent, duplicate checks are frequently made of the key
aspects of the sampling and measurement systems.
Blind samples are periodically introduced into the analytical
system for purposes of quality control.
The company participates in external performance audits that are
relevant to the test methodology it uses (e.g., dry gas meter
calibration surveys and performance surveys for Methods 6 and 7
-------
conducted by the Source Branch, Environmental Monitoring Systems
Laboratory, U.S. Environmental Protection Agency, Research
Triangle Park, North Carolina).
RECORDKEEPING
A field logbook or "events log" is maintained in ink by the
sampling team for recording field measurements and other pertinent
observations. This may consist only of a binder containing all
the data sheets necessary for recording the test data, but many
companies find it beneficial to have logbooks for the sampling
operator, the recovery team and the analyst.
Samples are labelled in a permanent manner with an identification
number, the date, time, and sampling location; the name of the
tester; the sample liquid level at the time of collection and any
required preservation techniques; light- or temperature-sensitivity;
allowable holding time.
The analyst keeps a record of samples submitted, date received and
analyzed, sample storage techniques used and analyses done,
including procedures used and the person responsible.
Standardized forms are used whenever practical to ensure that all
required data are collected and recorded. Examples of how forms
can be used to simplify compiling the final report, while at the
same time ensuring that all objectives of the test are met, are
described below.
1. Presurvey Forms. In most cases adequate pre-test planning
requires that a staff member familiar with the requirements of
the test method either visit the test site or talk to the
appropriate plant personnel via phone to determine the
equipment needed for the test. The use of a presurvey form
can ensure that all the equipment/personnel needs for the test
program are fulfilled before the test team leaves their
-------
laboratory. This will yield considerable cost savings and
improved data quality. Based on their workload and type of
testing done, some companies may find it best to use one form
for all methods while others may find it more effective
to use different forms for different pollutants, methods,
sources, etc. Regardless, a good presurvey form would at
least address the following points.
- objective of the test (research, compliance, equipment
performance, etc.) and desired test and report date
- names of contact personnel at plant, their telephone
numbers and their availability and function during the
testing
- plant operating schedule (batch, cyclical, continuous) and
shutdowns anticipated
- sketch of sampling location(s), number of points to be
sampled at each location, size of test ports, clearance
around ports, availability of electrical power, and its
location with respect to test site.
- velocity profile(s), stack temperature, pollutant concen-
tration(s) expected and how and when determined
- major components in the stack gas (%H2O, %CO, %CO2,
etc.), any hazardous materials likely present, concentration
of pollutant that will be encountered.
- test method(s) to be used and special calibrations or
equipment required (e.g., special probes, reagents, etc.)
- physical layout of sampling site, availability of parking
space, and a list of the equipment required to put the
sampling train in place (ladder, pulley, electrical cords,
electrical outlet adapter, monorail), and availability of
ice, chemicals, gas samples, etc., needed for the test
- location and phone number of first aid station and
hospital, required plant procedures and safety practices.
- minimum number of samples to be collected, what process
points are to be monitored during test and at what
interval the data should be recorded
-------
- identification and physical description of test sample
recovery facility
- location and phone numbers of nearest motels and map of
area
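The completeness check that a presurvey form provides can be sketched as follows (the field names are abridged from the list above; the code itself is an illustrative assumption, not part of any EPA method):

```python
# Items a presurvey form should capture (abridged from the list above)
PRESURVEY_FIELDS = [
    "test objective", "plant contacts", "plant operating schedule",
    "sampling location sketch", "velocity profile", "stack gas composition",
    "test methods", "site layout and equipment list", "safety information",
    "minimum number of samples", "sample recovery facility",
]

def missing_fields(form):
    """Return the presurvey items not yet filled in, so that
    equipment and personnel needs are settled before the test
    team leaves the laboratory."""
    return [f for f in PRESURVEY_FIELDS if not form.get(f)]

form = {"test objective": "compliance", "plant contacts": "J. Doe, ext. 214"}
print(missing_fields(form))   # everything still to be determined
```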
2. Test Plan/Accomplishment Form. This form can be included as
part of the presurvey form, but for simplicity it may be
better to have a separate form that is completed after the
presurvey has been done. It should be consulted by those
responsible for calibrating, packaging and shipping the
equipment to the test site and also be supplied to the testing
crew at the test site for reference. At a minimum this form
should contain the following information:
- number/type of equipment required for test (including
spare parts) and testing schedule planned or required
- specific test procedure(s) to be used with copies attached
- what pretest/post-test equipment maintenance and cali-
bration are required in the laboratory and at the site,
date done and values obtained
- how samples are to be shipped, stored, labelled, analyzed,
etc., and what, if any, sample stability problems might be
encountered
3. Equipment Shipment/Transport Forms. If desired, this form can
be included in the Test Design form described above. It
should contain the following information with appropriate means
to certify that the required equipment and other materials
have been packaged for shipment.
- type/number of items required and packaged
- calibrations done/calibration factors obtained
- method of shipment/number of boxes
- what field calibrations should be done
- all necessary data forms and testing procedures required
for the test
-------
4. Shipping Labels for Equipment. Labels that clearly identify
the contents of each box greatly facilitate sample set up in
the field. They also help ensure that all required equipment
is sent to the test site.
5. Field Testing Form. This form, which is completed in ink by
field personnel, should contain all pertinent information
about the sampling and process operating conditions encounter-
ed. Items that should be addressed on this form are:
- sampling equipment serial number/calibration factor
- stack velocity profile, temperature, sampling point
location
- names of persons conducting the test and their specific
functions (sampling, sample clean-up, process monitoring,
etc.)
- plant name, date of test, special instructions to analyst,
etc.
- how samples were stored, shipped, etc.
- any field tests done to ascertain the accuracy of equip-
ment calibration.
- description of process and control device design and
operating conditions and any problems encountered
6. Sample ID Forms (Labels). All samples should be tagged
immediately after collection with a label that identifies all
pertinent sample data such as sample number, plant name,
project number, date collected, sample type, special storage/
transport conditions, sample volume, name of person taking
sample.
7. Sample Analysis Form. This form, which is completed by the
analyst and may be only a laboratory notebook, should contain
the following pertinent information:
-------
- analytical method used
- analytical equipment calibration/standardization done
- integrity of the samples as received and date received
- how results were calculated
- any special conditions that occurred that might have
affected the quality of the data
- analyst(s) name, date of analysis.
8. Final Report Format. Issuance of a clear, complete final
report in a timely manner (e.g., 4-6 weeks) is the sign of a
well-managed testing company. The longer it takes to issue a
final report after the testing/analysis is completed, the more
likely that significant errors of omission will occur. If the
company is unable to issue final reports routinely in a timely
manner, it could indicate that their staff or facilities are
inadequate. The procedures used in the testing and analysis
should be well documented in the final report. At a minimum,
discussions of field sampling methods, analytical procedures,
sample handling logs and quality control checks should be
included.
NOTE: Appendix A contains a checklist that summarizes the informa-
tion that could be contained on these standardized forms.
-------
PART II
QUESTIONNAIRE
This section contains a set of questions that can be completed by
the companies under consideration or filled out by the evaluator via a
phone conversation with the company. If mailed, the questionnaire
should be accompanied by detailed information on the facility and
pollutant(s) to be measured, so that the respondents can provide
specific information on their ability to perform the desired testing.
Questions that do not apply to the testing under consideration should
be crossed out by the evaluator before sending it to the company.
If the size or importance of the test warrants it, reference or audit
samples could also accompany the questionnaire. The number of samples
to be collected, the process parameters to be measured and similar
information should be given if available.
Table II-l is an example of the type of information that should
accompany the questionnaire. For example, if two points are to be
sampled simultaneously or if very cyclical (batch) processes are to be
tested, the questionnaire should include this information.
-------
QUESTIONNAIRE
(Write N/A if question does not apply
and explain answers where appropriate)
Date ____________

I. Name of Company ________________________________
   Address ________________________________
   Phone No. ____________
   Responsible person: Source Testing ____________________ Ext. ______
                       Analysis ____________________ Ext. ______
                       QA/QC ____________________ Ext. ______
TABLE II-1
The following facilities will be tested:

PROCESS | CONTROL EQUIPMENT | EMISSIONS OR COMPOUNDS TO BE MEASURED | REQUIRED OR DESIRED TEST METHODS TO BE USED (IF KNOWN)
Please respond to testing the above facilities and compounds for the
purpose of:
__ Compliance - Federal
__ Compliance - State
__ Compliance - Local
__ Engineering evaluation
__ Control equipment vendor's guarantee
-------
Special instructions about the testing, process cycle, etc., that
you should consider in your response are:
II. EXPERIENCE/TRAINING
1. How long has your company been doing source testing?
2. Categorize your source test projects:
a) % Compliance
b) % Government % Industry
3. Indicate name, education/training/experience of staff that
would do the testing using Table II-2 and/or by inclusion of
resumes.
4. Which of the following training/educational programs are
required and/or available for your employees:
Professional Technician
Req./Avail. Req./Avail.
In-place training program
Classroom
"Hands-On"
Certification/audit programs
Conferences
Workshops
Proficiency tests
Circulation of journals
Other: (describe briefly)
5. Is your analytical laboratory accredited by any associations
or state or federal agencies whose accreditation is pertinent
to the testing under consideration? If yes, please describe
briefly.
-------
TABLE II-2. EDUCATION/TRAINING/EXPERIENCE OF PROPOSED STAFF

INDIVIDUAL'S NAME AND DEGREE | POSITION OR TITLE | YEARS EXPERIENCE | APPROXIMATE NUMBER OF COMPLETE TESTS SUPERVISED/PERFORMED IN LAST 12 MON. / 3 MON. | TEST METHODS/MEASUREMENTS THE INDIVIDUAL WILL PERFORM IN PROPOSED WORK
NOTE: If more than one person may perform a specific procedure or you are not able
at this time to specify the personnel most likely to be sent to the test
site, please describe the qualification of all personnel who might be sent.
-------
III. 1. Identify parameters that you would monitor and record for the
process and control equipment listed in Table II-l.
2. Are your personnel trained to monitor these parameters during
the test? Yes No
3. List QA/QC procedures you will use to document the quality of
the test results.
4. Will you subcontract any of the proposed work?
Yes No
If yes, describe how you will select the subcontractor and
verify the validity of their results.
-------
5. List any specific QA/QC procedures that you intend to use in
the proposed work.
IV. FACILITIES
1. Do you have a special laboratory or area designated for source
test analysis? Yes No
2. Is there a reference file available that contains pertinent
reference books on Reference Test Methods for ASTM, EPA, etc.?
Yes No
3. Do you have a sample storage area that can be locked?
Yes No
Can it be refrigerated? Yes No
Can it be heated? Yes No
4. Do you have a mobile laboratory with capabilities to perform
analysis at the field site? Yes No . If
appropriate for this proposed work, include sketch and/or
description with response.
-------
V. GENERAL OPERATING PROCEDURES
1. Do you have established step-by-step procedures readily
available to those personnel who routinely perform the
following activities?
Yes No
Sampling
Shipping
Storage __
Analysis
2. Is there a document control system to assure that these
procedures are current and complete? Yes No
3. Do you have established, written chain-of-custody procedures
for samples? Yes No
4. Is there an inspection procedure which you follow to determine
if procurements meet quality control and acceptance
requirements when received? Yes No
If yes, describe briefly.
5. Do you maintain a log of incoming items such as chemicals,
reagents, and other materials with an indication of their
expected shelf life?
Yes No
Do you require that the expiration date be recorded on the
container? Yes No
6. Do you frequently perform consecutive source test projects
without returning equipment or personnel to the base
laboratory? Yes No
If yes, do you have procedures available that would be used to
field check the calibration and accuracy of the equipment when
it cannot be returned to the laboratory for this purpose?
Yes No
7. What is the normal time lapse (in months) between testing and
issuance of a report for a compliance test and for a test done
for noncompliance purposes?
VI. SAMPLING
1. Do you normally perform a site pre-survey via telephone or
site visit as appropriate? Yes No
-------
2. Indicate the calibration equipment to which you have access,
the size or capacity of the equipment, and its location (e.g.,
on-site, via purchase order/contract, etc.).
Location
Calibrated Dry Gas Meter
Wet test meter
Wind tunnel
NBS traceable weights
Other
3. Do you have written calibration procedures and schedules for
all important sampling equipment? Yes No
4. How often are the following calibrated:
Dry gas meter and orifice
Nozzles
Pitot tubes
Temperature measuring device
Barometers
Flowmeters
VII. DATA REDUCTION AND VALIDATION
1. Do you have standard written procedures or programs for the
computations, data reductions, and validations usually done in
your laboratory? Yes No
2. Do you use the Method of Standard Additions to check for
matrix effects in the chemical analysis of samples where
applicable? Yes No
3. Are chart papers and tapes retained as a part of the permanent
record? Yes No
4. Are quality control charts routinely used in your laboratory
for any of the proposed measurements? Yes No
VIII. PREVENTIVE MAINTENANCE
1. Are preventive maintenance schedules established and
documented for each important piece of sampling and analysis
equipment? Yes No
2. Are written records of maintenance actions that affect data
quality filed in an organized system? Yes No
-------
IX. AUDITS
1. Does your company participate in the following external audits
conducted by the Source Branch, Environmental Monitoring
Systems Laboratory, US EPA, Research Triangle Park, NC?
Yes No
Dry gas meter calibration survey
Performance survey for Method 6 (SO2)
Performance survey for Method 7 (NOx)
2. What other audits does your laboratory participate in?
X. REFERENCES
Name at least three of your clients that can be called for
references about your previous sampling and analysis work in the areas
of the proposed work.
Compound Measured Company Name Person to Contact Phone No.
XI. GUARANTEE
Will you guarantee that your procedures and equipment will meet
the specifications outlined in the test plan? Yes No
-------
PART III
TEST COMPANY VISIT
This visit to the test company's office is intended to give the
evaluator a more comprehensive knowledge of their ability to conduct a
quality stack test. It is intended to check questionnaire responses
that are unclear and to ensure capabilities in areas that are critical
to the conduct of the stack testing being considered. During this
visit the evaluator can interview laboratory and field personnel to
determine their capabilities, experience, knowledge about the test
method, etc. The evaluator can also examine: conditions of the
sampling equipment, calibration logbooks, calibration schedule, quality
and completeness of randomly selected field test reports, overall size
of the facility (work area size, equipment location, lighting,
cleanliness, ventilation, sample/solvent storage), and the existence of
sample chain-of-custody procedures.
For the visit to be cost-effective, the individual conducting the
visit should have prior actual field testing experience and preferably
extensive laboratory experience with source testing methods. If a
single, one-time-only test is planned and the cost of the visit is
15-20% of the total cost of the test, the visit will likely not be
cost-effective unless the company is the only one available and the test
results will be used in litigation, obtaining a permit to operate, etc.
A surprise visit should be avoided to ensure that the appropriate
personnel, equipment, and records are available. As an alternate to
the visit, simply having the company submit copies of some test reports
for inspection will yield the information desired by the evaluator.
-------
LABORATORY VISIT
I. EXPERIENCE/TRAINING/KNOWLEDGE
Select specific individuals in the organization who would be
involved in conducting the testing and interview them to determine
their level of experience and knowledge about the job responsi-
bilities. Some examples follow:
A. Identify the quality assurance/control officer named in the
questionnaire. Ask him for a copy of the written policy
directive for quality assurance. If no written plan is
available, have him briefly describe the procedures they would
use for the specific testing under consideration.
Comments:
B. Talk to the person(s) responsible for training and have him
briefly describe training procedures. In your judgment:
1) Is there an active training program? Yes No
Are the individual elements of the
program clearly defined? Yes No
Comments:
2) Ask to review the records of employee training
accomplishments, particularly for those personnel who will
be involved in the proposed testing.
Are the records well-organized? Yes No
Are there records of recent accomplishments, including
continuing education? Yes No
Comments:
3) Many accrediting associations or agencies require periodic
analysis of known samples. If the laboratory is partici-
pating in such programs, ask to examine the results,
particularly if they relate to the testing under
consideration.
C. Ask the person who normally performs the pre-survey to briefly
describe presurvey procedures and examine a completed pre-
survey form if at all possible.
Comments:
-------
D. If applicable, ask a member of the sampling crew to show you a
copy of the written sampling procedures for the specific test
methods that will be used.
Are they easily located?
Step-by-step?
Concise?
Clear?
Comments:
E. Ask an analyst to show you a copy of written analysis proce-
dures for the specific methods under consideration.
Are they easily located?
Step-by-step?
Concise?
Clear?
Updated?
Comments:
II. EQUIPMENT INSPECTION
A. Ask a member of the sampling team to show you the equipment
typically used for a particular test method or for the methods
that will be used in the testing.
Is the equipment as specified in the test method?
Quickly located?
Clean?
In good repair?
Recently calibrated (check logbook)?
Comments:
B. Review the calibration procedures and records for selected
pieces of equipment. Are deficiencies, corrective actions and
dates clearly documented? Yes _____ No
C. Examine field communication systems to determine condition and
applicability for use in the sampling situation anticipated.
-------
III. Recordkeeping/Report Writing
A. Ask to see final reports for at least three field tests and
record the following:
Report 1 Report 2 Report 3
Type Test
Date of testing
Date report
issued
B. Select at least one of these reports and determine its
completeness, clarity, etc., using the report itself and the
background file. For example, most test reports should
address the following points:
Date of test
Objective of test
Process points monitored and interval at which data were
recorded
Physical layout of sampling site
Plant operating schedule during test
Pollutant concentrations found
Stack velocity, profile, temperature, gas composition
Names, functions of persons conducting the test and
analyzing the samples
Sampling equipment ID number/calibration factor
Specific test procedure(s) used in sampling and analysis
Pre-test/post-test equipment maintenance/calibration,
standardization done
How samples were shipped, stored, labelled, and their
condition upon receipt, etc.
Calculation methods used
Any special conditions that occurred that might have
affected the quality or validity of the data
IV. Data Reduction and Validation
A. Ask a staff member for a copy of procedures, forms, etc., for
data reduction.
Are they step-by-step? Yes No
Easy-to-follow? Yes No
B. Are the statistical techniques used to validate data (control
charts, duplicate sample analysis, method of standard addi-
tions or control samples, internal audits, external audits)
adequate for the testing under consideration? Yes No
-------
V. Preventive Maintenance
1. Select a major piece of equipment used in testing and ask to
see the maintenance history of the equipment. Is it well
organized and the action taken to correct problems clearly
documented?
-------
Part IV
Field Performance Evaluation
Stack testing is a highly technical field in which proficiency
comes about primarily through hands-on experience. Although the
assembled equipment may look sophisticated, most stack testing
equipment is made from inexpensive, readily available components.
Further, many test methods utilize identical equipment, e.g., the EPA
Reference Method 5 (particulate) meter box (control console) is also
used in EPA Reference Methods 8, 12, 13A, 13B, 14, and 17.
Spot-checking the reliability of a testing company's equipment can
frequently be done inexpensively and in a straightforward manner by
someone without extensive field experience. The use of audit samples
and devices, where appropriate, and careful observation of test
personnel actions supplemented by conversations with them can also be
effective tools in evaluating a test company's ability to conduct a
valid stack test.
Some procedures and equipment that can be used in the field
evaluation are described here. All field calibration checks should
give a value within three percent of the stated value unless a
different specification is given. If stack testing is done at a
certain site on a frequent basis, appropriate auditing equipment could
be stored at each site to minimize the chance the equipment will be
damaged during shipment to the test site.
The following field-equipment evaluation procedures are designed
to help evaluate EPA Reference Method equipment and similar equipment
used in ASTM and ASME test methods. However, these procedures can be
used to design a similar evaluation for any other method. Many of
these procedures could also be used when visiting the company's
laboratory facilities.
-------
I. Equipment Checks
A. Nozzle. A sharp-edged, accurately calibrated nozzle is
mandatory when a test method requires isokinetic sampling. For highest
accuracy, the nozzle should be perfectly round. However, acceptable
limits are met when three measurements of the nozzle I.D. differ by
less than 0.004 inches (0.1 mm). Prior to the test, the roundness and
I.D. of the nozzle can be verified using a micrometer (or dial
caliper). A visual inspection will indicate the overall condition of
the nozzle with respect to corrosion, nicks, dirt, etc. Another check
that can be done is to require the contractor to wash the nozzle before
use, collect the wash, and return it to the laboratory with the samples
to serve as a field blank. This will also ensure that the nozzle is
clean before testing is initiated.
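The roundness criterion above is simple arithmetic; the following is a minimal sketch, with hypothetical measurement values, of how the three-measurement check could be applied:

```python
# A minimal sketch (hypothetical values) of the nozzle check described
# above: three I.D. measurements must differ by less than 0.004 in
# (0.1 mm), and their average serves as the nozzle diameter.

def check_nozzle(measurements_in):
    """measurements_in: three nozzle I.D. readings in inches."""
    spread = max(measurements_in) - min(measurements_in)
    average = sum(measurements_in) / len(measurements_in)
    return spread < 0.004, round(average, 4)

acceptable, diameter = check_nozzle([0.2500, 0.2512, 0.2505])
# spread is 0.0012 in, so this nozzle passes; average diameter = 0.2506 in
```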
B. Probe. A check of the probe heating-system calibration cannot
be done in the field, but it is possible to inspect the probe to
ascertain that it is properly constructed and is the correct length;
that the heater is working; and that the probe liner is clean and free
of cracks. As with the nozzle, the probe can be washed before use and
the wash used as a field blank. This can be particularly useful when a
metal liner is being used. By requiring the test team to do this, the
evaluator will also get an idea about the proficiency of the test team,
their capability to adequately recover sample from a probe and the
cleanliness of their equipment.
C. Filter Temperature Control. Accurate temperature control can
be a critical parameter when the regulation requires a specific minimum
or maximum filter temperature. An inexpensive, calibrated dial type
thermometer can be used to verify the accuracy of the filter tempera-
ture box heating system. This dial thermometer can be inserted through
the filter entrance or exit holes in the filter box and the temperature
monitored to ensure that the proper temperature range is maintained
during sampling. Portable thermocouples are also available for this
purpose. Care should be taken to be sure that the thermometer is not
placed too near the heater strip or a false temperature reading may
result.
D. Pitot Tube Calibration. The "S" type pitot tube is generally
used to measure the velocity at each sampling point to establish
-------
stack volumetric flow rate and the required sampling rate. Since it is
usually used in the presence of a thermocouple and nozzle, it is
important that this latter equipment not affect the pitot tube
calibration coefficient. A simple way to check this in a stack with
constant flow rates and flow patterns is to measure the velocity head
at three points in the stack first without the nozzle and thermocouple
attached and then with them attached. The average velocity head for
the two situations should agree within three percent.
Alternately, the evaluator can require that the testing company
measure the velocity head using their "S" pitot and a regular ASME L
type pitot tube supplied by the evaluator. After correcting for the
different calibration factors, the two pitot tubes should yield
velocity heads that agree within three percent. For best results, the
evaluator should also supply a differential pressure gauge for use with
his pitot tube to verify the accuracy of the tester's differential
pressure gauge.
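The side-by-side comparison can be sketched as follows. Since velocity is proportional to the pitot coefficient times the square root of the velocity head, correcting for the two calibration factors reduces to the check below; the coefficients and velocity heads are assumed example values:

```python
import math

# Sketch of the side-by-side pitot check described above. Velocity is
# proportional to Cp * sqrt(delta_p), so after applying each tube's
# calibration coefficient the two readings should agree within 3%.
# The coefficients and velocity heads below are assumed example values.

def pitots_agree(cp_s, dp_s, cp_std, dp_std, tolerance=0.03):
    """cp: pitot coefficient; dp: velocity head in inches H2O."""
    v_s = cp_s * math.sqrt(dp_s)        # S-type tube (relative units)
    v_std = cp_std * math.sqrt(dp_std)  # standard (L-type) tube
    return abs(v_s - v_std) / v_std <= tolerance

# Typical coefficients: about 0.84 for an S-type, 0.99 for a standard tube.
pitots_agree(0.84, 0.48, 0.99, 0.35)   # True for these readings
```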
E. Differential Pressure Gauge. The pressure gauges should first
be checked to be sure that they are appropriate for the test. For
example, a 0 to 25 inch H2O Magnehelic gauge (0-62 cm H2O) is not
appropriate if the pressure differential to be measured is less than
one inch (2.5 cm). A Magnehelic that has been dropped or is dirty or
corroded may no longer be accurate, but most testing crews do not
routinely check the accuracy of these pressure gauges. However, as
shown below, checking the accuracy and applicability of these gauges is
quite simple.
The relative accuracy of the two pressure gauges normally found in
a meter box (one for the pitot tube and one for the orifice) can be
checked by setting a sampling rate in the meter box that is suitable
for both gauges and measuring the pressure differential across the
limiting orifice using first one and then the other gauge.
If there is a difference, this will only tell you that at least
one of the gauges is inaccurate, but not which one. To determine the
defective gauge, it is better to bring a portable inclined manometer
(0-10 inches H2O; 0-25 cm H2O) to the test site and use it to check
the accuracy of the tester's differential pressure gauge.
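The relative check described above amounts to a simple percent-difference comparison; the readings below are assumed example values in inches H2O:

```python
# Minimal sketch of the relative gauge check above: the same orifice
# pressure differential is read first on one gauge and then on the other.
# A disagreement flags at least one gauge as inaccurate (but not which);
# the readings below are assumed example values in inches H2O.

def gauges_agree(reading_a, reading_b, tolerance=0.03):
    return abs(reading_a - reading_b) / max(reading_a, reading_b) <= tolerance

gauges_agree(1.84, 1.80)   # True: within 3 percent
gauges_agree(1.84, 1.60)   # False: at least one gauge is off
```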
-------
F. Volume Measuring Device. Some stack sampling methods use a
calibrated rotameter and a stopwatch to determine the volume of gas
sampled, but most employ a dry gas meter for this purpose. Volume
meters can be checked for accuracy at the test site in several ways.
For example, the evaluator can connect his own reference (calibrated)
dry gas meter (DGM) to the meter box inlet using an appropriate fitting
and compare the volume measured by the reference meter to that measured
by the one in the meter box.
Alternately, a critical orifice such as that described by
Mitchell (Pollution Engineering, 13, 45-47, June 1981)* can be used to
establish the accuracy of the volume meter and the overall integrity of
the meter box.
A third method is to set the meter box orifice pressure gauge (ΔH)
to the box's ΔH@ and see if the average flow rate is between 0.73 and
0.77 cfm for a 5-minute sample after correcting the volume to standard
temperature and pressure. If it is not in this range, then one or more of
the following situations exists:
1. The orifice calibration is incorrect.
2. The DGM calibration is incorrect.
3. There is a leak between the DGM and the limiting orifice or
between the orifice taps and the orifice differential pressure
gauge.
4. The orifice pressure gauge is out of calibration.
* A correction to the article was published in the Letters to the
Editor section in the September 1981 issue of Pollution Engineering.
In the corrections the parameter e was printed as 0 through a
typesetting error.
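The third method reduces to the standard-conditions correction sketched below; the field readings used are assumed example values:

```python
# Sketch (assumed field readings) of the delta-H@ check described above:
# the metered volume is corrected to standard conditions (68 F, 29.92
# in. Hg) and the average flow over the 5-minute run must fall between
# 0.73 and 0.77 cfm.

T_STD_R = 528.0   # 68 F expressed in degrees Rankine
P_STD = 29.92     # standard pressure, in. Hg

def avg_std_flow_cfm(v_meter_cf, t_meter_f, p_bar_inhg, dh_inh2o, minutes=5.0):
    p_meter = p_bar_inhg + dh_inh2o / 13.6   # absolute pressure at the meter
    v_std = v_meter_cf * (T_STD_R / (t_meter_f + 460.0)) * (p_meter / P_STD)
    return v_std / minutes

q = avg_std_flow_cfm(v_meter_cf=3.80, t_meter_f=75.0,
                     p_bar_inhg=29.60, dh_inh2o=1.84)
in_range = 0.73 <= q <= 0.77   # True for these readings (q is about 0.745)
```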
G. Barometer. If an accurate barometric pressure is required for
the test, it is advisable to bring a calibrated aneroid barometer to the
test site to check the tester's unit. Alternately, it may be possible
to use a mercury barometer located in a laboratory at the test site.
Sometimes a National Weather Service station is nearby and the
barometric pressure of the station can be used to check the accuracy of
the tester's equipment. If this is done, it is important to get the
actual pressure at the Weather Service station and correct it to the
elevation of the sampling port, not to sea level.
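The elevation correction can be sketched with the commonly used approximation that absolute pressure drops about 0.1 in. Hg per 100 ft of elevation gain; the station and port elevations below are assumed examples:

```python
# Sketch of the elevation correction noted above, using the common
# approximation that absolute pressure drops about 0.1 in. Hg per
# 100 ft of elevation gain. The elevations below are assumed examples.

def port_pressure_inhg(p_station, station_elev_ft, port_elev_ft):
    return p_station - 0.1 * (port_elev_ft - station_elev_ft) / 100.0

# Station at 400 ft reports 29.50 in. Hg; the sampling port is at 650 ft.
p_port = port_pressure_inhg(29.50, 400.0, 650.0)   # about 29.25 in. Hg
```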
-------
H. Filter. The filter holder should be essentially leak free at
15 inches Hg (38 cm Hg). Assembling the filter holder with the
appropriate filter and leak checking it before placing it in the
sampling train can be a very cost effective, time-saving procedure. By
observing the test team during filter assembly and filter recovery, the
evaluator can determine the cleanliness of the filter and filter
holder, determine that the proper filter is being used, and confirm that
the filter is not torn by the filter holder and that filter material is
not left in the gasket. If desired, an unused filter can be taken by
the evaluator and returned to his own laboratory to check its tare
weight. However, if this is done, the filter must be reweighed after
conditioning at a temperature and relative humidity identical to
those used in taring the filter.
I. Impinger Design/Condition. The impingers used should meet the
requirements of the test method, particularly if a gas sample is being
collected by impingement. At times, it may be necessary to substitute
a Greenburg-Smith impinger for a modified Greenburg-Smith impinger
(orifice tip removed), but such a substitution may affect collection
efficiency. A visual
inspection will also show the general cleanliness of the tester's
glassware. If desired, the impingers can be rinsed before use and the
rinse used as a field blank. (This test will only be useful if the
impinger contents will be analyzed.)
J. Stack Temperature. Analogous to the filter heating box, a
calibrated dial thermometer or portable thermocouple can be used to
check the accuracy of the tester's stack temperature measurement.
K. Nomograph/Calculator. Frequently a nomograph or calculator is
used to select the appropriate nozzle size and required sampling rate.
Nomographs are light and, if properly constructed, accurate and easy to
use. The evaluator can carry either a nomograph or calculator to the
test site and confirm the accuracy of the tester's device and calcu-
lations. Alternately, the evaluator can supply the tester with a
theoretical situation and ask him to calculate the nozzle size and
sampling rate.
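A theoretical check of the tester's calculation can be sketched as follows. For isokinetic sampling the nozzle area times the stack velocity must equal the desired sample flow rate; a real calculation also corrects for stack temperature, pressure, and moisture, which are omitted here for illustration, and the inputs are assumed values:

```python
import math

# Simplified sketch of the calculation a nomograph or calculator performs.
# For isokinetic sampling, nozzle area times stack velocity must equal the
# desired sample flow rate, so D = sqrt(4*Q / (pi*v)). A real calculation
# also corrects for stack temperature, pressure, and moisture; those
# factors are omitted here for illustration, and the inputs are assumed.

def nozzle_diameter_in(q_cfm, v_fps):
    """q_cfm: sample rate at stack conditions; v_fps: stack gas velocity."""
    area_ft2 = (q_cfm / 60.0) / v_fps                   # required nozzle area
    return 12.0 * math.sqrt(4.0 * area_ft2 / math.pi)   # diameter in inches

# A 0.75 cfm sample rate in a 50 ft/s stack needs roughly a 0.21 in nozzle.
d = nozzle_diameter_in(0.75, 50.0)
```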
-------
II. Audit Sample
If appropriate, an audit sample procured commercially or prepared
by the evaluator should be given to the test team for analysis. For
example, an aqueous sulfuric acid sample can serve to verify the
accuracy of a sulfate analysis. A lecture-size cylinder containing
CO2 and O2 in nitrogen can be used to determine the accuracy of an
Orsat analysis when this analysis will be used to correct the emission
rate to 12% CO2 or to calculate excess air. Alternately, ambient air
can be used to check the proficiency of the analyst since the oxygen
content of air should be 20.9 ± 0.3 percent.
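The two calculations an Orsat audit supports can be sketched as follows. Correcting a measured concentration to 12% CO2 scales it by 12/%CO2, and the excess-air expression follows the EPA Method 3 formula; all input values below are assumed examples:

```python
# Sketch of the two calculations an Orsat audit supports. Correcting a
# measured concentration to 12% CO2 scales it by 12/%CO2; the excess-air
# expression follows the EPA Method 3 formula. All input values below
# are assumed examples.

def correct_to_12_pct_co2(conc, pct_co2):
    return conc * 12.0 / pct_co2

def percent_excess_air(pct_o2, pct_co, pct_n2):
    net_o2 = pct_o2 - 0.5 * pct_co
    return 100.0 * net_o2 / (0.264 * pct_n2 - net_o2)

correct_to_12_pct_co2(150.0, 10.0)   # 180.0, e.g. mg/m3 corrected to 12% CO2
percent_excess_air(6.0, 0.0, 80.0)   # about 40% excess air
```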
III. Other Procedures
A. Before allowing testing to begin, ask the test supervisor to
show you any special instructions he was given about the objectives of
the test. If he seems unfamiliar with the test design, testing should
not be initiated until this situation is corrected.
Before the testing starts, it is also useful to discuss the
action(s) the test team supervisor should take if the plant goes down
or sampling must be stopped because a thunderstorm approaches, or
someone is injured, etc. For example, if the plant suffers an upset in
the middle of a sampling run or the filter plugs up, should the
sampling continue, should it stop until the situation is corrected, or
should the entire test be terminated?
B. By talking with the appropriate personnel assure that they
know what to do and what data they should record.
C. Examine the equipment as it is being assembled to be sure it
is adequate and properly designed to conduct the test. Ask questions
if you are not sure that a piece of equipment is suitable. Also check
the date of calibration for each piece of equipment to see that it
meets reasonable QA/QC procedures.
-------
D. Check to see that the test team conducts all mandatory leak
checks and that they correct any leaks that exceed the allowable rate.
The efficiency with which these leaks are corrected can give the
observer a reliable assessment of the test team's knowledge of their
equipment and how it works.
E. Periodically, check to see that the test team is recording the
required data in a permanent, accurate, and timely manner.
F. Observe the probe exit (if appropriate) to be sure that
condensation is not occurring in the probe. Place your hand near the
probe sheath to see if the heating system is operating.
G. If critical for measurement accuracy, periodically check all
equipment to be sure that it is level. This is particularly true of
manometers and rotameters.
H. Observe the proficiency of test personnel as they carry out
their duties. For example, see if the samples are being recovered
carefully and quantitatively and that each sample is promptly labelled
with date, type, sample number, etc.
I. Examine the color of particulate samples and see if the
particulate is symmetrically distributed on the filter. If the filter
and particulate are yellow or charcoal black and the probe and filter
were heated above 100°C during testing, this could indicate a cracked
glass probe liner. If the particulate deposit on a flat filter is not
symmetrically distributed but rather seems to flow towards or away from
one edge, this could indicate that the filter was not properly placed
in the filter holder.
-------
APPENDIX A
PERTINENT INFORMATION
THAT SHOULD BE CONTAINED ON THE STANDARD FORMS
-------
APPENDIX A
Check to see that the standardized forms address the following
points:
1. Presurvey form
Objective
Desired test date
Desired report date
Contact personnel and phone numbers
Plant operating schedule/anticipated shutdowns
Sketch of sampling location/points
Velocity profile
Pollutant concentration expected
Composition of stack gas/hazardous materials
Test methods
Special calibrations or equipment required
Physical layout of sampling site, size of test ports
Equipment required to put sampling train in place
Minimum number of samples to be collected
Process points to be monitored and interval of data recording
Identification and physical description of sample recovery
facility
Map of area, motels, etc.
Availability and location of electrical power
2. Test Design/Accomplishment Form (May be included in Presurvey
Form)
Number/type of equipment required for test
Specific test procedure(s) to be used.
Pre-test/post-test equipment maintenance and calibration
How samples are to be shipped, stored, labelled, analyzed, etc.
Sample stability problems
-------
3. Equipment Shipment/Transport Forms (May be included in Test
Design Form)
Type/number of items required and packaged
Calibrations done/calibration factors
Method of shipment
Number of boxes
Field calibrations to be done by test crew
4. Shipping labels for Equipment
Clear identification of contents
Destination
5. Field Testing Form
Sampling equipment ID number/calibration factor
Stack velocity, profile, temperature
Name/function of persons conducting the test
Plant name/date of test
Special instructions to analyst
How samples were stored, shipped, etc.
Field calibrations
6. Sample Analysis Form (may be a laboratory notebook)
Analytical method used
Equipment calibration/standardization done
Integrity of samples as received
Calculation methods with at least one sample of each type of
calculation
Any special conditions that occurred that might have affected
the quality or validity of the data
Analyst(s) name, date of analysis
-------
7. Sample ID Forms (Labels)
Sample number
Plant name
Project number
Date collected
Sample type
Special storage/transport conditions
Sample volume
Name or initials of person taking sample
-------
TECHNICAL REPORT DATA
(Please read Instructions on the reverse before completing)
1. REPORT NO.
2.
3. RECIPIENT'S ACCESSION NO.
4. TITLE AND SUBTITLE
TECHNIQUES TO DETERMINE A COMPANY'S ABILITY TO CONDUCT
A QUALITY STACK TEST
5. REPORT DATE
6. PERFORMING ORGANIZATION CODE
7. AUTHOR(S)
William J. Mitchell, USEPA
Eva Estes, Research Triangle Institute
8. PERFORMING ORGANIZATION REPORT NO.
9. PERFORMING ORGANIZATION NAME AND ADDRESS
Research Triangle Institute
P.O. Box 12194
Research Triangle Park, NC 27709
10. PROGRAM ELEMENT NO.
11. CONTRACT/GRANT NO.
12. SPONSORING AGENCY NAME AND ADDRESS
13. TYPE OF REPORT AND PERIOD COVERED
Environmental Monitoring Systems Laboratory
Office of Research and Development
U.S. Environmental Protection Agency
Research Triangle Park, NC 27711
14. SPONSORING AGENCY CODE
EPA 600/08
15. SUPPLEMENTARY NOTES
TECHNICAL ASSISTANCE DOCUMENT
16. ABSTRACT
Techniques to determine a testing company's ability to conduct a quality stack test
for compliance or process engineering purposes are presented. The document has four
sections. The first identifies characteristics commonly associated with laboratories
that have a history of performing high quality stack tests. The second part suggests
questions that can be asked of candidate companies during the selection process and
presents process information that should be given to them so that they may be evalu-
ated fairly. The third part presents techniques the evaluator can use at a company's
laboratory during a pretest selection visit. The last part presents techniques that
can be used during and after the actual test to determine the performance of the
testing company and to estimate the quality of their test results.
17.
KEY WORDS AND DOCUMENT ANALYSIS
DESCRIPTORS
b. IDENTIFIERS/OPEN ENDED TERMS c. COSATI Field/Group
Stack Testing
Quality Assurance Techniques
18. DISTRIBUTION STATEMENT
RELEASE TO PUBLIC
19. SECURITY CLASS (This Report)
Unclassified
20. SECURITY CLASS (This page)
Unclassified
EPA Form 2220-1 (Rev. 4-77) PREVIOUS EDITION IS OBSOLETE
------- |