United States Environmental Protection Agency
Office of Air Quality Planning and Standards
Research Triangle Park NC 27711
EPA-340/1-83-009
January 1983
Stationary Source Compliance Series
Guidelines
for the Observation
of Performance
Specification Tests
of Continuous
Emission Monitors
-------
EPA-340/1-83-009
Guidelines for the
Observation of Performance Specification Tests
of Continuous Emission Monitors
Prepared by:
Entropy Environmentalists, Inc.
P.O. Box 12291
Research Triangle Park
North Carolina 27709
Prepared for:
Anthony Wayne
Region VII
and
Louis R. Paley
Stationary Source Compliance Division
United States Environmental Protection Agency
SSCD Contract No. 68-01-6317
U.S. ENVIRONMENTAL PROTECTION AGENCY
Office of Air Quality Planning and Standards
Stationary Source Compliance Division
Washington, D.C. 20460
January 1983
-------
The Stationary Source Compliance series of reports is issued by the
Office of Air Quality Planning and Standards, U.S. Environmental
Protection Agency, to assist Regional Offices in activities related to
compliance with implementation plans, new source emission standards,
and hazardous emission standards to be developed under the Clean Air
Act. Copies of Stationary Source Compliance Reports are available -
as supplies permit - from Library Services, U.S. Environmental
Protection Agency, MD-35, Research Triangle Park, North Carolina
27711, or may be obtained, for a nominal cost, from the National
Technical Information Service, 5285 Port Royal Road, Springfield,
Virginia 22151.
This report has been reviewed by the Office of Air Quality Planning
and Standards, U.S. Environmental Protection Agency, and approved for
publication as received from Entropy Environmentalists, Inc. Approval
does not signify that the contents necessarily reflect the views and
policies of the U.S. Environmental Protection Agency, nor does mention
of trade names or commercial products constitute endorsement or
recommendation for use.
ii
-------
ABSTRACT
When stationary source owners or operators plan to conduct Performance
Specification Tests of installed continuous emission monitors, they are required
to notify the applicable control agency. The agency should then appoint a
representative to observe the tests.
This document contains general guidelines for the agency observer on the
performance of pretest negotiations, on-site observations, and the preparation
of his report. Also, an observer's checklist for these activities is included.
iii
-------
TABLE OF CONTENTS
1.0 Introduction 1
2.0 Pretest Activities 3
3.0 Onsite Observation 7
4.0 Observer's Report 11
Appendix 15
-------
GUIDELINES FOR THE
OBSERVATION OF CONTINUOUS EMISSION MONITOR
PERFORMANCE SPECIFICATION TESTS
1.0 Introduction
When stationary source owners or operators conduct Performance
Specification Tests (PSTs) of installed continuous emission monitors (CEMs) as
required by New Source Performance Standards ( NSPS), State Implementation Plans
(SIP), or other local regulations, the agency appoints a representative to
observe the tests. This manual is intended to provide guidelines for the
representative's role and general responsibilities as an observer. Detailed
explanations of the PST and specific instructions and recommendations for the
test report reviewer appear in a separate manual entitled, "Monitor Performance
Tests for Pollutant and Diluent Gas Monitors: Reporting Requirements, Report
Format, and Review Procedures." The observer is referred to this manual and to
the appropriate Federal and State regulations for detailed technical information
concerning the tests.
The source owner or operator is responsible for planning and conducting the
PST. In most cases, a consultant is hired by the owner or operator to perform
the test. Typically, this consultant is designated as an authorized
representative of the source and, throughout this document, is referred to as
"tester" and "test consultant."
The functions of the agency observer are to represent the interests of the
agency during the planning and performance of the PST, and to provide
appropriate documentation of the PST for the test report reviewer. The observer
must ensure that the PST is performed according to the procedures delineated in
-------
the Performance Specifications (40 CFR 60, Appendix B). In doing so, the
observer must work cooperatively with the source and with the test consultant.
He must be specific and forthright in his requests, while being respectful of
the positions of the other parties involved. Antagonistic or adversarial
approaches are usually self-defeating in these situations. Cooperation is
in the best interest of all parties and gives the test the greatest chance of success.
The activities of the agency observer may be divided into three phases:
pretest activities, onsite observation, and preparation of the observer's
report. These activities are discussed in the following sections.
It is strongly recommended that all of the observer's activities be
conducted by the same individual; when more than one person is involved, all
persons representing the agency must work closely together to ensure an
effective agency role in the PST.
The appendix of this document contains a detailed checklist for the PST
observer. Separate checklists for preliminary communications, pretest meetings,
and onsite observation are included. The observer should review these
checklists and use them appropriately to document all phases of the test
sequence.
-------
2.0 Pretest Activities
The source owner or operator is required to notify the agency of his
intention to conduct the PST. (NSPS sources are required to give 30 days
notice; requirements for SIP sources may vary.) The notice indicates that
arrangements for the test are underway and that the source is prepared to
schedule the test. Once source notification has been received, an agency
representative should be designated to observe the test.
Initially, the observer should establish contact with the source owner or
operator and inform him that he plans to observe the test. The source is then
responsible for notifying the observer of all PST activities (i.e., test
schedule, pretest meeting, etc.). This initial agency-source contact
establishes the lines of communication among all parties from the outset of the
test sequence. At this time, the observer should query the source owner or
operator for information pertaining to the source (e.g., identification and
location, plant and monitor design and operating parameters, pollutants to be
measured, sampling locations, data from previous PST, etc.), and pertaining to
the tester (e.g., identification and location, sampling equipment and laboratory
information, test methods to be used, etc.).
After the source is contacted, the observer should familiarize himself with
the operating principles of the installed CEM, the source process and
anticipated operating conditions, and the requisite test methodology. The
observer should also review available information from control agency records of
the monitor (and/or relevant source emission tests). He should discuss
procedural details not specified in the Reference Methods and Performance
Specifications with the source and the tester, and achieve an
agency-source-tester concurrence prior to initiation of the test.
-------
The observer should attend any pretest meeting held by the source. Although
there is no requirement for pretest meetings, these meetings and other informal
gatherings of source, testing, and agency personnel provide opportunities for
clarification and negotiation of issues, and for the dissemination of
information about the test protocol and its possible problems. Typically,
pretest meetings are scheduled for the convenience of the source and the tester;
however, most sources are quite willing to accommodate the observer's schedule
provided that they know in advance that he plans to attend and that a schedule
satisfactory to all three parties can be arranged.
The source and/or the tester must formulate a test protocol and submit it
to the observer for review and approval before the initiation of the PST. The
observer must make clear any special requests he has for the tester or the
source at this time. He should also review and approve (or disapprove) any
modifications to the standard procedures or other procedures not prescribed in
the Performance Specifications proposed by the tester. The observer should
include all draft versions and the finalized protocol in his report.
The observer must know the limits of his authority. Undoubtedly, the test
observer will be called upon to determine the acceptability of test procedures,
sampling methodology, etc. If the observer does not have authority for such
decisions, he must be able to contact an agency representative who has such
authority. Failure to establish this line of authority in advance will hinder
the satisfactory completion of the test program, which is the goal of the
source, the tester, and the agency.
The test schedule should be finalized during the pretest negotiations.
However, this test schedule is dependent upon many variable factors: boiler
operation, monitor operation, process conditions, etc. Testing cannot begin
-------
until satisfactory conditions are achieved simultaneously for all of these
variables, and despite the most careful planning, delays in the test schedule
are common. The tester and the observer must therefore be ready when appropriate
conditions are achieved. Once the PST has commenced, test problems may cause
additional delays. To complete the tests, the tester may continue his work well
into the evening and for most of the night, provided that proper lighting is
available. The observer should consult with the tester during the pretest
negotiations as to his firm's policy regarding the length of the test day.
Thus, the observer must be flexible and accommodate the tester's schedule.
If additional testing, modified sampling procedures, or additional quality
assurance procedures, etc. are necessary to accommodate particular regulatory
provisions, monitor-specific problems, and/or unusual source-specific
circumstances, the observer must inform the source and tester of any such
deviations from standard procedure as early as possible. Failure to do so may
compromise the test results and create problems.
Before the PST, the observer should review the "Onsite Observation" portion
of the observer's checklist contained in the appendix of this document. The
checklist is designed to apply to all types of continuous monitor PSTs. For the
checklist to be useful, the observer must be familiar with the portions of it
that apply to his particular situation, and he should note in advance those
portions of the checklist that must be completed.
The checklist is intended to facilitate the job of the observer; it is not
a complete, self-contained report in itself. Although its length may seem
cumbersome, it provides a format for the recording of information the observer
may wish to use. The observer himself, however, must finally decide which
information is important and which is not.
-------
In conclusion, responsibilities and activities of the agency and the
observer during the pretest period may be summarized as follows:
1. The agency should designate an observer upon receiving notice from the
source.
2. The observer should contact the source and obtain pertinent
information.
3. The observer should review all available pertinent information, both on
the source and on the PST methodology.
4. The observer, as agency representative, should clarify his authority to
the source and tester. If the necessary authority resides outside the
observer, contingency plans should be made for contacting the agency
representative who is authorized to make decisions.
5. The observer should attend the pretest meeting, if possible.
6. Before the initiation of the PST, the observer should review the test
protocol, negotiate necessary changes, and approve it.
7. The observer should obtain the test schedule and make appropriate plans
to attend.
8. The observer should review the observer's checklist and utilize it for
the onsite observation.
-------
3.0 Onsite Observation
The observation of the PST is perhaps the most important of the observer's
responsibilities following the establishment of the test protocol. The
observer must perform all preparatory activities to maximize the effectiveness
of his observation. He must complete the gathering of preliminary information
prior to the initiation of testing so that he can give his full attention to
the performance of the test.
The observer should arrive onsite on time for any pretest activities and
should attend any informal meetings with the tester and the source immediately
before the test to settle last minute details. He may be asked to observe and
approve non-standard procedures to be employed during the test. Consequently,
the observer should allow ample time for these activities and for any other
pretest inspections he may wish to perform. The observer should obtain and
examine: (1) the results of any tests for which the monitor manufacturer is
allowed to certify conformance with the applicable specifications (i.e., design
specification tests for transmissometers, some calibration error checks for
some gas monitors), (2) the results of calibration gas analysis and/or
manufacturer's certification of calibration gas cell values, and (3) the
required pretest calibration results for the tester's sampling equipment.
Once testing commences, the observer should watch all aspects of the test,
including test procedures, monitor operation, and source operating conditions.
It is unnecessary and time consuming for the observer to make his own separate
records of the data; he can observe the tester's record and initial the data
sheets to ensure that the data recorded are the data reported. The observer
must make judicious use of the checklist to note his observations; however,
time spent completing the checklist is time not spent observing. Therefore,
-------
the observer is advised to review the checklist in advance of the test and to
mark the appropriate information to be completed; if this is not done, the
observer could spend the majority of the test period working on the checklist.
The observer should be considerate of the tester during the PST; the
tester's attention will be focused on getting the job done, and during some
phases of the testing, distractions could jeopardize his work. When the
observer has questions for the tester, he should wait until an appropriate time
to ask them, unless it is imperative that the questions be resolved before
testing proceeds. Also, the observer should be respectful of the source's and
tester's equipment. Most sampling equipment is quite expensive, and much of it
is easily damaged. In general, a "hands-off" posture is the safest approach.
Regardless of the care taken to resolve all procedural questions in
advance, unforeseen situations are encountered frequently during the actual
test period. In such instances, the observer must approve or disapprove a
source's or tester's proposed solution on the spot. If he does not have the
authority to make such decisions, he should contact an agency representative
having the necessary authority. Any delays at this critical stage will result
in wasting valuable time and money.
At the conclusion of the test, the observer may wish to inspect the
tester's data sheets and initial them. Such a request should be made in the
spirit of cooperation. Indeed, the tester may wish to receive feedback from
the observer, and often such an interchange can prove mutually beneficial. The
observer should remember that his ultimate goal is to obtain reliable PST data
for the agency, and he should be willing to share his subjective and objective
observations and information.
-------
In conclusion, the observer's activities during the test may be summarized
as follows:
1. The observer should arrive onsite well in advance of the
scheduled beginning of the test to make final preparations for
accurate observation of both source and tester operations.
2. The observer should review data and/or test results from any
manufacturer's certification of conformance with specifications
and of any calibration gas analysis.
3. The observer should devote his primary energies to a careful
observation of the test procedures.
4. The observer should be considerate of the source and the tester
and work with them to achieve the desired results.
5. The observer should be prepared to rule on problems that may
arise during the course of the test.
6. The observer should confer with the tester at the conclusion of
the test to exchange appropriate information and to discuss the
data quality and possible deficiencies.
-------
4.0 Observer's Report
At the conclusion of the test program, the observer must prepare a report
containing all pertinent information from his observation activities. The
observer's report aids the test report reviewer. Although the soundest and
most expedient practice is for the test observer himself to review the test report, this is
not always feasible. Therefore, the observer should prepare a complete report
that can serve as an independent documentation of the test.
The outline below suggests an organization for an observer's report.
While the observer will finally determine the most appropriate report format,
the report must document thoroughly the activities of the observer, the source,
and the tester to be useful to the reviewer. Although recording the test
details is tedious, it is necessary to enable the reviewer to evaluate the test
report accurately. The observer must make notations of problems encountered,
and also state the acceptability of procedures. The observer should be specific
and concise, regardless of the report format chosen.
11
-------
Observer's Report
I. Background Information
- Source identification (station, unit)
- Tester identification
- Observer identification
- Test date
- Monitor description and location
- Pertinent source operational parameters
- Other useful background information
II. Pretest Communications
- Summary of pretest meeting, if one was held
- Copies of all protocols proposed during pretest negotiations
- Complete copy of finalized protocol
- Notations on communications among all parties before testing
- Notations of any deviations in standard methodology approved by observer
- Notations of any special tests or additional procedures recommended by
observer
III. Onsite Observations
- Observer's checklist and data from tests, well annotated with pertinent
information and notations
- Additional descriptions of checklist items meriting further discussion
- General notes from onsite observations not found on the checklist
- Notes on information-sharing activities conducted at conclusion of test
program (i.e., did the observer initial the tester's data sheets; did
the tester have access to the observer's notes; etc.)
12
-------
IV. Summary and Conclusions
(The observer may wish to present this information at the beginning,
rather than at the end, of his report.)
- Concise listing of any observations or information that may affect
test results.
- Conclusions on the part of the observer as to the validity of the
test procedures employed and the general successfulness of the
test.
13
-------
APPENDIX
-------
CONTINUOUS EMISSIONS MONITOR (CEM)
PERFORMANCE SPECIFICATION TEST (PST)
OBSERVATION REPORT
Plant:
Unit:
Date of PST:
Observer:
-------
CONTENTS:
TEST SUMMARY FORM i
INSTRUCTIONS ii
PRELIMINARY COMMUNICATION SUMMARY 1-A
PRETEST MEETING SUMMARY 1-B
ONSITE OBSERVATION FORMS 1-C
GENERAL SITE INFORMATION 1-C
PERFORMANCE SPECIFICATION 1 3-C
PERFORMANCE SPECIFICATION 2 4-C
CALIBRATION ERROR 4-C
RESPONSE TIME 5-C
24-HOUR DRIFT 6-C
2-HOUR DRIFT 7-C
RELATIVE ACCURACY 8-C
METHOD 6 8-C
METHOD 4 12-C
METHOD 7 14-C
PERFORMANCE SPECIFICATION 3 17-C
RESPONSE TIME 18-C
24-HOUR DRIFT 19-C
2-HOUR DRIFT 20-C
GENERAL COMMENTS 21-C
-------
CONTINUOUS EMISSIONS MONITORING PERFORMANCE SPECIFICATION TEST
OBSERVATION REPORT
PLANT NAME:
UNIT TESTED:
LOCATION:
DATE OF PST:
OBSERVED BY:
SUMMARY OF TEST PROCEDURES

Test                       SO2      Opacity      Moisture
Reference method
Calibration error
Response time
2-h drift
24-h drift
Calibration check
Other:
REPORT CONTENTS
Preliminary communications summary
Pretest meeting report or notes
Observer's data sheets
Supplemental observation report
Other:
Code
1. Acceptable
2. Unacceptable
3. Indeterminate
4. Not observed
5. Not performed
-------
CONTINUOUS EMISSION MONITOR (CEM)
OBSERVER'S FORMS
INSTRUCTIONS FOR OBSERVER:
PART 1 (Pages 1-A through 4-A): Preliminary Communication Summary
Part 1, the Preliminary Communication Summary, is to be filled in by the source. Detach
pages 1-A through 5-A and fill in the name and address of the person or agency to whom you wish
the form returned in the space provided at the beginning of the first page. Forward the Communica-
tion Form to the source and request that he/she complete and return it as soon as possible.
PART 2 (Pages 1-B through 5-B): Pretest Meeting Summary
Part 2, the Pretest Meeting Summary, which begins immediately after the Communication
Summary and continues through page 8-B, is intended to be an outline of the information that
should be collected prior to the actual testing. This section is to be filled in by the observer during
the pretest meeting.
PART 3 (Pages 1-C to End): Observer's Data Sheets
Part 3, the Observer's Data Sheets, is to be filled in by the observer during the actual test
period. The form is designed so that it may be used for general PST observations of several different
types of sources, and some of the questions may not be applicable to a specific source or observa-
tion. It is not expected that the observer will be able to collect all of the information outlined on
the data sheets. Record only that information that corresponds to activities actually observed.
-------
CONTINUOUS EMISSIONS MONITORING (CEM) PERFORMANCE SPECIFICATION TEST
PRELIMINARY COMMUNICATION SUMMARY
*COMPLETE AND RETURN TO:
(OBSERVER)
Phone contact date:
Mailing date:
Return date:
THE FOLLOWING INFORMATION IS TO BE PROVIDED BY PLANT PERSONNEL AND THE FORM RETURNED TO THE
OBSERVER:
1. Plant name:
2. Mailing address:
3. Site location:
4. Source(s) to be tested:
5. Plant contacts:
Plant manager:                          Phone:
Environmental coordinator:              Phone:
CEM contact:                            Phone:
Test team leader:                       Phone:
*Communication form (pages 1-A through 4-A) to be completed and returned 30 days prior to pretest meeting.
1-A
-------
PRELIMINARY COMMUNICATION SUMMARY
6. Process description:
A. Raw materials and/or fuel used:
B. End product(s):
C. Maximum production capacity:
Normal emission operating range: SO2:
D. Other:
7. Dates for:
A. Initial startup
B. Maximum production rate reached
C. 60 days after maximum production reached
D. 180 days after startup
E. Preevaluation/pretest meeting
F. Conditioning period for:
1. Opacity
2. SO2
3. NOx
4. Diluent (O2, CO2)
5. Other
8. Monitors to be tested:
NOx:
                                      Date Planned/Scheduled      Actual Date
Anticipated conditioning date:
Anticipated performance test dates:
2-A
-------
PRELIMINARY COMMUNICATION SUMMARY
9. Does the test contractor participate in an external audit program?
Yes: No:
If so, are the audit samples (A) Gaseous (cylinders)?
or (B) Liquid SO2/NOx?
If not, is the test contractor willing to participate in an external audit program? Yes: No:
10. Schematic diagram of monitor location:
(Include stack/duct cross-section dimensions, straight runs before and after location (with distances in diameters), and locations
of air in-leakages.)
3-A
-------
PRELIMINARY COMMUNICATION SUMMARY
11. Tests are to be performed by:
Name:
Address:
Phone:
Contacts' names:
Name:
Address:
Phone:
Name:
Address:
Phone:
4-A
-------
PRETEST MEETING SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
General CEM information: Date:
1. Plant:
Is this the initial PST? Yes No
If not, when was the last PST performed?
Describe problems encountered during last PST:
2. Pretest meeting participants:
Name: Affiliation:
3. Performance test procedures:
                    Standard Test              Departures From
                    (Reference Method)         Standard Test
Opacity
SO2
NOx
Diluents
Moisture
CO
1-B
-------
PRETEST MEETING SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
4. Testing
Organization and person performing test*/proposed test date
Calibration error
Response time
Zero/calibration drift-2-h
Zero/calibration drift-24-h
Accuracy
Opacity/Date
Pollutant/Date
Diluent/Date
5. General CEM system characteristics*
                              Opacity    SO2    NOx    Diluent    Other    Other
Manufacturer
Model No.
Serial No.
Dates for:
1. System startup:
2. Zero alignment on cool, clear flue:
3. Anticipated conditioning period:
4. Anticipated operational test period (OTP)1:
Final alignments prior to OTP?
Data recorder type2:
System span value3:
% zero offset:
Daily span-check method4:
Daily zero method5:
Automatic zero compensation?
Measured "wet" or "dry":

*If different from Preliminary Communication Summary, update here.
1 Operational test period of 168 hours.
2 Specify: strip chart, computer, data logger, mag tape, or other.
3 100% of system output.
4 Cell or gas injection.
5 Means for simulating zero value for daily zero check.
2-B
-------
PRETEST MEETING SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
6. Fuel analysis or process feed rate?
7. "F" factor, if appropriate:
8. Calibration gas verification method:
SO2:
Diluents (O2, CO2):
Other (e.g., CO):
9. Is there provision for injecting calibration gas at probe? (If not, explain)
3-B
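For reference, the "F" factor requested in item 7 converts a measured pollutant concentration and diluent reading into a heat-input-based emission rate. The sketch below shows the O2-based dry calculation in the style of EPA Method 19; the concentration and O2 values are assumed for illustration only.

```python
def emission_rate(c_lb_per_dscf, f_dscf_per_mmbtu, pct_o2_dry):
    """Heat-input-based emission rate with an O2 (dry) correction:
    E = Cd * Fd * 20.9 / (20.9 - %O2,dry), giving lb/MMBtu."""
    return c_lb_per_dscf * f_dscf_per_mmbtu * 20.9 / (20.9 - pct_o2_dry)

# Assumed example: 1.0e-6 lb/dscf SO2, the Method 19 Fd factor for
# bituminous coal (9,780 dscf/MMBtu), and 6.0% O2 measured dry.
print(round(emission_rate(1.0e-6, 9780, 6.0), 4))  # 0.0137
```

Other fuels use their own tabulated Fd factors; the 9,780 dscf/MMBtu value above is only one entry from that table.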
-------
PRETEST MEETING SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
11. Do all monitors meet the location and stratification requirements as outlined by the applicable performance specifications?
Yes: No:
This was determined by:
A. Physical location per regulation (40 CFR 60, Appendix B)    Yes:    No:
B. Testing    Yes:    No:
If determined by testing, explain tests and attach results.
12. Does monitor installation site meet vendor operating environment requirements (temperature, humidity, vibration, etc.)?*
13. Describe the relative accuracy reference method sampling system location in relation to the monitor location.*
14. Are there adequate provisions for the reference method test (i.e., access, sampling ports, sufficient work area, suitable work en-
vironment, etc.)?*
15. Major points of action to be resolved before test date: Resolution/Date
'Resolution of these questions may require preliminary onsite inspection.
4-B
-------
PRETEST MEETING SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
10. Calibration gas cylinder or cell information

                                        Container4        Vendor5                   Triple Analysis7
Pollutant1    Range2    Carrier3        (Gas/Cell)        ppm Value    Date6        ppm Value    Date6
SO2           Zero
SO2           Mid
SO2           Span
NOx           Zero
NOx           Mid
NOx           Span
O2/CO2        Zero
O2/CO2        Mid
O2/CO2        Span

1 If NOx, specify NO or NO2; if diluent, circle O2 or CO2.
2 As applicable.
3 Specify carrier gas contained in cylinder or cell.
4 Specify vendor cell, regulated cylinder, instrument air, or other.
5 Supplied by vendor when calibration mixture purchased, or assigned value of calibration cell.
6 Give date on which vendor certification and/or triplicate reference method test performed.
7 Average of three valid reference method tests on cylinders only.
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
GENERAL PERFORMANCE SPECIFICATION INFORMATION
Plant:                OTP Start Date:
OTP Start Time:
1. Observers: Affiliation:
2. Test performed by: (If different from those specified in pretest meeting)
3. Were pretest meeting notes reviewed? Yes: No:
4. Were velocity traverse data taken? Yes: No:
By whom?
When taken?
5. Actual zero and calibration schedule:
1-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
6. Comments:
2-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 1 (OPACITY)

                                                Day 1  Day 2  Day 3  Day 4  Day 5  Day 6  Day 7  Day 8
Day in operational test period sequence
Were initial 24-h readings taken after system was calibrated?
Were zero readings taken before cleaning and adjustment?
Value before zero cleaning and adjustment
Amount of zero compensation before cleaning and adjustment
Value after zero adjustment
Were span readings taken after cleaning and zero adjustment, but prior to span adjustment?
Value before span adjustment
Value after span adjustment

NOTE: Complete checklist for portions of test actually observed.
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2

                                                              SO2    NOx
Manufacturer and model of monitor
What method was used to perform sampling of calibration gas?1
Is a stainless steel regulator used for each calibration gas? If not, explain.2
Are stainless steel and Teflon® calibration gas sampling system components (lines, valves, fittings) used? If not, explain.2
Were calibration gas/cells certified by:
1. The manufacturer?
2. Other? (Please explain.)2
What was the concentration of the span gas/cell?
Is the span value in agreement with the applicable subpart?

1 Attach a sketch of sampling train.
2 Explain in comments section.
CALIBRATION ERROR TEST

                                                              SO2    NOx
1. Was calibration error test performed in the field or in the laboratory?*
2. Were 15 nonconsecutive readings taken?*
3. Was entire continuous monitoring system (probe, sample conditioning system, transport system, etc.) subjected to calibration error test?*
Comments:

*If the answer to either part of 3 is "No", please explain:
4-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2
RESPONSE TIME

                                                              SO2    NOx
What was the concentration of the gas or gas cell used?
Identification number of the tank or gas cell.
For NOx monitors which oxidize NO to NO2, was NO used for response time?
Was entire continuous monitoring system (including sample conditioning system) used? If not, please explain in Comments section.
What was the response time? (seconds)
Comments:
5-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2
ZERO AND CALIBRATION DRIFT-24 H

                                        Day 1     Day 2     Day 3     Day 4     Day 5     Day 6     Day 7     Day 8
                                        SO2 NOx   SO2 NOx   SO2 NOx   SO2 NOx   SO2 NOx   SO2 NOx   SO2 NOx   SO2 NOx
Were zero readings taken at 24-h intervals?
Value before zero adjustment
Value after zero adjustment
Were span readings taken at 24-h intervals?
If the system was extractive, was span gas/cell 80 to 90% of instrument span?
If the system was nonextractive, was span gas/cell 45 to 55% of instrument span?
Were span readings taken after zero adjustment, but prior to span adjustment?
Value before span adjustment
Value after span adjustment

NOTE: Complete checklist for portions of test actually observed.
1. Were adjustments made only at 24-h intervals? Yes: No: If no, explain:
2. What is the manufacturer's recommended procedure for zero and span adjustments?
3. For in situ monitors using incremental addition method, what are the calibration cell values?
4. Comments:
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2
ZERO AND CALIBRATION DRIFT-2 H

                                                              SO2    NOx
Were 2-h drift readings taken so that no adjustments were made?
Were a total of 15 nonconsecutive 2-h drift periods completed?
How many were observed?
If the system was extractive, was span gas/cell 80 to 90% of the instrument span?
If the system was nonextractive, was span gas/cell 45 to 55% of the instrument span?
Comments:
7-C
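The extractive/nonextractive span windows asked about on this form (80 to 90% and 45 to 55% of instrument span, respectively) lend themselves to a mechanical check. The sketch below is a modern illustration rather than part of the original form; the function name and example values are assumptions.

```python
def span_gas_in_window(span_value, instrument_span, extractive=True):
    """Check a span gas/cell value against the drift-test windows noted on
    this form: 80-90% of instrument span for extractive systems,
    45-55% for nonextractive (in situ) systems."""
    fraction = span_value / instrument_span
    low, high = (0.80, 0.90) if extractive else (0.45, 0.55)
    return low <= fraction <= high

# An 850 ppm span gas on a monitor spanning 1000 ppm SO2 (extractive system):
print(span_gas_in_window(850, 1000, extractive=True))   # True
# The same gas would fall outside the window for a nonextractive system:
print(span_gas_in_window(850, 1000, extractive=False))  # False
```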
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2
ACCURACY TEST
REFERENCE METHOD 6
1. How many runs were made?
2. Probe type (material):
3. Were glass ball joints greased?
4. What type of grease?
5. Date of last dry gas meter calibration?
6. Calibration factor of dry gas meter: _
7. Were impingers iced down?
8. Barometric pressure measured? If not, how were the data obtained?
9. Was the first midget impinger filled with 15 ml of 80% isopropanol (100 ml of 80% isopropanol for large impingers)?
10. Were the next two midget impingers filled with 15 ml of 3% H2O2 (100 ml of H2O2 for large impingers)?
11. What leak check procedure was used?
12. Were laboratory QA samples accepted? Yes No
Serial Numbers: (1) (2) (3) (4).
If not accepted, please explain.
8-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2
ACCURACY TEST
METHOD 6 TESTING
NOTE: No more than one SO2 accuracy test may be run per hour.
Run Number
Date
Probe heated?
Probe free of condensate?
Initial leak check run? (Optional)
Was probe purged immediately prior
to each test?
Sample starting time
Flow rate constant? (1 L/min)
5-min readings taken for:
A. Flow rate
B. DGM volume
C. DGM temperature
D. Was temperature leaving last
impinger 68° F or less?
Sample end time
Total volume sampled
Final leak test? (Mandatory)
Sample purge time (Total) > 10 minutes
Was purge air filtered? (Optional)
Concurrent moisture test?
CEM reading
1
2
3
4
5
6
7
8
9
10
11
12
9-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2
ACCURACY TEST
REFERENCE METHOD 6
SAMPLE RECOVERY
Run number
Date
Sample containers clean?
Container type
Glass
Polyethylene
Teflon
Other (Specify)
1st impinger
(IPA discarded)
Last 3 impingers, connecting glassware
rinsed with DI water and
combined in recovery
containers
Containers labelled, sealed, and
content level marked
Container code or ID number
1
2
3
4
5
6
7
8
9
10
11
12
Additional comments: (Include description of general environment of cleanup area)
10-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2
ACCURACY TEST
REFERENCE METHOD 6
SAMPLE DISPOSITION:
1. Were samples titrated onsite? Yes: No:
2. If no, were they analyzed
A) At the base laboratory of the test team; or
B) (Subcontracted to) another laboratory?
3. Who analyzed the samples?
Name of analyst:
Name of firm:
4. Did anyone observe the analyses? Yes: No:
Who:
5. Are the data and/or results available? Yes No:
If yes, from whom?
6. Were audit samples included? Yes: No:
Actual values: SO2
Observed values: SO2
Comments:
11-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2
ACCURACY TEST
REFERENCE METHOD 4
NOTE: Concurrent with Method 6 and/or Method 7
1. Probe type (material):
2. Were glass ball joints greased?
3. What type of grease?
4. Date of last dry gas meter calibration:
5. Calibration factor of dry gas meter:
6. Were impingers iced down?
7. Barometric pressure measured?
If not, how were these data obtained?
8. Was silica gel or Drierite used in last impinger? (Specify)
9. What leak test procedure was used?
10. How was moisture determined?
Volumetrically
or Gravimetrically
12-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2
ACCURACY TEST
REFERENCE METHOD 4
NOTE: Concurrent with Method 6 and/or Method 7
Run Number
Date
Probe heated?
Probe free of condensate?
Initial leak check run? (Optional)
Was probe purged immediately prior
to each test?
Sample starting time
Flow rate constant?
5-min readings:
A. Flow rate
B. DGM volume
C. DGM temperature
D. Was temperature leaving last
impinger 68° F or less?
Sample end time
Total volume sampled
Final leak test? (Mandatory)
1
2
3
4
5
6
7
8
9
10
11
12
13-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2
ACCURACY TEST
REFERENCE METHOD 7
NOTE: No more than one NOX accuracy test may be run per hour
Run Number
1 2 3 4 5 6 7 8 9 10 11 12
Date
Probe material?
Was probe heated?
Barometric pressure taken?
If not, how was this information obtained?
Were ball joints greased?
Type of grease
Flask evacuated to 75 mm Hg
absolute or less? (3 in. Hg)
Leakage rate under vacuum less
than 10 mm Hg in 1 min? (0.4 in. Hg)
System purged before sampling?
Flask pressure and temperature taken
before sampling?
Time required to fill flask
Three flasks shot in three minutes?
Sample flask shaken for 5 min?
CEM reading
14-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2
ACCURACY TEST
REFERENCE METHOD 7
SAMPLE RECOVERY
Run Number
Sample held for 16 h before recovery?
Sample shaken for 2 min before
recovery?
Flask temperature and pressure
recorded before venting?
Sample containers clean?
Container type
Glass
Polyethylene
Teflon
Other (Flask)
Rinsed twice with 5 ml DI water?
pH adjusted to between 9 and 12?
Containers labelled, sealed, and
content level marked?
1
2
3
4
5
6
7
8
9
10
11
12
What was the environment of the cleanup area?
15-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSION MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 2
SAMPLE DISPOSITION:
1. Were samples titrated onsite? Yes: No:
2. If no, were they analyzed
A) At the base laboratory of the test team; or
B) (Subcontracted to) another laboratory?
3. Who analyzed the samples?
Name of analyst:
Name of firm:
4. Did anyone observe the analyses? Yes: No:
Who:
5. Are the data and/or results available? Yes: No:
If yes, from whom?
6. Were audit samples included? Yes: No:
Actual values: NOX
Observed values: NOX
COMMENTS:
16-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 3 (DILUENTS) (O2, CO2)
1. Manufacturer and model of monitor:
2. Serial number of monitor:
3. Gas sampled:
4. Were ORSAT analyses performed to verify diluent monitors?
5. If bags were used to collect ORSAT samples, of what material were the bags made?
6. Was a calibration curve established to verify the expected response curve as described by the analyzer manufacturer?
7. Were calibration gases (if used) injected at probe?
17-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 3
RESPONSE TIME
Diluent
Gas or cell concentration used
Low
Medium
High
Was entire continuous monitoring system used? If not, explain.*
What was the response time?
O2
CO2
*Comments:
18-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 3
ZERO AND CALIBRATION DRIFT-24-H
Were all readings taken at 24 h intervals?
Were zero and span gases or calibration
cells used?
If the system was extractive, was span
gas/cell 80 to 90% of instrument span?
If the system was nonextractive, was span
gas/cell 45 to 55% of instrument span?
Were zero readings taken before adjustment?
Value before zero adjustment.
Value after zero adjustment.
Were span readings taken after zero
adjustment, but prior to span
adjustment?
Value before span adjustment.
Value after span adjustment.
Day 1
Day 2
Day 3
Day 4
Day 5
Day 6
Day 7
Day 8
19-C
NOTE: Complete checklist for portions of test actually observed.
1. Were adjustments made only at 24-h intervals?
Yes: No: If no, explain.
2. What is the manufacturer's recommended procedure for zero and span
adjustment frequency?
3. Comments:
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
PERFORMANCE SPECIFICATION 3
ZERO AND CALIBRATION DRIFT-2-H
Diluent
Were 2-h drift readings taken so that no adjustments were made
during drift period?
Were a total of 15 nonconsecutive 2-h drift periods completed?
How many were observed?
If the system was extractive, was span gas/cell 80 to 90% of
instrument span?
If the system was nonextractive, was span gas/cell 45 to 55%
of instrument span?
O2
CO2
Comments:
20-C
-------
ONSITE TEST SUMMARY
CONTINUOUS EMISSIONS MONITOR OBSERVER'S DATA SHEET
GENERAL COMMENTS ON ENTIRE SAMPLING PROJECT:
21-C
-------
Did observer sign observed data sheets? Yes: No:
Did observer sign strip charts/computer printouts? Yes: No:
Was the test team supervisor given the opportunity to read over this data sheet?
Did he do so?
Test team supervisor signature:
Observer's name:
Title:
Affiliation:
Observer's signature:
Date:
-------
TECHNICAL REPORT DATA
(Please read Instructions on the reverse before completing)
1. REPORT NO. 2.
EPA-340/1-83-009
4. TITLE AND SUBTITLE
Guidelines for the Observation of Performance
Specification Tests of Continuous Emission Monitors
7. AUTHOR(S)
Entropy Environmentalists, Inc.
9. PERFORMING ORGANIZATION NAME AND ADDRESS
Entropy Environmentalists, Inc.
P.O. Box 12291
Research Triangle Park, NC 27709
12. SPONSORING AGENCY NAME AND ADDRESS
OAQPS
Stationary Source Compliance Division
Waterside Mall, 401 M Street, SW
Washington, DC 20460
3. RECIPIENT'S ACCESSION NO.
5. REPORT DATE
January 1983
6. PERFORMING ORGANIZATION CODE
8. PERFORMING ORGANIZATION REPORT NO.
10. PROGRAM ELEMENT NO.
11. CONTRACT/GRANT NO.
68-01-6317
13. TYPE OF REPORT AND PERIOD COVERED
FINAL - IN-HOUSE
14. SPONSORING AGENCY CODE
EPA/200/04
15. SUPPLEMENTARY NOTES
16. ABSTRACT
When stationary source owners or operators plan to conduct Performance Specifi-
cation Tests of installed continuous emission monitors, they are required to
notify the applicable control agency. The agency should then appoint a represen-
tative to observe the tests.
This document contains general guidelines for the agency observer on the performance
of pretest negotiations, on-site observations, and the preparation of his report.
Also, an observer's checklist for these activities is included.
17. KEY WORDS AND DOCUMENT ANALYSIS
a. DESCRIPTORS
Air Pollution
Performance Specification Tests
18. DISTRIBUTION STATEMENT
Release to Public
b. IDENTIFIERS/OPEN ENDED TERMS
Guidelines for Observer
Checklist for Observer
19. SECURITY CLASS (This Report)
unclassified
20. SECURITY CLASS (This page)
unclassified
c. COSATI Field/Group
21. NO. OF PAGES
64
22. PRICE
EPA Form 2220-1 (Rev. 4-77) PREVIOUS EDITION IS OBSOLETE
-------
-------
United States Office of Air Quality Planning and Standards
Environmental Protection Stationary Source Compliance Division
Agency Washington, D C. 20460
Official Business
Penalty for Private Use $300
Publication No. EPA-340/1-83-009
Postage and Fees Paid
Environmental Protection Agency
EPA 335
If your address is incorrect, please change on the above label,
tear off, and return to the above address
If you do not desire to continue receiving this technical report
series, CHECK HERE, tear off label, and return it to the
above address.
------- |