United States Environmental Protection Agency
Office of Water Programs
Washington, DC 20460

1979

             Water
&EPA       Seminar for Analytical
             Methods for Priority
             Pollutants

             Norfolk, Va.
             March 8-9, 1979
             Effluent Guidelines Division

-------
       ENVIRONMENTAL PROTECTION AGENCY




                Seminar For




ANALYTICAL  METHODS FOR PRIORITY POLLUTANTS




                  Held On




            March 8 and 9, 1979




                     at




         Omni  International Hotel




             Norfolk, Virginia

-------
                                 INDEX
March 1979

Speaker                                                     Page

ROBERT B. SCHAFFER,                                           1
EPA, Effluent Guidelines Division

JIM LONGBOTTOM                                                9
EPA, Cincinnati

WALTER SHACKELFORD                                           19
EPA, Athens

PAUL FAHRENTHOLD                                             26
EPA, Headquarters

ROGER NOVAK                                                  44
Gulf South Research Institute

BRUCE COLBY                                                  50
Systems, Science and Software

JIM LICHTENBERG                                              58
EPA, Cincinnati

PANEL DISCUSSION                                             68

GAIL GOLDBERG                                                82
EPA, Headquarters

GEORGE STANKO                                                88
Shell Development

PRISCILLA HOLTZCLAW                                          97
EPA, Headquarters

MIKE CARTER                                                 100
EPA, Headquarters

KATHLEEN THRUN                                              102
Arthur D. Little

FRED BISHOP                                                 109
EPA, Cincinnati

-------
                              INDEX  (Continued)

Speaker                                                     Page

C. HAILE                                                    118
Mid-West Research Institute

DALE R. RUSHNECK                                            123
PJB Laboratories

APPENDIX I                                                  127
BRUCE COLBY

APPENDIX II                                                 144
DALE RUSHNECK

APPENDIX III                                                147
POTW QUALITY CONTROL STUDY

APPENDIX IV
LIST OF ATTENDEES

-------
                             Preface

                      WILLIAM A. TELLIARD
     The Effluent Guidelines Division  of  EPA has been sponsoring
a series of  meetings  to promote  the free exchange  of technical
information among contractors, EPA  personnel,  and various indus-
try groups concerned with  analytical methods for the measurement
of priority pollutants.

     This paper  summarizes  the  proceedings of  a meeting held in
Norfolk, Virginia  in  March  1979.   The meeting  focused  particu-
larly on the measurement of priority pollutants  in POTW's, and on
efforts to establish a measurement protocol for municipal sludge.

-------
                           OPENING REMARKS
                             R. Schaffer
                                USEPA
     Mr. Schaffer:  Good morning.  Welcome to our third meeting
on analytical methods and effluent guidelines.  Since the first
meeting in Denver, and since the beginning of this program effort,
we've made a lot of progress.  We've solved a few problems and we
know we have some additional ones to solve.  We hope that the re-
sults of this meeting and our efforts in the future will do a lot
to solve many of them.

     One thing that has caused some confusion in Washington has
become very apparent to me over the past year.  We are introducing
a new science into the minds of some of the regulators, that being
analytical chemistry.  Many of the people who have been dealing
with the engineering aspects of pollution control, regulation, etc.,
have become familiar enough with the technologies to be able to deal
with them.  Many of our attorneys have learned enough engineering
to communicate with engineers -- many of our engineers have learned
enough lawyering to communicate with the attorneys.  However, nei-
ther of them has learned enough analytical chemistry to communicate
with some of you and vice versa.  Because of that, there's a great
deal of misunderstanding; not only within the agency, but with many
of the folks that we deal with.  That takes a lot of patience and
perseverance to make sure that we all understand one another.  Many
of the folks in Washington are learning, but we will need a great
deal of patience to get the level of understanding up to where it's
necessary.  We are making progress with our effort in developing
effluent guidelines even in the absence of proven analytical meth-
ods.  We will, whether or not we have all of the needed methods,
solve a lot of pollution problems.  We know a lot more about the
problems than we did when we began this effort some three years ago.
We do see that there are solutions to our analytical and regulatory
problems.  It would have been nice if we could, in our guidelines
and standards, have had specific analytical methods to be able to
limit all of the specific materials that we're concerned about.
We don't, so, we've had to proceed without them, but we can still
solve the problem.

     Whether or not our efforts bear fruit in the sense of regu-
lations where you will see some of these methods used to measure
compliance with permit limits, I don't know for sure.  We are
certainly going to be using many of the things that you're working
on, but whether or not in this round of guidelines, we will have
enough information to either specify a particular method, or
limit a particular toxic pollutant, is unclear.  However, that
doesn't mean we can't solve the problem.

     We have invited some other folks to join us in this meeting,
not only the analytical folks, but representatives from other
offices, i.e. the permit program, who will have to use the product
of our labors, and they are very interested in what we are doing.

-------
So, as we proceed we will be getting involved with many more of
these people.  It's very nice to see so many smiling faces,  and
we look forward to a very interesting couple of days.

     Mr. Telliard:  The first speaker is Bruce Colby from Systems,
Science and Software, effectively called S³.

     Mr. Colby:  I'm from a company called Systems, Science and
Software.  We're located in Southern California.  About a year ago
we got involved with the effluent guidelines protocol and we've
encountered a variety of problems with the preparation and anal-
ysis.  The first problem that we encountered is the one I plan to
spend some time talking about here this morning.  It was encoun-
tered when we put our first sample in the purge and trap device.
The first thing that happened when we initiated the purge was that
we had an instrument full of foam.  It was quite disappointing to
us.

     For some of you who may not know what the purge and trap is
all about, (going over to the slide projector), the sample water
goes in a purging tube.  At the bottom there's fritted glass.  As
we purge an inert gas, or reasonably inert gas through that, the
volatile non-soluble organics are entrained within the purging gas,
passed through a Tenax trap, and when the purging is finished,
the valve is turned here, the trap is then thermally desorbed and
back-flushed into the GC/MS, and from there we acquire our data as
if it were just a regular GC/MS run.
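     A rough sketch of the purge-and-trap sequence just described,
written as illustrative Python; the step labels below are this
editor's descriptive assumptions, not instrument commands or
protocol text:

    # Minimal sketch of the purge-and-trap sequence described above.
    PURGE_AND_TRAP_STEPS = [
        # Inert gas bubbles up through the fritted glass at the bottom
        # of the purging tube; volatile, non-soluble organics are
        # entrained in the purge gas.
        "purge sample with inert gas through fritted glass",
        # The purge gas carries the organics onto a Tenax trap.
        "collect organics on Tenax trap",
        # When purging is finished, the valve is switched.
        "turn valve to the desorb position",
        # The trap is thermally desorbed and back-flushed to the GC/MS.
        "thermally desorb trap and back-flush into GC/MS",
        # Data acquisition then proceeds as in a regular GC/MS run.
        "acquire data as a normal GC/MS run",
    ]

    for number, step in enumerate(PURGE_AND_TRAP_STEPS, start=1):
        print(f"{number}. {step}")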

     The next slide shows a comparison of the results that we get
between a normal run, here, where we have very nicely shaped GC
peaks for about a 30 ppb standard solution, and what happens, if
we get just the tiniest bit of foam into the trap.  Apparently,
we're degrading some of the foam thermally in there and we get a
lot of extraneous peaks.  Here is one in particular, and another
one over here  (indicating) and, actually, we seem to lose some
peaks.  I don't know quite what that means yet, but all in all it
doesn't seem to be a very satisfactory thing to happen.  It takes
anywhere from three to 24 hours to clean out the system depending
upon how bad this stuff was that got in there.

     The next slide shows you the kinds of solutions that there
are to this kind of problem.  The first one — I know it makes us
all chuckle a little bit -- but there are some here who know that
that's not really such a laughing matter.  This is a tremendous
problem when you just can't analyze the sample at all, and when
it's being done in a situation where you've got to get it done
by day two and you can't, then you just don't do it.  An alterna-
tive which actually will provide data is to  modify the  sur-
face tension -- add a surfactant like some of the silicone anti-
foaming agents.  There are a variety of them around.  Dow makes
some.  I believe there are a number of other manufacturers that
make them and we've tried those.  They work in our hands moder-
ately well.  They certainly eliminate the foam, but they do
provide some other interesting artifacts in the process.

 *See Appendix  I

-------
     Another thing is something that we came up with which is very
simple — it involves using a heat gun to remove the foam from the
sample tube.  The next slide will give you a little better idea of
what we're talking about.  We simply place the purging tube in a
little heat bath here, water bath, just slightly above ambient, and
then as foam starts to rise up in the tube we take a hair-style heat
gun and direct it at the top of the sample, or the purge tube, and
this basically, I believe, is evaporating the water from the foam
and nothing seems to get up.  We put a little piece of glass tubing
up above so that we can monitor for any water that might be condens-
ing.  If we start to get a lot of water condensing up in this line,
it's time to change that piece of tubing so that we don't get any-
more water in the trap than we have to.  It's not an application of
very high technology, but it seems to be very effective.

     The next slide will give you an idea of the comparison of re-
sults between, again, a pure standard -- this one is at 10 ppb I
believe -- and this one using a silicone anti-foaming agent.  I
believe this is anti-foam A by Dow.  I don't know if it's the best
one, it's just one that I had a chromatogram for at the right con-
centration.  You can see that there are changes in the relative
areas of a lot of the peaks that we see.  Some seem to be increas-
ing, compared to some of the others which are decreasing.  We have
a major artifact problem right here (indicating).  Depending upon
which anti-foaming agent you select, you can change the artifacts
around, at least that's our experience with it.  When we use the
heat gun we have a chromatogram which is basically very similar to
that which we got with a purer standard, except that there are,
again, a few changes in relative areas of the peaks.  The result
now is that yes, we can get the data, we don't ruin the instrument
or make a mess out of it, but what does this do to the quantita-
tion?  It sounds like it might be a fairly harsh thing to do to the
sample.  To investigate this we made up a series of standard solu-
tions.  One in pure water, and one with soap solution added to it
at about one percent in volume.  The standards were prepared over
a range of concentrations from one to a thousand parts per billion
and were separated by a factor of three each.  So, we had one,
three, ten, thirty, and so on up.  We looked at all of the dif-
ferent components in there and prepared a calibration curve, if
you will, and compared the precision with which we could create
the calibration curve from the standard, the pure solution, with
that from the soap solution.

     The next slide will give you the comparison, or part of the
comparison that I'm going to talk about.  It uses something that
I define as an error factor.  I should just say a little bit
about that.  In the protocol we are presumably acquiring data
which falls within a window defined by minus 50 percent on the
low side, and a plus 100 percent on the upper side.  That window,
in effect, is one-half X, X and two X, or two to the minus one
X, X, two X, which is really two to the plus or minus one, or a
factor of two.  What I'm going to do is spend some time talking
about things in terms of, this is within a factor of 1.7, or,
this is within a factor of 1.1, and, basically, if it's within a
factor of 1.1 it is within ten percent on the plus side, and five

-------
percent on the minus side.  This allows us to have a zero point.
We can take averages when it gets bad.  The number gets higher
and that's all this means.  I think I'd like to answer any ques-
tions that there might be concerning this right now, if anyone
has one, because if this doesn't make sense, then none of the rest
of it is going to make sense.
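     A hedged illustration of this error factor in Python; the
function name and the sample values are assumptions for
illustration, not from the talk.  The factor is the larger of
measured/true and true/measured, so the protocol window of minus
50 percent to plus 100 percent corresponds to a factor of two:

    # Sketch of the "error factor": how far a measured value is from
    # the true value, expressed as a symmetric ratio.
    def error_factor(measured: float, true: float) -> float:
        # A perfect measurement gives 1.0; the -50%/+100% protocol
        # window (0.5x to 2x) corresponds to an error factor of 2.0.
        if measured <= 0 or true <= 0:
            raise ValueError("concentrations must be positive")
        ratio = measured / true
        return max(ratio, 1.0 / ratio)

    # Calibration standards from 1 to 1000 ppb, separated by a factor
    # of roughly three each, as described for the soap-solution study.
    standards_ppb = [1, 3, 10, 30, 100, 300, 1000]

    print(error_factor(33.0, 30.0))   # 1.1 -- within a factor of 1.1
    print(error_factor(60.0, 30.0))   # 2.0 -- at the edge of the window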

     Mr. Blum:  A number of years ago, in one of our affiliated
laboratories, somebody tried to use a heat dryer to break foam,
and unrelated to working with wastewater, the glass cracked, the
organics splashed, and there was a lost-time accident.  None
of our safety committees will approve the use of a heat dryer
because it means that the analyst has to hold it, and nobody
wants it to happen again.  In instances where foam arises, the
preferred approach that people have found is to try to have a
type of a heating mantle fabricated to surround the area so
that, at least, if there is a structural defect in the glass,
there is some kind of protection that is afforded the operator.

     Mr. Colby:  That sounds like a great idea.  I think that
perhaps some of the manufacturers who make these tubes might do
well to come up with something like that for us to buy.

     Mr. Stanko:  All of your slides were GC/MS?

     Mr. Colby:  It was GC/MS.  Absolutely.

     Mr. Stanko:  Thank you.

     Mr. Flory:  Did you try any kind of expansion bulb, or were
you using the Tekmar so that it wasn't possible to do modifications
like that?

     Mr. Colby:  We were using the Tekmar LSC-1 and it is pos-
sible to use expansion bulbs with that device.  We had success
using our expansion bulb for samples which did not foam
greatly.  If the sample was very prone to foaming, the expansion
volume required would be very, very large.  We gave up on this
approach for this reason.

     Mr. Flory:  Well, we had the same experience you did.  The
first sample we got foamed all over the place.  We went and built
things and then we never got any more samples that foamed.

     Mr. Colby:  We've continued to get them.

     Mr. Lichtenberg:  Have you made any studies on dilution of
the sample, comparing the data obtained that way with what you
have otherwise?

     Mr. Colby:  We have diluted samples as much as a factor of
ten.  We lose sensitivity for some compounds, so, we're not very

-------
encouraged with this approach.  We're not able to meet the sensi-
tivity requirements for a couple of the compounds unless we use
different quantitation masses than those specified in the protocol.

     Mr. Lichtenberg:  But, have you made any comparisons of data
on what you could acquire?

     Mr. Colby:  No.

     Mr. Chian:  About two years ago we were working on some sam-
ples collected from the laundry detergent industry.  The apparatus
we used was not a modification of the Bellar and Lichtenberg appa-
ratus.  We used the surface impingement principle to strip off the
gas.  It turned out that impinging steam was a good foam breaker.
We were able to run quite high for a while and encountered a severe
foam problem.  This is just one other example.  It may be one can
look into another type of apparatus, but in your case you simply
modify.  There seems to be a more simplified way to do it.

     Mr. Colby:  Did you evaluate the possibility that you were
getting breakthrough of any of the more volatile compounds with
this steam method?

     Mr. Chian:  Well, in terms of breakthrough, yes.  You just
have to use a water dilution technique.  You dilute out your sam-
ple.

     Mr. Mosesman:  Did you find increased purging efficiency for
the compounds when you heated them in your chamber?

     Mr. Colby:  We didn't investigate that at all, so, I'm
afraid I can't comment on that.  I can't see how the sample
itself could be more than four or five degrees above ambient.
The amount of sample which constitutes the foam is really a
fairly small fraction of the overall sample volume.

     Mr. Mosesman:  So, the water itself doesn't get very hot in
the chamber?

     Mr. Colby:  No.

     Mr. Chian:  When you were asking me about the breakthrough,
it was not clear which of the two problems you were concerned
about.  Were you concerned that we were using too much stripping
gas, or that the concentration of the stripped samples was too
high?  Which did you refer to?

     Mr. Colby:  I was concerned about the possibility that some
of the more volatile compounds passed entirely through the trap.

     Mr. Chian:  When we worked on this, we actually had two to
three traps in series to look into this.  The amount of the gas
volume used, the actual encounter, depends upon what compound
you're talking about.  If you're talking about some of those that

-------
are more volatile and poorly retained compounds, of course, you have
some breakthrough there because the Tenax trap is not a good trap for
that.  Other than that, our recovery on the first trap was much
better than 80 percent.  Now,  where you've only got about five to
ten percent in the second trap, it is due to the additional amount
of purging gas used, but if your sample concentration is too high,
then you run into the problem of the breakthrough.  We used dilu-
tion, so, I do not quite follow what you were driving at.

     Mr. Wallin:  Have you, or do you know of anyone, that's
given any consideration to or done any work with the effect of
ultrasound on this problem of foaming?  Would it be possible to
put the purge tube in an ultrasonic water bath?  This might make
the bubbles smaller and break up the micelles of those compounds that
would foam.  Also, it might help get those components out where you have
plus and minus curves.

     Mr. Colby:  I'm not quite sure which way it would go.  We
have considered it, but in light of the fact that ultrasonics
have a tendency to impart a tremendous amount of mechanical agita-
tion, our suspicion was that we would increase the foaming.  How-
ever, we never checked it out.

     Mr. Moberg:  It doesn't work.

     Mr. Wallin:  Okay.

     Mr. Bishop:  We have, in the past, encountered some foaming
problems in dealing with certain municipal wastes.  One of our
chemists developed a foam trap modification which was rather
simple.  He put a section of perhaps a hundred or so capillary
tubes in the foam trap, and it worked beautifully for the  foaming
problems that we encountered.  It broke the foam and allowed con-
ventional purge and trap without any difficulty.  It's cheap and
easy to do; you throw away the group of capillary tubes after-
wards.  It's a pretty nice solution to modest foam problems.   I
don't know that it would solve all of your industrial foam prob-
lems.  They may overwhelm that kind of a trap.

     Mr. Colby:  They do, or some of them do.  It depends on the
sample.

     Mr. Marrs:  Let me cast our vote in favor of using ultra-
sound; at least as a potential.  We use it for the foaming prob-
lems we have with the classic oil and grease analyses, and I
think it might be able to work.  The only thing is, ultrasound
is used in some wastewater treatments, so you might worry  too
about the energy you're imparting in there, and whether you'd be
actually destroying compounds, or not.

     Mr. Moberg:  We used an ultrasound gun and did not find
that the smaller foam bubbles helped at all.  They went  right
through our little spray trap and all.  That is what I meant by
being negative about ultrasound.  However, we did improve  the
collection efficiency by making a very small ice coil trap around
the Tenax column.  That seemed to retain methyl chloride,  vinyl

-------
chloride, and a few others that have a tendency to go right on
through the Tenax.  I might also say that some anomalies develop
in your GC/MS data where you see two peaks for methyl chloride,
or vinyl chloride, or some of the other very volatile compounds.
We suspected that -- before we started to cool the Tenax column
-- the materials would pass down through the collector tube, and
as they got onto the analytical column, before we started our
analysis, they were moving half-way down through the column,
and, in effect, we were destroying our chromatographic separa-
tions.  We would see these high volatiles in two different, or
three different places, which was impossible by the theories of
chromatography.  We suspected that what we had was the material
moving down through the column before we did the analysis as a
result of the Tenax doing a poor job in collection down the tube.
This cooling coil around the Tenax column helped immensely to
keep them in one location before we started to heat and purge.

     Mr. Marrs:  That's interesting, Bud.  I was of the opinion
that the purge and trap device that we have vented not directly
onto the column during the purge — through the trap and onto the
column -- but into the room.  If you have a different piece of
equipment, that's an interesting kind of a problem that we've
never encountered.

     Mr. Moberg:  We've modified the Tenax column somewhat.  It's
a short, eight inch tube, and we were able to slip a little jacket
over that.  We tried quite a few variations, and where we had foam-
ing, or water problems I would like to say, we had to purge longer
or shorter times.  We had either losses, or changes of retention
time of the high volatiles, and it appeared that as we moved from
the collector Tenax tube into the column, the locations were crit-
ical.  That's all I could think of.  Once we stopped them in a
very short zone on the Tenax column, we were more effective and
got good reproducibility in our peaks.

     Mr. Marrs:  Is that a commercial unit that you're using, then,
that you've changed?

     Mr. Moberg:  Well, we have quite a penchant for not using com-
mercial things if we can use our own little devices.  Even though
we purchase commercial devices, we tend to modify them after pur-
chase.  We find it is often more effective to construct our own.
We have a little spray trap above our purge tube, and then direct-
ly above that is our Tenax column with just a snap fitting.

     Mr. Marrs:  Then, you're not back-flushing your Tenax?

     Mr. Moberg:  We do not back-flush the Tenax...

     Mr. Marrs:  Your Tenax is...

     Mr. Moberg:  I'm sorry, we back-flush it as we go onto the
column -- the analytical column.

-------
     Mr. Marrs:  Our retention times have reproduced very, very
nicely — we're using a Tekmar — and it's been tremendously
reliable from the retention time standpoint.

     Mr. Telliard:  Is there anyone else?  Thank you, Bruce.
Bruce's research was sponsored and funded by himself.  Every meet-
ing we take this point in the program to give Jim Longbottom an
opportunity to come tell us about his contracts.  Jim Longbottom,
for those of you who don't know him, is with our EMSL, in Cincin-
nati, and is the project officer for our methods development con-
tracts -- twelve of them.  He's going to try to give us a status
report and a preliminary look-see.
                                8

-------
                       METHODS DEVELOPMENT
                          J. Longbottom
                              USEPA
     Mr. Longbottom:  I passed out a single sheet earlier.  I know
I'm not supposed to have any left over, so somebody doesn't have
one.  Did anybody not get it?  This table (Table 1) was put together
as an alternative to showing slides.  I hope it works out a little
bit better.  There's a lot of information missing off of it that
has probably been confusing if you've been trying to look at it.
This is a summary of the methods that we're writing up now and pre-
paring for inter-laboratory studies, round-robinning to gather accu-
racy and precision levels.  We wound up with 13 procedures to cover
the 112 organic priority pollutants.  I'd like to step through the
procedures and tell you the technology that we are proposing here.
I also want to tell you about the delivery schedules that we have
for distributing the methods for comment and for proposal -- under
304(h) of the Clean Water Act — to provide methods for monitoring
the priority pollutants after they appear on permits.  I'd like to
start down the list and discuss the methods.

     The method 601 is a conventional purge and trap procedure with
a Hall detector, or equivalent, for the halogenated priority pollu-
tants.  When we propose a method, it's our best judgement as a com-
promise between the level of equipment that is necessary to provide
the accuracy and precision with a minimum amount of interferences,
and what that technology will cost as based on our observations of
what is likely to be encountered in wastewaters.  I'm sure every-
one is familiar with the purge and trap with the Hall detector.
Very few interferences are encountered because there just are not
that many halogenated materials in that volatility range.

     Method 602 is a variation of the purge and trap with a dif-
ferent column, to test specifically for benzene, toluene, and
ethylbenzene.  I have a photo-ionization detector down there as
the approach to overcome the interferences that we might get from
hydrocarbons in the same volatility range.  It's an example of the
compromises that we make between equipment investment and cleanup
steps.  Instead of using a flame ionization detector on the purge
and trap,  to achieve the sensitivity levels with a minimum amount
of interferences, we proposed a photo ionization detector to mini-
mize the number of peaks,  since you do not have very many cleanup
options available in a purge and trap type of methodology; in fact,
none.

     The third method is a high temperature purge for those mate-
rials -- acrolein, acrylonitrile -- that are not purged efficiently.
Compounds for which, at room temperature, you'd get a two percent
to five percent purging efficiency, and have subsequent wobble on
your results that would not be up to our standards of precision for
a test procedure.  So, we elevate the temperature and increase the
purging efficiency up to around 50 percent or so for those two com-
pounds, and we use a flame ionization detector.
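     To make the precision point concrete, here is a small
illustrative calculation in Python; the plus-or-minus one
percentage-point fluctuation is an assumed figure, but the 2-5
percent and roughly 50 percent efficiencies come from the talk.  A
fixed absolute wobble in purge efficiency is a much larger relative
error when the efficiency itself is small:

    # Illustrative only: why a 2-5% purge efficiency gives "wobble"
    # that a ~50% efficiency does not.
    def relative_error(efficiency: float, fluctuation: float = 0.01) -> float:
        # Relative error in recovered analyte for a given absolute
        # fluctuation in purge efficiency.
        return fluctuation / efficiency

    for eff in (0.02, 0.05, 0.50):
        print(f"efficiency {eff:.0%}: +/-{relative_error(eff):.0%} relative error")
    # efficiency 2%: +/-50% relative error
    # efficiency 5%: +/-20% relative error
    # efficiency 50%: +/-2% relative error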

-------
     Method 604 is for the 11 phenolics that are on the priority
pollutant list, and from here on down we're talking about extrac-
tion procedures.  The information that has been omitted from Table
1 is that all of the remaining priority pollutants are extracted
with methylene chloride as the solvent of choice.  The solvent
listed in the table is the solvent that replaces the methylene
chloride for the GC detector, or for the cleanup that is provided.
In all of the studies that our contractors performed, with the ex-
ception of the benzidines, methylene chloride emerged as the pre-
ferred solvent.  What we have is an extraction at different pH's
-- for acids, bases, and neutrals — but all the neutrals could
be co-extracted with methylene chloride.  So, in talking about
phenols, it's a methylene chloride extraction, at pH 2, followed
by flame ionization detection.  Then, if we have interferences
that we can't cope with, we have an approach to form an EC sensi-
tive derivative.  The fact that we are forming such a highly sen-
sitive derivative allows us to effectively dilute out the inter-
ferences and take advantage of the sensitivity of the electron
capture with a pentafluorobenzylbromide derivative, and overcome
interferences that way.  Then, after we neutralize the activity
of the phenol by forming the derivative, we have additional clean-
up options available to us, such as a silica gel column.  This
has been a difficult category for us and the interferences that
we've encountered have been fairly surprising.

     The method 605 is the method for benzidine, dichlorobenzidine,
and the diphenylhydrazine, although actually it's now just the
benzidine and the dichlorobenzidine.  Battelle developed a liquid
chromatographic approach with an electro-chemical detector.  This
is not a methylene chloride extraction.  It's an extraction per-
formed with ethyl acetate.  There is a cleanup -- a solvent back
extraction partition approach with sulfuric acid.  The method then
proceeds to an electro-conductivity detector on the liquid chroma-
tograph after exchanging the solvent into the methanol acetate
buffer, mobile phase for the LC column.  It's an exotic approach
for a Cincinnati method.  We don't anticipate that the occurrences
of benzidine will place a great demand on the method.  We've been
playing with some wastewaters from a benzidine dye manufacturer.
This appears to be the best approach that we can come up with for
those types of samples.  There are problems in thermal decomposi-
tion that we've encountered with methylene chloride when we concen-
trate an extract containing benzidine.  We do a base-neutral parti-
tion to concentrate the materials a bit, and then use the liquid
chromatographic approach.

     The method 606 is a rather straightforward procedure for
phthalates by gas chromatography.  We have listed electron capture
and flame ionization detection.  We certainly do not need that  sen-
sitivity with the electron capture, but the EC is somewhat prefer-
ential toward the phthalates, and gives a fair semi-selective
detection; that is, it's a little bit better than the flame ioni-
zation approach.  With the FID you'd get more interferences, so
we marginally prefer the electron capture detector.  We exchange
into hexane after extracting with methylene chloride for the
                              10

-------
electron capture detector.  We have two cleanup procedures that
are attached as options to the method -- the florisil column,  and
alumina column.

     The method 607, Nitrosamines, is a gas chromatographic pro-
cedure.  We're using a nitrogen phosphorous detector for that
method.  We exchange the methylene chloride with acetone for the
alkali flame detector.  Methylene chloride forms hydrochloric acid
in the flame which causes a broad baseline disturbance after the
solvent elutes.  Dimethyl nitrosamine, is such an early eluter that
there's a problem with the tailing solvent, so we have substituted
acetone to overcome that problem.  The diphenylnitrosamine, which
is in this category, is separated from diphenylamine by a
cleanup procedure — an acid wash, 10 percent HCL in this case, to
remove the diphenylamine -- followed by a column cleanup in the
case that partition isn't completely successful.  We strip out
the amine and then chromatograph the diphenylnitrosamine on the
GC, allow it to convert to the diphenylamine, and measure it as
the amine.

     Method 608 is the pesticide procedure very similar to the
current 304(h) procedure for pesticides, except it is a methylene
chloride extraction followed by a solvent exchange to hexane, a
florisil column cleanup option, and the use of elemental mercury
to remove sulfur interferences if they are encountered.

     Method 609 was a grab bag category including two dinitroto-
luenes, nitrobenzene, and isophorone.  The contractor, Battelle,
ended up splitting the four compounds, doing the dinitrotoluenes
by electron capture and nitrobenzene and isophorone by flame
ionization detector after solvent exchange to toluene.  Battelle
also developed a florisil cleanup for that particular category
of compounds.

     Method 610 is the approach that we discussed at Savannah,
with a liquid chromatographic procedure to separate the particular
PAH's that comprise that part of the priority pollutant list.  We
use both an UV detector and a fluorescence detector.  We have the
option to use a silica gel cleanup procedure after exchange from
methylene chloride to cyclohexane.  After the cleanup, if it is
required, we use acetonitrile  for the LC separation.

     Method 611 is for the haloethers.  It's a GC procedure.  After
solvent exchange to hexane we prefer  the Hall organo-halide detec-
tor.  We have a florisil cleanup procedure very similar to conven-
tional pesticide technology.

     Method 612 is for a family of chlorinated hydrocarbons.  These
are the hexachlorocyclopentadiene and butadiene type volatility
range compounds.  They are extracted with methylene chloride and
exchanged into hexane.  They are polychlorinated, so electron cap-
ture detector  is appropriate and we have a florisil cleanup pro-
cedure available for use.
                               11

-------
     Method 613 is for dioxin.  It's a complicated procedure, be-
cause we search for extremely low sensitivities with this method.
We have the method structured with electron capture as a screening
tool.  If there is a positive finding, or if you cannot resolve
or define the situation properly, you go directly into GC/MS
capillary column technology.  We exchange solvents after extrac-
tion from methylene chloride to hexane.  We include, along the
way, washes with sodium hydroxide, sulfuric acid and a column
chromatographic cleanup procedure.  We have a choice of three
that we are currently working with — a silica gel column, an
alumina column, and a mixture of charcoal-silica gel.
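     Since Table 1 itself is not reproduced in these proceedings,
the following Python summary is reconstructed from the prose above;
the analyte and detector descriptions are paraphrases of the talk,
and the dictionary structure is an editorial assumption:

    # Reconstructed from the talk: the 13 monitoring methods, the
    # analyte groups, and the determinative approach for each.
    METHODS = {
        "601": ("halogenated purgeables", "purge and trap; Hall (organo-halide) detector"),
        "602": ("benzene, toluene, ethylbenzene", "purge and trap; photo-ionization detector"),
        "603": ("acrolein, acrylonitrile", "high temperature purge; flame ionization detector"),
        "604": ("11 phenolics", "pH 2 extraction; FID, or EC after pentafluorobenzylbromide derivatization"),
        "605": ("benzidine, dichlorobenzidine", "ethyl acetate extraction; LC with electrochemical detection"),
        "606": ("phthalates", "GC; electron capture preferred over FID"),
        "607": ("nitrosamines", "GC; nitrogen-phosphorus detector, acetone exchange"),
        "608": ("pesticides", "GC; hexane exchange, florisil cleanup, mercury for sulfur"),
        "609": ("dinitrotoluenes, nitrobenzene, isophorone", "GC; EC for the DNTs, FID for the others"),
        "610": ("PAH's", "LC; UV and fluorescence detectors"),
        "611": ("haloethers", "GC; Hall organo-halide detector"),
        "612": ("chlorinated hydrocarbons", "GC; electron capture detector"),
        "613": ("dioxin", "EC screen, then capillary GC/MS"),
    }

    for number, (analytes, approach) in sorted(METHODS.items()):
        print(f"Method {number}: {analytes} -- {approach}")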

     As presented here, we have offered up a non-MS method for
each of the priority pollutants in this package.  There are
other approaches that we're considering, marked under "others".
For example, the pentane extraction of the volatiles is a method
that we would like to propose as an alternate.  We're looking at
phenols with a resin extraction.  We're trying to evaluate how
far direct aqueous injection can be applied in this particular
list of compounds, and we're trying to work on a protocol that I
discussed at Savannah.  In this protocol we take all of the neu-
trals, for example, and offer a combined write-up that would be
equivalent to the eight methods that I have here for neutrals.
With this protocol you extract all of the compounds with methylene
chloride, and just follow the formula to arrive at the minimum
number of determinative steps.  It's apparently easier to propose
than it is to write.  I don't know when that can be delivered.

     As far as delivery on these methods, we are now writing up
the procedures in a format that we can distribute in the public
comment period.  We have, so far, prepared write-ups for methods
606 through 612.  We're having multiple copies of them prepared
right now.  They should be ready for distribution from EMSL to
anyone here that's interested as soon as we can get multiple
copies.  We anticipate mailing them on the 16th of March.

     The rest of the procedures on this list, 601 to 613, will be
prepared in the next two weeks and copies should be available for
distribution by the 30th of the month.  At the end of this month,
then, we will have a single monitoring method for each of the
priority pollutants.  These methods are going out for evaluation
during the inter-laboratory studies.  The studies have been
delayed a bit from the original schedule because of method format.
Again, as we've discussed in Savannah, after the methods develop-
ment and we say, "These are the methods," we go immediately into
an inter-laboratory study.  The methods are "round-robinned" by
twenty laboratories.  They're going to be using three wastewaters,
a municipal effluent, and a drinking water, to do a full-blown
method evaluation study.  These will start as soon as we have
final writeups available.

     By the end of March, then, we will go with these methods
plus, perhaps, the GC mass spec procedures, and propose them to
the 304(h) Working Group of the Agency.  A couple of months after
                               12

-------
internal review, we anticipate that the methods will be proposed
under Section 304(h).

     The alternative procedures that we have here are taking a
little bit longer to finalize because they're being developed
through other studies -- and they're trailing the original pack-
age.  Where they will be exactly when we propose our procedures,
I don't know.  I would think that, at least the ones here would
have been fully evaluated and either proposed simultaneously, or
would follow very quickly the original package.  That's about it
in a capsule summary.  I would have liked to have had copies of
methods 606 to 612 ready for distribution at this meeting.  How-
ever, I did bring along some address labels.  If anybody would
like to fill out the address labels, and then on the 16th, when
we get the methods back from the printers, we can just stuff
envelopes and send them out to you immediately.

     Mr. Schaffer:  Jim, put them on the table.

     Mr. Longbottom:  Yes, we'll put them on the table.  Anybody
that's interested in the methods, the first ones will be sent out
that way, and then we'll be able to pick the addresses off of that
for later copies -- the package that we prepare on the 30th would
also be sent out.

     Mr. Dudenbostal:  Jim, I have four questions.  First of all
-- the first one isn't a question -- you mentioned when you were
talking about "perhaps" putting GC/MS in the Federal Register.
Now, I'd like to clarify that.  I can't see the word "perhaps"
there, but to get to the actual questions:  I'm interested --
suppose the laboratory had to analyze for 114 organics, which
are currently on the list.  Number one, how many work hours are
involved in doing this analysis; number two, what's the elapsed
time for the overall analysis; and number three, what would be
the equipment cost?  I think these are very important points
that ought to be addressed.

     Mr. Longbottom:  Are you proposing looking for all 114 at
once?

     Mr. Dudenbostal:  Yes, I'd like to get a feel.  That's what
we have to do.  That's why we're using GC mass spec.

     Mr. Longbottom:  Well, I'm not proposing these methods for
that application.  This is not a recommended way of looking for
114 compounds, as you know.  We crossed that bridge a long time
ago, and realized GC mass spec is the only viable approach.  There
will be GC mass spec procedures proposed for 304(h) use.  I wanted
to clarify that.  It is simply a timing sequence that caused the
"perhaps" to be inserted.
                               13

-------
     Mr. Dudenbostal:  Okay.  Now, I can see, certainly, that in
a case where you only have a few compounds that you're looking
for, that a number of these would certainly be very, very useful
and less costly; but I think somebody is going to want to know,
someone that's going to have to set up to do any analyses, what
we're talking about in the way of costs — perhaps,  a breakdown
on the costs, and work hours involved, so that they can compare
it to other approaches, such as using GC mass spec if they have
it available.

     Mr. Longbottom:  Well, that gets back to which compounds
you want to monitor, and at what levels, etc.  We recognize that
there's certainly a break-even point at which GC mass spec be-
comes available, but if you're only looking for three phthalates
that you happen to manufacture, GC mass spec is a rather expen-
sive way to go.  So, we are going to have both options.  The
"perhaps" was a matter of whether we can arrive at an acceptable
GC/MS write-up for distribution by the 30th.

     Mr. Flory:  Jim, you mentioned the Hall detector here for
halogens.  Does that mean that my coulometric detector is not
going to be acceptable?

     Mr. Longbottom:  No, I have an organo-halide detector, that's
what my little abbreviation in the table means.

     Mr. Flory:  So, that would be either one of those?

     Mr. Longbottom:  Yes.

     Mr. Stanko:  Jim, I have two suggestions and then a question.
My first suggestion is that when you evaluate the methods that you
have listed on your table, I would like to see repeatability, or
intra-laboratory precision, and I would also like to see inter-
laboratory precision.  I suggest that you send duplicate samples
and get all of this information while you're doing it — do it
right.

     The second suggestion is, in addition to doing it on pure
compounds and distilled water, I would like to see some pure com-
pounds in some real world water.  Those are my suggestions.

     My question is, on the list of methods that you have
just handed out, are these better by an order of magnitude than
the protocol GC/MS procedure that is used for the 114 compounds
now?  Can you state this, or is the protocol GC/MS procedure used
for the screening phase equivalent to, or better than, the methods
you have on your list?

     Mr. Longbottom:  Let me defer that question.  What we're say-
ing here, as we're looking at GC mass spec, is that we will con-
sider it suitable for verification and equivalent to these methods.
Now, this afternoon, the schedule calls for a discussion of the GC
                               14

-------
mass spec protocol for verification.  The question is more appro-
priate there to compare GC mass spec for screening versus GC mass
spec for verification to see whether there is an order of magnitude
difference.  Our position is that if GC mass spec is used correctly,
then the methods will be equivalent to these.

     Mr. Stanko:  Thank you.

     Mr. Longbottom:  As far as the design for the inter-laboratory
studies, they are rather elaborate designs.  There are duplicates,
with everything built into the system.  There are wastewaters (be-
ing collected with the assistance of effluent guidelines, we hope),
from relevant wastewater sources that have naturally occurring
background.  Logistically, though, generally the method studies
are set up so that the prime contractor sends out the wastewater
sample to the subcontracted analytical laboratories, rather than
each participant going out and collecting five wastewaters, or
whatever.  So, they are being conducted with wastewaters, but not
as many, perhaps, as you would like to see.

     Mr. Hall:  I'd like to make a comment concerning method 607.
Yesterday morning, at the Pittsburgh Conference in Cleveland, Ohio,
a gentleman by the name of Roland Anderson gave a paper from Tracor
describing a new mode for the detection of nitrosamines using the
Hall electro conductivity detector.  This mode used their new de-
tector and, essentially, it performed a low temperature, oxidative,
pyrolysis of the nitrosamines.  It formed something that was be-
lieved to be a nitrogen-containing acid; so, it is different from
that which was described several years ago by Rhoades and Johnson
which was not specific for nitrosamines, but would respond to
nitrosamines and other amines.  This is possibly a new technology
that could impact the detection of nitrosamines.  It was admitted
by the gentleman from Tracor that an extensive study was not done,
but the variety of nitrogen containing compounds including amines,
nitroaromatics, et cetera, did not interfere with the nitrosamines.
So, since this device would be the same device used for Method 601
and 611, it would result in a considerable savings since the list
includes five different GC detectors which are impossible to mount
on any one gas chromatograph.  It may be worth looking into in the
future.

     Mr. Longbottom:  In that regard, Southwest Research is doing
the research on nitrosamines.  Dr. Rhoades, at Southwest, has been
in regular contact with the manufacturer of the detector
throughout the development of this, and yes,
we're tracking it pretty well.

     Mr. Hall:  I'm familiar with that.  I was at Tracor when he
was in contact with us and he was not looking at this new mode at
that time.  He was primarily looking at it as a low temperature
reductive pyrolysis rather than a low temperature oxidative pyroly-
sis on a nickel surface.  So, this is a new technology that was
just released yesterday.
                               15

-------
     Ms. Olsavicky:  You mentioned the florisil cleanup proce-
dures on several of your methods.   I wondered how different
those florisil procedures were,  because if you have to do two or
three of those different classes of compounds,  does that mean
you have to do two or three entirely different procedures?

     Mr. Longbottom:  The priority pollutant methods development
project was set up with a contract structure that called for mul-
tiple awards,  which was ill-conceived from our point of view be-
cause we ended up branching out into 12 different directions for
the research.   We've tried to ride herd on that as much as pos-
sible, attempting, for example,  to standardize the florisil prep-
aration.  Now, you talk about the differences,  for the haloethers,
for example, in 611.  This approach is similar to the procedures
that are used for PCB's with petroleum ether used as the solvent,
rather than six percent ethyl ether.  As much as possible we tried
to standardize the technology, but we haven't been completely
successful with that.  Each of the 12 categories was a project in
itself -- what is the best procedure for this particular collection
of compounds,  and, secondarily,  how does this fit into the overall
analytical approach?  The write-up mentioned earlier with the com-
bined neutrals is difficult to prepare, as I said, because there
are some technique variations in the cleanup procedures.  Most are
based on the classic ether-pet ether separation, however.  The
cleanup procedures are optional.

     Ms. Olsavicky:  Depending on the sample?

     Mr. Longbottom:  Yes.

     Mr. Mosesman:  I have a question and a comment about the
method 610.  We have been doing polynuclear aromatics for quite
a while using GC, and I wondered why LC was selected here since
part of this is making things cost beneficial to the users.
Most of the methods here are using GC, where PNA's are using LC,
and if you were going to be looking at phthalates and aromatics
in your samples, you would have to buy two separate pieces of
instrumentation, versus just buying a GC with an FID on it to
look at things.  Aromatics have been somewhat common in many of
the samples, and I wanted to know why LC was chosen over GC?

     Mr. Longbottom:  The LC approach was selected for additional
research -- (we pretty well knew the state of the art of the PAH's
with the gas chromatography) -- because we wanted to fully ex-
plore liquid chromatography.  It turns out that during the per-
formance period of the contract, Perkin Elmer came out with a
column for the LC that will separate all of the PAH's on the
priority pollutant list, certainly better than you can get with
a packed column, and better, I understand, than you can get with
a capillary column.  I think we would consider the use of a GC to
be a common variation of the procedure.  How it fits into what
we're doing right now, I don't know, but I anticipate that very
few people will end up monitoring with liquid chromatography, al-
though it alone will do the complete priority pollutant separation.
                               16

-------
     Mr. Mosesman:  My question is, how rigid are these methodol-
ogies going to be?  Most of the people will not be looking for one
particular type of compound on this list.  There are going to be
multiples of compounds and people obviously would want to be given
a choice as to how much instrumentation they were going to buy.  I
don't see why LC was chosen in this particular case over GC if it
would mean certain laboratories are having to buy two pieces of
equipment rather than one -- certain industries would have to buy
two pieces of equipment, rather than one.  Also, how locked in are
these methodologies?  People do need verification.

     Mr. Longbottom:  Right.  The problem that was posed at the
beginning of the research was:  given this list of compounds,  what
is the approach that should be used to separate and quantitate them
in the most cost-effective way?  If you're looking at those PAH's,
we feel that this is the best approach that we could find, if
you're looking at all of them.  If you're looking at a couple of
them — if resolution, overlapping peaks, is not a problem -- say
you're only looking for naphthalene, a simple gas chromato-
graphic procedure would be an acceptable method.

     Mr. Flory:  I'd just like to make a comment.  I thought I
understood what you were talking about here, but some of the
comments I've heard from the audience — I'm not sure.  I'd like
to clarify.  I understood this list to have been a list where an
industry only has to monitor for a few compounds...

     Mr. Longbottom:  Yes.

     Mr. Flory:  Therefore, this procedure may be cheaper and more
cost efficient than a GC/MS, which, I have assumed, is always
going to be an approved procedure -- some type of GC/MS analysis.
So, it's up to me, then, to talk with my industrial client to say
you've got these things to monitor for.  I can show you that a
GC/MS analysis is going to be cheaper in your analytical program
than doing nine of these methods, or three, or how many ever it
is, and I don't expect EPA to make those cost estimates for me.

     Mr. Henderson:  Jim Henderson with the Carborundum Company.
I'd like to question you about a comment you made a little earlier.
That is, that the cleanup procedure was optional.  I assume that
means when it's necessary,  it's not an option.  To follow up in
terms of cost-effectiveness, it appears to me that if an industry
had to monitor compounds in, perhaps,  two categories, that many of
these combinations of even two categories might require sample
processing using two entirely different processing procedures, al-
most starting from ground zero on each one of them.  Now, I would
seriously question whether or not such a process would be cost-
effective compared to GC/MS procedures.  That is, just two cate-
gories in combination, much less three or four, I suspect would
not be cost-effective compared to GC/MS procedures.
                                17

-------
     Mr. Longbottom:  I don't know what the quality control over-
head is going to be on the GC mass spec verification procedure.
You might try to get a better feel for that this afternoon, but
to get a good GC mass spec number is going to be a little bit
more expensive, that's all I can offer now.

     Mr. Hall:  Jim, to answer some of the questions that were
brought up here in relation to the cost in the bidding for the
initial part of the inter-lab study (which is going to be redone
by our organization, again), there was quite a variety in the
price ranges, gentlemen.  It varied anywhere from $15 to $650 per sam-
ple, per category, in the two categories that we had. So, there
was quite a variety there.  I would hope to think that the aver-
age of that would be the reasonable thing.  We normally think of
processing by either the chlorinated hydrocarbons, or the phenols
category, anywhere from three to six samples a day depending
upon the degree of difficulty involved.  Again, I think we want
to emphasize that we were looking at effluent samples and not
necessarily at something that the guy is monitoring way back up
in his plant; hopefully, there is some degree of credence.

     To answer some of George's question in relation to the intra-
lab data, there is going to be some intra-lab data in the reports
on the method development that Jim will be issuing.  That's with-
in the lab where they tried the method on distilled water and
industrial waste — real world samples.  Again, it's limited, but
it does show you that there is some degree of single operator pre-
cision and that data will be available.  Then, in the inter-lab
study, there will not be duplicates run, George, as you know them,
but they will be running in pairs.  There will be three units of
the samples involved -- three industrial waste, one drinking water
and one distilled water, and one natural water source — so that
hopefully that data will help give us an idea of the range.  Again,
we're dealing with maybe three industrial waste sources out of
21,000 or something like that.  Maybe it's not a statistical repre-
sentation, but it's the best the government can afford.

     Mr. Telliard:  Thanks, Jim.  We are going to break for coffee
and after that we'll get back to data systems.

                             (BREAK)

     As the program indicates, it's now time  for data systems.
                                18

-------
                            DATA SYSTEMS
                           W. Shackelford
                                USEPA
     Mr. Shackelford:  I would like to talk directly to the con-
tractors about supplying magnetic tape containing GC/MS data, and
also about supplying the extracts of all samples, as was called
for in the protocol.  I might give just a little background on why
we are saving this data. The original consent decree defined 65
compounds and compound classes that had to be looked for in the
effluents of 21 industrial categories.  In order to make the analy-
sis problem better defined, certain assumptions were made, and
certain data was surveyed.  That is,  had the compound been found
in water; was it a member of one of the compound classes; did a
standard exist; was it manufactured in quantity in industry; so
that the list of 65 rather ambiguous classes was resolved to 129
specific compounds, 114 of these being organic.

     It was recognized that the 114 organics did not cover the
known world of organic compounds that might be found in water.  It
was decided that EPA would retain the raw GC/MS data for screening
at a later date, using whatever methods were at hand. At the pres-
ent time, we have a program underway at the Athens laboratory to
screen this GC/MS data using computer techniques. We have a couple
of chemists who are involved in confirming or denying the computer's
choices of components and matches against the spectral library.

     The extracts were to be saved in case reanalysis became neces-
sary -- such as a case where there might be a large, unidentified
peak in the chromatogram from some particular sample, which could
not be identified by spectral match techniques.  The extract would
then be reanalyzed.  That brings us to the point at which we are
now.  That is, only a few contractors have sent magnetic tape to
us, and only one contractor has sent extracts.

     The guidelines for the format on the magnetic tape are rather
liberal.  At present, we have received magnetic tapes in only two
formats and are not having significant problems with the magnetic
tape format at all.  However, we are having some problems with the
documentation form that accompanied each one of these.  The docu-
mentation merely gives us a number of handles by which we can
define each run that is on the magnetic tape.  The most important
three of these are three identification labels, one of which is
what we call the EPA sample number.  That is the number that is
written on a tag on the sample as it is taken in the field and
then delivered to your laboratory.

     The next number requested is whatever you call the sample
in your laboratory -- a lab sample number, if you will.  Finally,
we are requesting the name of that file on the magnetic tape.
In other words, when your operator ran this sample on the GC/MS,
he entered in some file name for it.  This file name is retained
throughout the processing, and also is deposited along with the
                                19

-------
file on the magnetic tape.  In this way we have three checks to
make sure we are talking about the same sample; it is important
that we have all of these.
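     A minimal sketch of the documentation record implied here, in
illustrative Python; the field names are this editor's assumptions,
but the three identifiers are the ones just listed:

    # Illustrative record for one GC/MS run on a submitted mag tape.
    from dataclasses import dataclass

    @dataclass
    class TapeRunRecord:
        epa_sample_number: str   # tag number assigned in the field
        lab_sample_number: str   # the analyzing lab's own sample number
        tape_file_name: str      # file name entered at GC/MS acquisition
        sic_code: str = ""       # EPA industrial category code
        comments: str = ""       # protocol deviations (dilutions, etc.)

        def is_matchable(self) -> bool:
            # All three identifiers must be present so the run can be
            # cross-checked against the field sample unambiguously.
            return all([self.epa_sample_number,
                        self.lab_sample_number,
                        self.tape_file_name])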

     Some of the people who are using the Finnigan Incos System
were having a problem in that the software would allow them to
change a file name during data processing, but when the data was
written onto mag tape the original file name was retained.  We
were having a considerable number of errors in our processing,
because we could not match up the EPA sample number,  the lab
identification number, and the mag tape file name.

     Mr. Colby:  Are you saying, Walt, that if I tell my INCOS
data system to rename a file, it would never rename it?

     Mr. Shackelford:  As far as the EPA mag tape program is con-
cerned, you stick with the original file name.  Yes, Dean?

     Mr. Neptune:  I think we can just copy the disc and give it
a new name.

     Mr. Flory:  Wrong.

     Mr. Shackelford:  Yes, I believe that's incorrect.

     Mr. Neptune:  No?

     Mr. Shackelford:  No.

     Mr. Colby:  Bruce Colby, Systems, Science and Software.  The
INCOS data system creates a header when you first define an acqui-
sition, and the header file name is never changed.  If you call out
the parameter program, I believe it will leave that in parentheses,
next to any other current working file name you have for it.  How-
ever, the header file name is what goes on tape, and you can change
the file name around in your data system -- let's call that a work-
ing file name -- any number of times you want, but you cannot get
at the header unless you go into IDOS and use the editor and fiddle
with it.

     Mr. Fahrenthold:  It doesn't say how to change it, then,
prior to results?

     Mr. Colby:  No.  It was fascinating to learn that.

     Mr. Flory:  That must only be true when you make EPA format
tapes, then?

     Mr. Colby:  I would not doubt that that is true, because,
with every other thing that I've ever looked at, the file has
been renamed, but I don't print out many EPA format tapes.
                                20

-------
     Mr. Shackelford:  I'm glad we had something that everyone
could learn from.  Finnigan became aware of this problem and sent
me a little write-up explaining how to check exactly what file
name is on your mag tape.  This is being mailed to all the con-
tractors.

     Mr. Neptune:  In fact, I have a letter typed right now, Walt,
and it's just waiting for signature to go out.

     Mr. Shackelford:  Very good.  The next thing that is on that
sheet that we're having some confusion with is the industrial
category, or SIC code.  We're asking that the EPA list of indus-
trial code numbers be added there.  The people I have checked with
received this list some time ago.  It also had a list of sampling
contractors.  These are to be used in the column for sampler.

     On the back there is a space for comments.  The comments sec-
tion is meant for use when you have made some deviation from the
analysis protocol.  This would mean, for instance, if you diluted
your VOA sample before you ran it, a comment should be made.  Or if
you used a concentration factor that was far removed from the 2,000
which is called for in the protocol, a comment should be made.  Those
of you who have already sent me data, I think I have checked with
you about this already.

     Finally, the GC columns that were listed in the protocol were
Carbowax 1500 on Carbopak C, a one percent SP-2250 and Tenax GC.
If you're using different columns, we have to know about it.  The
comments column is the perfect place to document this.  You can
also give inclusive dates for a particular column's use, if you
want to. If you're sending data for the first time -- in other
words if we have not looked at any of your tapes, or any of the
documentation forms that go with them — please send only one tape
and form. That way I don't have to send a large number back to be
redone.

     One of the problems that we are having in cataloging the tape
-- that is before we even look at the data, just getting it logged
into our system -- is typographical errors on the sheets.  One
hundred percent is the minimum confidence level we should have in this
material.  We have a number of checks built into our  logging
system to make sure that what is on the tape is the same thing
that you have described on the sheet.  It would speed up things
quite a bit if you did some quality control on your typographical
errors on the sheets that you are sending.

     The next topic is mag tape use errors.  Apparently some of
the documentation on using the software to write the  data onto
mag tape is not clear.  The symptom is that end-of-file marks
get thrown into the middle of the data, which tells us, when our
program is  reading the data, that there is no more data following.
We have to  send the tape back and report that your tape and docu-
mentation do not agree.  Unfortunately, a computer is not a mind
                                 21

-------
reader.  When it gets to the end of the data, or it gets to a
mark that says there's no more data,  it quits.  The result is
that we have to send the tape back and get you to remake it.
Do it again.  As I understand it, most of these problems are due
to the way you are writing the data onto the tape, and to misunder-
standing of the vendor's documentation.
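
     The failure being described is a consistency failure:  the
documentation sheet promises some number of runs, but a stray
end-of-file mark makes the reading program stop early.  Here is a
hedged sketch of that check, where read_next_file is a hypothetical
tape reader, not an actual program from this effort:

    # Sketch of the tape-versus-sheet consistency check; the
    # reader function is hypothetical and returns None at an
    # end-of-file mark.
    def count_runs(read_next_file):
        runs = 0
        while read_next_file() is not None:
            runs += 1
        return runs

    def tape_matches_sheet(read_next_file, runs_promised):
        runs_found = count_runs(read_next_file)
        # A premature end-of-file mark shows up as fewer runs than
        # the documentation sheet promised; the tape goes back to
        # the contractor to be remade.
        return runs_found == runs_promised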

     The next thing -- and I don't think we've had this problem
but once --- is that you should definitely have a regular program
of preventive maintenance on your mag tape unit, including clean-
ing the Read/Write heads; because dirty heads are going to make
parity errors.  Whereas your machine might not catch it at the
time of writing, we might.  I think we've had one case of this
and the problem is hard to define.  The question becomes, "Is the
error something that happened in transit, or was it caused by
improper writing heads at the contractor, or did we damage the
tape in some way at Athens" -- it's hard to find out.  If you
will insure that you have a regular maintenance program of
cleaning your tape heads, I think we can eliminate those kinds
of problems.

     One item for contractors to remember is that people have
to read these sheets, and they have to transcribe them via key
punching.  Please make them legible.  Illegibility stops us
completely, and we have to send everything back to you and get
it redone.  Remember, "zero" and "oh" are two different characters
and the computer will never match "zero" and "oh" as being the
same.  Thus, always differentiate between the number "zero" and the
letter "oh."

     The next subject concerns extracts.  Each extract, of course,
will have to be matched with its corresponding GC/MS run. As you
might guess, there are no extracts for the VOA analysis, and we
have a list of required documentation for each extract. The first
two items are the EPA sample number and your local laboratory
sample number so we can match it with the run.  The fraction
type is extremely important, because given any sample number,
one more than likely will have three to five fractions that will
fall under the same sample number.  Of course, we want the in-
dustrial type — which is the two digit EPA code, and a treatment
stage.  Again, this is redundant with the data that you're put-
ting on your mag tape submission forms.  Now, your date sampled
and date sealed, and your laboratory name, will take care of  the
documentation on the extract.  When you mail these extracts, take
into consideration the people receiving it.  First, they're going
to need a packing list and documentation of what's in the box.
That sounds like a minor point, but sometimes it's apparently
not understood.

     The second point concerns sending large numbers of small
pieces. One needs to have them arranged in some kind of orderly
fashion within the box, and not just jumbled up or wrapped up in
paper and thrown in.  Put each ampule in some sort of a rack,
or a holder, so that the person who opens the box can have some
idea of what he's digging out.  To give you  some more hints on
                                22

-------
the extracts, be sure you mark the vial so we know whether or
not there has been any loss in solvent.  As far as labelling the
contents of your shipment, I have seen extracts shipped as
ORM-A, since they're essentially solvent.  Dean, do you have
any ideas on that?

     Mr. Neptune:  The ones that are in methylene chloride solvent
will cause no problems from a hazard standpoint as far as DOT is
concerned.  If it's not a hazardous sample, then they do not con-
trol it, nor do they want any indication that the sample is not
hazardous or anything else; it should flow as normal traffic.  The
ones that are in hexane -- for instance, the pesticide fractions --
will have to be shipped and labeled as containing hexane, with
the amount, the proper placards, and things like that.

     Mr. Moberg:  I received communication that no vials were to
be sent until the procedure for sealing the vials was established,
and that the hold was put on by, I think, Dean's office.  What
you're saying now is you're supposed to have been receiving vials,
and we're saying we've got samples and we're not sending them out
because the boss man says don't send them.

     Mr. Telliard:  Right.

     Mr. Moberg:  What's the score?

     Mr. Telliard:  When they're shipped, if you're told to ship
them, Bud, take note of Walt's concern here.

     Mr. Neptune:  I've got one more thing.  Occasionally, we have
called for a number of people, different IFB labs primarily, to
send sample extracts to other laboratories -- primarily the pesti-
cides monitoring laboratory -- so that we can have them take inde-
pendent looks at some of the sample fractions.  In those particu-
lar cases, they should be shipped like Walter is saying when we
call for them.

     There is still an outstanding question and the question has
not been adequately answered to date, at least as far as we're
concerned.  John Winter is working on helping us to get an answer
to that, and that is:  what degradation, if any, either the sample
or the solvent undergoes during the sealing process.  That's what
we're trying to get resolved prior
to asking everybody to send the things  to Athens.  So, that is why
you've been asked to hold the extracts to date except for those
that we called for.

     Mr. Flory:  Do you have forms for the tape log, I guess
you'd call it?

     Mr. Shackelford:  Yes.
                                23

-------
     Mr. Flory:  Are you going to have one for the extracts also?

     Mr. Shackelford:  Yes, we'll do that.

     Mr. Flory:  Okay, we'd like to get those.

     Mr. Shackelford:  I'll make sure you get one, too.  Did you
not get a form for your tape?

     Mr. Moberg:  No forms.

     Mr. Flory:  No, we haven't gotten one yet.

     Mr. Mosesman:  Now, I just want to clarify a point -- do you
want the EPA number, the lab number, the type, and the industrial
category on the vial that contains the sample?  I mean, that's an
awful lot of information to put on a one dram vial.

     Mr. Shackelford:  Well, we will need to have clearly indicated
what the contents of that vial are, so that there's no chance of
tags being lost and that sort of thing.  I don't know of any other
way to do it than to attach it to the vial itself.

     Mr. Mosesman:  I know what you require.  I mean, couldn't we
limit it to one or two of these numbers, rather than trying to put
on 12 or 15 numbers?

     Mr. Shackelford:  Unfortunately, our experience has been that
the rate of mistakes in writing and transcribing this material is
such that if we don't have more than one number to compare we
run into difficulties.  I know that most people are going to do
a perfect job, but so far that hasn't been true.  We have to have
more than one identification to match things up; the computer can
do that very quickly.

     Mr. Northington:  We had some sample labels made up that fit
on a one dram vial, and they have all the information you need.
There's no problem writing on them.

     Mr. Shackelford:  Did you send those out?

     Mr. Northington:  No.

     Mr. Flory:  One more thing.  I thought the sample number that
we now have on our vials, since we had sample tracking sheets,
is unique and identifies the sample.  Those are the only numbers
that are on our vials now.

     Mr. Shackelford:  Before the advent of the present sample
tracking system, sample numbers were assigned using more than one
system.

     Mr. Flory:  You're talking about the earlier ones.
                                24

-------
     Mr. Shackelford:  Literally, a couple of thousand are out
there that do not necessarily have unique sample numbers.  The
earlier samples could very well duplicate every one of the unique
numbers on the sample tracking sheet.

     Mr. Telliard:  The next session that we'll start now deals
with verification.  Verification will vary in form depending
on the particular program, branch, and industry category you're
in.  Mr. Lichtenberg will talk about a GC/MS methodology that
effluent guidelines, and Cincinnati, and S&A people have been
working on over the last number of months. Some day we will get
it to you.  Dr. Fahrenthold is going to talk about what the
organic chemicals branch is doing on verification, at least in
the initial stages.
                                25

-------
                     VERIFICATION-ORGANICS
                         P. Fahrenthold
                              USEPA
     Mr. Fahrenthold:  I have some material to show you later on
something that is very complicated to try to describe without a
picture and, so, I think at that time I'll digress and go over
there and write it down for you on the viewgraph.  I think it
would be a little bit easier to understand what I'm talking about
if you see a picture.  It's kind of hard to understand what's
happening in sequential days of sampling, where the samples go,
etc.  It's easier if you can see a picture of it rather than me
just telling you about it.

     The organic chemicals branch is composed of several industry
categories.  When I speak of the methods that we are using, both
engineering and analytical to do verification, I'm referring to
the way we have proceeded in all of these industry sub-categories.
They are:  the synthetic rubber industry, which is pretty much
done.  Engineering and analytical work is finished on that one,
for the most part anyhow.  The pesticide industry is in progress,
the pharmaceuticals industry is nearly complete, the plastic and
synthetic materials category and the organic chemicals category
are ongoing at the present time.  The methodology that I will go
through with you is in a state of constant evolution in contrast
to a lot of other things that you have heard already today and
probably will hear later on.  We look at the methodology to be
used as a continuum rather than as fixed -- a "protocol," to use
the common parlance, so to speak.  Our world is so diverse that the
concept of a protocol gives us a lot more trouble than the real
world does, so, we have tried to minimize our problems and say,
"Well, we'll do it whatever way we think," and that's the protocol
for today.  We're very flexible, unfortunately, much to people's
dismay in our particular concept of what the world looks like.
Yes, we're a pushover, as Bill says; or anyhow that's probably
true.

     I started out about this time last year after screening a very
large number of chemical plants and plastics plants with a need to
quantitate the priority pollutants that were found on screening.
We analyzed this problem at length -- over several months  -- and we
looked at the various options that were available, GC, mass spec,
and various other types of methods that can be used for specific
pollutants.  We made some decisions.  "We" is Lamar and I  and  vari-
ous other people that were associated with the program.  These
decisions were more or less revolutionary at the time.  They're
still a little bit revolutionary, but they're not quite as far
out as they once were.  At any rate, what we decided to do was
to go with gas chromatography procedures, almost exclusively.
Now, I'm going to tell you what I mean by almost exclusively in
just a minute, but the program concept is that we would use GC -
period.  No questions asked, that's it!  We would not use the
screening protocol, or any modification of the screening protocol
to gather quantitative data.  What this does  is  substitute the
                                26

-------
analyst's expertise in his sample processing procedures, his column
selection for chromatography, and his detector selection, based on
what he actually needs to quantitate in the sample he has in front
of him.  It's a highly specific technique.  It may or may not have
wide applicability; that remains to be seen.  We don't care about
that.  I'm going to tell you in a minute why, but the objective
is to get something that is very specific.  We thought the best
way to do this is with GC procedures.

     The objective of our program was to gather data for the analy-
sis to serve as a basis for effluent discharge limitations.  In
other words, the program objective is to gather data to use to
establish effluent guidelines.  Now, if we work back from that, we
have to say, "Well, what kind of data do we need to do that?" and
"What is a reasonable amount of data that we can get for the amount
of money we have, the amount of time we have, and the amount of
people resources we have available?"  So, we looked at that and we
said, "Well, the standard verification concept calls for three days
worth of sampling and we'll stay with that.  We'll take three days
worth of sampling.  We won't accept any less and we won't take any
more, unless there are extenuating circumstances."  So, the program
evolved around three days worth of sampling.

     Now, the kind of methodology that we would use was already
determined to be gas chromatography.  There were a number of factors that
went into that particular decision.  There are two major ones.
One of them is that we are looking for what we think is the long
term, long range, most cost-effective way to monitor for specific
pollutants.  It is probably GC.  The other one is that we are look-
ing for a method that is also the most widely available, in terms
of hardware and technology - available, not necessarily to the big
12 that manufacture organic chemicals in the world, but to the
other 300 out there who don't have the staff and the resources in
terms of research labs to carry out a very extensive monitoring
program.  It was our thought that it is not necessarily fair for
the government to lay out a program that only those that were the
first ten in the industry could afford.  That tends to put compe-
tition in a place where it's not really appropriate in terms of
potential enforcement actions down the road for lack of data, or
some other problems that creep in to the scenario.  We had a third
objective.  Having three days worth of data is something of a
limitation from an engineering standpoint.  A lot of the comments
I make today are supported by engineering concepts and the need
for reliable information to feed into engineering cost design,
process design, and unit operations design.  Three data points, it
turns out, is a somewhat marginal amount of data in some cases --
not in all cases — but in some cases.  Now, the discharge concen-
tration of priority pollutants in various industries can fluctuate
by as much as two orders of magnitude from any single process
line.  When you get involved in a situation like that, three data
points can be different by a factor of 100.  It's very difficult
for a design engineer to cost out a reasonable waste treatment
option on the basis of a feed rate that varies by a factor of 100.
That's a tough design problem.  We have been bitten by that problem
                                27

-------
on numerous occasions.  We had in mind a concept that requires
long term data, 90 days worth of data, in some cases, maybe even
more than 90 days worth of data, depending upon the specific
treatment system that is applicable to that priority pollutant,
or that conventional pollutant, as the case may be.  We backed up
from that train of thought and said it is not reasonable in our
view to require somebody to monitor every day for 120 days using a
GC/MS instrument; that is, in itself, a substantial cost burden,
since the cost of monitoring with that particular instrument does
not decrease as rapidly over a period of time as a comparable GC
procedure would as you repeated the procedure many, many days on
the same wastewater matrix.

     It's important that we go through the concepts that we have.
Some of them have turned out to be, I wouldn't say wrong, but they
have been too narrow in our view of what the world really was.  We
have changed a lot of them from what they originally were.  They
have evolved, like most things, over a period of time.  At any
rate, the essence of our concept is still basically true, today.

     Now, I'm going to talk a little bit about verification.  Be-
fore we get into the analytical methodology part of it, which is
the part that most people are interested in, I have to tell you
some of the other facets of the program as well, because it turns
out that from an engineering standpoint they're very important.
We would be slighting the entire concept if we didn't mention
that.  When we talk about verification, we talk about quantita-
tion of priority pollutants, be they in a raw waste load discharge,
whether they are across some treatment system -- biological treat-
ment, for example — or inlet and outlet to the treatment system.
We are interested in quantitation of that priority pollutant, in
that particular wastewater matrix.  We are also interested in de-
veloping a procedure, a GC procedure, that works for that priority
pollutant in that particular sampling matrix.  That is what we do;
the program is based upon those two activities.

     The way we do this requires three activities.  It requires
sampling, it requires flow measurement, and it requires analysis
of the samples taken during the site visit.  I want to talk to you
just very briefly about sampling and flow measurement.  They are
important, but this is not the proper forum for extensive discus-
sions.  Sampling, however, in this regard is much different than
it is in screening.  The sampling here is predominantly manual
sampling.  There is very little sampling that is done by composi-
ting devices, and my guess is — I have not checked recently —
that only 25 percent maybe, in that range, of the samples, are
taken by compositing devices.  The other 75 percent are taken
manually.  The reason why they're taken manually is because the
Isco sampler is not rated for Class 1, Group D installations, and
in the plants where we are sampling they require the explosion-
proof enclosures for the Isco Samplers, and there is no such thing.
Therefore, we are obligated to use manual sampling.  Because we do
manual sampling, we normally do not sample a 24 hour day.  We would
                                28

-------
normally sample something in the range of an 8 to 12-hour day with
aliquots, or grabs taken every 2 to 3 hours during the 8 to 12-hour
period.  There's a big provision in that, and that is you cannot
do that except on processes which operate continuously.  You cannot
do that on batch, or campaign operations.  If the plant that you're
sampling produces a material in which you have an interest in the
discharge, or production of priority pollutants, you have to ar-
range a special sampling procedure for batch processes.  You either
have to sample through 24 hours, or you have to flow composite grab
samples every few hours, or you have to make some other special ar-
rangement to sample that process.

     Let's talk a little bit about flow measurement.  There are
several basic ways of making flow measurements.  The ideal way is
to go out and read the guy's meter and hope it's right, usually at
the discharge of his treatment plant.  When you leave the treatment
plant you are faced with decreasingly attractive alternatives to
measure flow.  You come up with things like mass balances on water;
which is pounds of water into the process -- and there's no Max-
well's demon that eats any -- so that's what comes out.  Or, you in-
stall a V-notch weir where you can grab a sample, read the weir,
write down the reading, and what you have is an average of instan-
taneous flow readings over a 12-hour period.  This can be assumed
to be an average flow rate.  If you do this for three days you have,
then, an average of three days averaged over one day, if you will.
It's a very inexact portion of the sampling.  It's very difficult
to get highly accurate flow measurements.  There is an art to that;
there are a number of people who are much more skilled at it than I
am, and I think probably we should leave the sampling and flow mea-
surement where it is.  If there are specific questions, later, I'll
do the best I can to answer them.  Because of the time frame, we
best go on to the analytical methodology.
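
     For what it is worth, the weir arithmetic just described is
only an average of instantaneous readings.  The sketch below uses
the textbook rating for a 90-degree V-notch weir, Q = 1.38 h**2.5
(Q in cubic metres per second, head h in metres), purely as an
example rating curve; it is not a rating taken from this program.

    # Average-of-instantaneous-flows arithmetic described above.
    def weir_flow(head_m):
        # Textbook 90-degree V-notch approximation, illustrative only.
        return 1.38 * head_m ** 2.5

    def daily_average(heads_m):
        """Average of instantaneous flows over one 8- to 12-hour day."""
        flows = [weir_flow(h) for h in heads_m]
        return sum(flows) / len(flows)

    # Three days of readings give three daily averages, which are
    # averaged again for the sampling visit.
    days = [[0.10, 0.12, 0.11], [0.09, 0.10, 0.10], [0.13, 0.12, 0.14]]
    visit_average = sum(daily_average(d) for d in days) / len(days)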

     I have organized this in chronological order.  I'm not sure
that that's necessarily the best way to do it, but in terms of
where we are today I think you'll get a better appreciation of our
status if we look backwards for awhile and see where we came from.
Briefly, I'll go back, and then we'll spend considerably more time
speaking about the way it's done today.

     In the summer and spring of 1977, there was prepared a collec-
tion of analytical methods -- predominantly GC techniques.  I have
called it a collection just for my own purposes.  If you have your
own private term for that particular group of methods, you're wel-
come to it.  We won't dispute that.  Yes, a collection of a group
of methods.  In November of 1977, the branch contractor, Catalytic
in Philadelphia, sat down, and we looked at these methods in light
of what our program objectives were.  We said, "Okay, how are we
going to solve this problem?  We have to get from here to there
and we have this little green book, and how do we do it?"  First
of all we had to understand what our problem was.  Jim Longbottom
has given us a little bit of a handle on this.  If we use GC/MS as
a baseline in the four basic fractions that are analyzed in GC/MS,
                                29

-------
we have to take those four fractions and expand them using the GC
techniques.  If we do that, I get 12 methods,  Jim Longbottom gets
13.  There's evidently something I'm missing,  but anyhow, I get
12.  The 29 volatiles that are analyzed by purge and trap slid
into two methods for us.  The acid fraction goes right on through
just as the acid fraction; it's the base neutral fraction where
most of our headaches are.  The single base neutral extract ana-
lyzed by GC/MS turns out to require 8 methods  by GC techniques in
order to cover the base neutral fraction.  The pesticides remain
as pesticides.  It turns out that the base neutral area is where
most of our problems are, and, unfortunately,  it turns out that
that's where a very goodly number of the priority pollutants that
we found in the industries we're concerned about, normally turn
up.  So, we looked at this and said, "Well, we're going to play
with what we got.  We've got a collection of methods.  We don't
have any other brilliant alternatives.  We will do whatever we
can."  We laid out a program for our contractor.  First, I'll
describe the constraints placed upon him during the development
of this program, and then I'll tell you what his reaction was to
these constraints.

     We said to the contractor "If you go to a plant to do veri-
fication sampling, you must analyze that sample and give me a num-
ber.  You absolutely must.  You have no option, whatsoever.  You
cannot come back and tell me that you cannot analyze the sample;
that is unallowed.  That is not allowed under any circumstances.
There are three others:  you may not use purge and trap.  You may
not use liquid chromatography in any fashion,  and there is a strong
bias against any type of capillary column in your GC."

     Now, the reaction this generated on the part of the contractor
is probably described by a number of adjectives like fright, trepi-
dation, fear, apprehension, what am I going to do, I don't have any
tools any more to do this horrendous job.  There's more, it gets
worse.  You are to operate with these constraints in this scenario.
You are to go to the plant and take a sample at every location
which we specify to be sampled.  You have two weeks to take all
of those grab samples and develop a GC method to be used for three
days worth of sampling.  At the end of the two week period you go
back to sample for three days.  The worst is yet to come.

     You will sample like this.  You will grab a sample at every
sample location on the first day of sampling.   Immediately when you
grab a sample you will start compositing on the first day, making
the first day's composite sample.  You will send the grab samples
to the laboratory and immediately analyze them, no matter when
they get there, no matter when they arrive.  You will start —
whenever the samples get there -- analyzing.  On the second day
of sampling you will take off the first day's composites.  You
will start the second day's composites; mail the first day's com-
posites back to the laboratory.  On the third day of sampling you
will take off the second day's composites, and start the third
day's composites in the field.  The laboratory will report during
                                30

-------
the morning of the third day the results of the grab samples taken
on the first day.  In the evening of the third day it will report
the results of the composite samples that were taken off the day
before.  On the fourth day of sampling, it will continue taking
off the composites, mailing them to the laboratory, and the labo-
ratory will report out the results of the second day's composites.
The same scenario goes on until you have sampled for three days.
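
     Restated day by day, the field and laboratory events in that
scenario line up as follows.  This is only a compact restatement
of the text above, not a published schedule:

    # Day-by-day restatement of the sampling scenario described above.
    schedule = {
        1: ["field: grab at every location; start day-1 composites",
            "lab: begin analyzing grabs whenever they arrive"],
        2: ["field: take off day-1 composites; start day-2 composites;"
            " mail day-1 composites to the lab"],
        3: ["field: take off day-2 composites; start day-3 composites",
            "lab: report day-1 grabs (morning) and day-1 composites"
            " (evening)"],
        4: ["field: take off and mail day-3 composites",
            "lab: report day-2 composites"],
    }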

     The procedures that were used in this scenario were to be
those that were collected during the time frame -- early  '77 to
January, February of  '78.  We put together this scenario and said,
"We will pick six plants.  We will try this scenario and see what
happens."  Everybody agrees we're going to do this.  We picked six
plants in February, March, and April of 1978.  Actually it went
slightly longer than that, but most of the basic work was done dur-
ing February, March, and April of 1978, at six plants.  There were
five relatively small plants and one very, very large plant.  Now,
when I say a small plant, I mean a plant where, in three days, you
would take in the range of 50 to 60 samples; a large plant would
have, perhaps, twice that many during the same time frame.  We went
through this scenario with a great deal of difficulty.  We learned
a lot of interesting things.  The first thing we learned is that we
should count on using GC/MS, which we had anticipated prior to that,
as a backup confirming technique for all of the GC techniques.  It
was predominantly used during the methods development phase, but
only about ten percent of the number of samples would be run by
GC/MS in order to make certain that the identifications made by GC
were correct.

     The samples that were run were not highly quality controlled.
The spiking that was done for recovery data was done by blind
spiking.  We ran into problems with blind spiking, not because of
any procedural problems, but because of the concentration of the
priority pollutants in the effluents where we sampled.  And,
-------
from our standpoint, we remember the constraint -- the guy has to
produce a number.  We learned that the turnaround time of complet-
ing 50 samples in a week is also unrealistic; that it imposes a
tremendous burden on the analytical laboratory,  particularly in
light of the fact that it was forced to make a commitment to a
24-hour operation at rather infrequent intervals.  The "infre-
quent intervals" part comes from the fact that airlines are highly
unreliable when they deliver samples.  The last thing we learned
that's of note is that the first day's grab samples produced very
little for us in terms of information.  They were taken for the
purpose of determining whether the information we had on screening
was really reliable information, and whether or not we needed to
go back and essentially rescreen that plant with GC procedures
prior to getting three days worth of data.  We determined that
those grab samples have little merit if you do some other things.
The other things that you must do are:  you must analyze the proc-
ess that you are sampling very carefully, and you must analyze
the screening data very carefully to determine if you are going
to miss something by eliminating the use of an analytical method
on any particular sampling.

     Now, there are some things I have forgotten to tell you and
I want to go back for one minute.  When we went to a plant, we
were operating under the premise that every compound found in
screening, in concentration greater than ten parts per billion for
organics, would be analyzed in the samples we had by an appropriate
method.  Now, because of the structure of the chemicals industry
and the fact that the screening samples were taken at combined
influent to treatment, we never knew where the priority pollutants
were coming from.  We would go into the plant to try to sample a
process line, and we would be obligated to sample every process
line by every method even though, at times, there were no priority
pollutants detected by a certain method.  It is highly wasteful
of people's time to continue generating data that shows zero,
essentially, or less than ten parts per billion, or less than
two, or whatever the number happens to be.  However, it allows us
to make a refinement in what the original procedure was.  So,
during September, October, November, December, and up until to-
day, we have a different plan.  It's much more reasonable and
more scientific; we have stopped our approach of brute force and
awkwardness, I hope, for the most part anyhow.  We work a little
bit smarter, and not quite so hard.  I think we are getting some-
thing which produces much better data and which, from an engineer-
ing standpoint, is far superior to what we had in the original
six plants.

     We have made some significant changes.  We have extended the
time for methods development to approximately 30 days.  There are
obvious problems in scheduling plant sampling, and the time of
methods development is variable.  It depends upon whether it's
snowing in Chicago, or whether the plant's process is shut down,
or whatever, but it rarely gets shorter than three weeks.  It fre-
quently gets longer than four weeks because of turnaround time,
and the inconvenience of sampling and lots of other things.
                                32

-------
      The  time  for  the  completion  of  the  sample  analysis has been
 extended.   It  is no  longer  essential that we complete the analysis
 of  every  sample during the  week the  sampling takes place.  It turns
 out that  that's an extremely burdensome  procedure.   It calls for
 multiple  shift operation  in a  small  laboratory.  All of these
 things  create  havoc  with  an analytical laboratory that is trying
 to  cope with massive numbers of samples.  We have done one addi-
 tional  thing.  We  have developed, with the help of our in-house
 analytical  staff,  a  very, very good  QA/QC procedure.  I was not a
 part  of this.  I'm reporting on what they have done, which I think
 is  really good.

      I  want to put this on  the viewgraph, because there's just no
 way in  the  world I can ever explain  this to you without some help.
 Priscilla Holtzclaw, Bill Cowen and  Dean Neptune are the contribu-
 tors  to this particular procedure.  We have not changed the three
 days  worth  of  sampling, but we have  changed the way  the samples
 are processed  in the laboratory significantly.  This is what hap-
 pens  in the laboratory -- the  field  work is not much different
 than  it originally was — on day  one in  the laboratory, the samples
 are brought in, extracted,  and analyzed  in duplicate.

      On the second day, when the  second  day samples  arrive in the
 laboratory, one aliquot each of those samples is extracted and
 analyzed; one  aliquot  is held  pending the determination from the
 first analysis.  It's  spiked at an appropriate  level, then ex-
 tracted and analyzed.

      On the third  day,  three samples are taken, one  of which is
 extracted with duplicate analysis, one of which is held 24 hours,
 spiked, extracted, and analyzed,  and one of which is held 96 hours,
 then  extracted and analyzed.

      There  is  a terribly involved rationale for all  of this that I
 confess I do not completely understand.  I do not understand it
 well  enough to repeat  it verbatim.   From the rationale, it is all
 very logical.  It makes eminently good sense to do it this way,
 and,  with the people we have talked to,  it appears to be a rea-
 sonably cost-effective way  of  getting QA/QC into our three days of
 sampling.   I think that's all  on  that one.
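
     Summarized, the laboratory pattern per sample location is:
day one, duplicate extraction and analysis; day two, one aliquot
analyzed on arrival and one held, spiked at a level set by that
first result, then extracted and analyzed; day three, one sample
with duplicate analysis, one held about 24 hours and spiked, and
one held about 96 hours as an unspiked control.  As a bookkeeping
sketch only (this restates the text; it is not an official
protocol table):

    # Bookkeeping restatement of the day 1/2/3 QA/QC scheme above.
    qa_plan = {
        "day 1": ["extract and analyze in duplicate"],
        "day 2": ["extract and analyze one aliquot on arrival",
                  "hold one aliquot; spike at a level set by the"
                  " first result; extract and analyze"],
        "day 3": ["extract one sample; duplicate analysis",
                  "hold one sample ~24 h; spike; extract and analyze",
                  "hold one sample ~96 h; extract and analyze"
                  " (unspiked degradation control)"],
    }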

     Mr. Rushneck:  You said on day two where the spike comes from.
Where does this spike come from on day three?  The amount that you
spiked, what's that based on?

     Mr. Fahrenthold:  It's based on the analysis of that day's sample.

     Mr. Telliard:  The top line.

     Mr. Fahrenthold:  Top line.  The first sample on day three is
just analyzed straight and not spiked.  We are only spiking on the
basis of what a true value is.  Priscilla Holtzclaw has more infor-
mation on it; I'll defer to her.
                                33

-------
     Ms. Holtzclaw:  One thing that may make it clearer is an
explanation of the three different analyses on day three.   One of
the questions that arose when we decided to hold the sample before
we spiked it was, "what about the degradation of the sample itself
during the holding time?"  The purpose of the third analysis on
day three is to show that no significant changes have occurred in
the priority pollutants during that holding time.  It will be
extracted and analyzed at the same time that the spiked sample is
extracted and analyzed on day three; that spike will be determined
by the original extraction from day three.

     Mr. Stanko:  I'd like to ask one question.  On your day three
sample -- it also applies to day one and two -- is all of your
sample collected in one container?  Then, do you try to sample
your sample to get your first day's extract, your 24-hour hold,
and then your 96?  Does it come out of that one container, or are
you talking about four, five, six, or whatever number of containers
that's required?  At what point do you really sample the sample?

     Mr. Fahrenthold:  The general tendency is that you do not ever
split a sample; it is a sample to be analyzed and it is analyzed.
If you don't need the rest of it, I assume it would be thrown away.
We will not, for example, go to the field and divide our sample and
give part of it to anybody else.  If you want a sample with us, you
take your own sample.  I know that's not what you asked me, okay...

     Mr. Stanko:  No, that's not the question.

     Mr. Fahrenthold:  ...All right, but the same principle applies.
We would not refuse to do something in the field and then, I
don't think, do it in the lab.  Bill probably knows the specific
answer.  We maybe have not done that a time or two.

     Mr. Stanko:  The question is, is the day three sample a collec-
tion in a three gallon bottle?

     Mr. Fahrenthold:  The answer to that is no.  The answer is, if
we are going to analyze three samples on day three, the field man
will composite three samples to be analyzed in that particular
fashion, simultaneously in the field.

     Mr. Stanko:  Thank you.

     Ms. Olsavicky:  When you're saying "hold," are you implying
that it's under cold storage?

     Mr. Fahrenthold:  Right.  They don't put those on the shelf.
You cannot preserve the sample, so you refrigerate them and hold
them the minimum length of time that you can.  That's the reason
for the 96-hour sample.

     Ms. Olsavicky:  Would there be any reason why you would take
a sample each of those three days and, then, in addition compile a
three day composite?
                                34

-------
     Mr. Fahrenthold:  No.

     Ms. Olsavicky:  To your knowledge, have any of your contrac-
tors done that in their verification phase sampling since Septem-
ber?

     Mr. Fahrenthold:  I don't know of any that have done that for
me.

     Ms. Olsavicky:  Thank you.

     Mr. Fahrenthold:  I am interested in three data points.  If
somebody wants to composite it, I only get one data point; that's
not the objective of the exercise.

     Mr. Telliard:  However, they might have done it for another
branch, or another industrial study.

     Ms. Olsavicky:  Of EPA?

     Mr. Telliard:  Yes.  In other words, this is the organic chem-
icals branch.  We have one in food, and energy and mining, and, for
a study on textiles or something, they may have composited three
days.  What you will find here is that not all branches look the
same.

     Ms. Olsavicky:  This is not including all organic industries.

     Mr. Telliard:  That's why Paul had mentioned early on the list
of industries that he is addressing; this study applies only to the
Organic Chemicals Branch industries.  Textiles does not fall under
his domain, for example, nor does petroleum refining.  So, in another
branch they may have a different methodology.  You'll hear pulp
and paper when he's done; how they're handling their stuff will be
different from what organic chemicals is doing.

     Ms. Olsavicky:  Who is in charge of overall EPA samplings,
when the EPA would come in and sample?

     Mr. Telliard:  It would depend on the industrial category
and the study being carried out.  It comes to the branch people.
If whatever they need for engineering data can be gotten by a
three day, 72-hour composite, that's what they'll ask for.  In
this particular case, Paul has asked for a specific set of steps
to be taken to meet his needs to develop engineering.  I may just
want a grab sample for my particular industrial category.

     Mr. Fisher:  This means one sample here split in half, or
whatever, but that's not what you really intend.  You intend that
these are two samples, so there should be two blocks here?

     Mr. Fahrenthold:  I'm not going to answer that, because I
don't know the specific answer, but I'm going to ask Frank Hammer,
or Clarence Haile, or Bill Cowen, to answer it.  Let me tell you
why, and that is, I cannot guarantee you that at some time


                                 35

-------
or another in that plant our sampler didn't drop all of the jars
that he had to take those three samples in, and that at one time
he did do exactly what you're saying.  I cannot guarantee you
that every sample was taken specifically that way.  We do not like
to do that.  We would turn over heaven and earth to keep from it,
but it probably has happened at some time or another.

     Mr. Fisher:  I was wondering what the intent is here.

     Mr. Fahrenthold:  The intent is that that's the easy way to
make the drawing.  Rather than to draw one, two, three, four, five,
six, or seven boxes, I only drew three.

     Mr. Fisher:  So, this is just one bottle there?

     Mr. Fahrenthold:  Wrong, that is not just one bottle.  That is
a sample for that purpose.

     Mr. Hammer:  In the interest of making sure that we had a
single composite as we came out of the field, we had one bottle.
Now, with respect to day one, part of the reason for the double
extract is to insure that we could, in fact, violently shake the
bottle to insure the two extracts were the same.  On day two --
incidentally, we did turn this around in less than 24 hours on day
two so we didn't have a holding problem -- we took one bottle,
extracted it, analyzed it, and got the results.  Then we took a
sub-sample from that and spiked it.  So, it was one bottle.

     Mr. Fisher:  In the field?

     Mr. Hammer:  In the field; one composite bottle.

     Mr. Fisher:  In the original six plants?

     Mr. Hammer:  Yes.

     Mr. Fisher:  What do we do with those?

     Mr. Hammer:  Right now, I'm saying one large composite on each
site, each date.

     Mr. Fahrenthold:  I'm wrong, George.  I'm sorry; one bottle.

     Mr. Hammer:  Part of the reason for this day one was to insure
that we could do this.  Now, the question has always been, can you
shake two and a half gallon bottles to get enough uniformity about
it?  Essentially what we've got is one, big, hefty guy that just
shakes the stuff out of them and then takes the two sub-samples.

     Mr. Cowen:  I'd like to explain that on day three the bottom
line holding period is not really fixed.  It's my understanding
that at 96 hours, it's whatever the holding period was.  If he's
                                36

-------
able to do it in less than 24 hours, then, the holding period for
that control, the bottom line control, is whatever he had to hold
it for to get his spike in.  Those numbers, really, 24 and 96, are
variable depending on the laboratory running the spiking.  It
might be 24, it might be longer.

     Mr. Hammer:  Right.  So far we've hovered at about 96 to
about 105, something in that ball park.  It could be longer.
The whole idea was to find out if there's degradation.  Day two
was turned around in less than 24 hours, so we had information in
case day three showed degradation.  We could assure ourselves, at
least under the guidelines of the protocol, that we did not have
degradation problems, or hopefully not, and still have good spiked
data in terms of recovery.

     Mr. Fahrenthold:  Let me go back just a minute.  The sampling
that we did, or George Stanko did, was done in the same manner.  I
asked Clarence Haile while we were sitting here listening to Frank
Hammer.  The scenario in that particular case is exactly as you
see it, and I was wrong when I told you that there are many samples.

     Mr. Northington:  Did you see any degradation after 96, 105
hours?

     Mr. Hammer:  Not really.

     Mr. Flory:  There's a philosophy here that bothered me.
Going back to some of your assumptions at the beginning of your
talk, for example, do you assume that GC would always be
the most cost-effective method?  I presume you're talking about
doing all of these verification analyses here by GC methods, is
that correct?

     Mr. Fahrenthold:  Yes, that is.  That's a good question and
I'm going to tell you how the impact of this has changed.  It's
not a stepping stone, then, as it originally was, but go ahead.

     Mr. Flory:  Let me have a little dialogue with you.  My next
question is, when we get through these things, are these methods
then established in concrete, and are alternative methods going
to be difficult to get approved?  In my opinion, if someone, you,
or your development contractors, had not been clever enough to make
a GC/MS method less expensive than a GC method, that doesn't mean
that I'm not going to be able to.  I can pick many, many cases of
methods that are prescribed by NIOSH, et cetera, that use gas
chromatography.  They're specific.  There are all kinds of studies
that have been made into them.  Most of them I can take and de-
crease the time required to do the analysis sufficiently on a
GC/MS, to make it cheaper on a GC/MS than on a GC.  We do that
all the time.  We can take a 20 minute analysis and get it done
in three.  I want to be sure that wherever this goes that it
doesn't keep me from being able to do those things; from being
clever and innovative in my own laboratory.  I'm not getting the
feeling that that's where it's going as I sit here.
                                37

-------
     Mr. Fahrenthold:  Don, I'll comment on that later.

     Mr. Telliard:  We're coming to that.

     Mr. Mosesman:  I don't see how you can extract and analyze
some samples in the same day.  There are times, with emulsions
and whatever, when you can't.  You're extracting, evaporating down,
and analyzing the sample in the same day.  I don't see that happening
with every sample.  I see a lot of samples, between emulsions and
whatever -- especially pesticides, where we have to put them through
Florisil cleanups and evaporate them.  You would take two, three,
four days, maybe, before you can analyze the sample once you have
it.  You're assuming you're getting the sample at 9 o'clock in
the morning, which is not the case with most shipments of samples,
and working it through and having people leaving at 5 o'clock in
the afternoon.  I don't see how you can expect to extract, clean up,
evaporate, and analyze the sample in the same day on anything but
nice, clean samples.

     Mr. Fahrenthold:  If you leave at 5 o'clock and you get it at
9:00, you won't make it in the same day; not at the rate we take
samples, anyhow.  I don't know what the specific scheduling is
for manpower in the lab, but I feel sure that probably two shifts
are required during a day; not graveyard shifts, since we're not
taking a grab sample any more on the first day.  I do not have the
hands-on laboratory knowledge to be able to answer that specific
question with all of the details.  If you give me a couple of
months I will be happy to tell you how many were analyzed this
way, and how long it took, and what the results were, and a lot
of other things.  It's a little bit early to be able to specifi-
cally answer those kinds of questions.

     Mr. Mosesman:  Well, then, it's my contention that the people
who wrote this don't have hands-on experience in analyzing a sam-
ple.  They don't know it takes 48 hours to do a pesticide sample.
A system like this could never work.  Holding time on day two be-
tween the extract and the hold could be a lot more than 96 hours.

     Mr. Fahrenthold:  Well, I suppose the only thing I can tell
you is that it works and people do it.  I don't know the rationale
as to why it works and why your reservations aren't true, but I
can assure you that we have done it on numerous occasions.  It is
not easy.  It's very difficult, but one of the things that does
happen to go in our favor is that 30 days before this the man in
that laboratory had that identical sample and that same matrix and
if it formed an emulsion, then he had time to plan ahead and take
the proper precautionary measures to try to minimize the amount of
time it takes him to analyze that sample.  There is something I
didn't put up there.  He was not going into day one, day two, and
day three cold.  He has had some experience with that particular
sampling matrix.  If the plant where we are sampling does not do
something that's really far out to change the organic matrix, then,
we anticipate the same kind of procedure performance during the
three days of sampling that we saw during the grab sampling a month
before.
                                38

-------
     Mr. Mosesman:  Okay, aside from the work-up time, you're also
assuming that your instrument has 100 percent up time.  I can tell
you for a fact, by GC and GC/MS, that's not the case.  You can be
down a week on an instrument and not be able to analyze the sam-
ples at all, just because your instrument's not working.

     Mr. Fahrenthold:  That's very true.  We never assume that.
We encourage people to have more than one instrument that's flex-
ible enough to do that.

     Ms. Holtzclaw:  Paul, could I add something to this also?
The reason the third analysis on day three was added is because
we have no illusions that the sample is going to be done in one
day.  Were we able to get a sample, extract it, analyze it before
somebody went home that night, have them come in the next morning
and spike, and then do the second spiked analysis, we would have
no worry about the holding time.  We have seen, from our experi-
ence, that normally, as on day three, the spiked sample can be
run within 96 hours of receipt of sample in the laboratory.  This
means that there have been 96 hours during which they have taken
the sample and extracted it, analyzed it, and analyzed the data
to determine what spiking levels are necessary.  That's why we have
that check on day three; because we don't know how long that will
take, and we want to be able to show that whatever time it does
take, between the analyzed sample and the spiked analyzed sample,
that there has been no degradation.  We're not assuming 24-hour
turnaround on this.

     Mr. Mosesman:  Well, then, you should point out that that
96 hours just happens to be the number you found so far.  It could
be drastically increased.

     Ms. Holtzclaw:  Exactly.  When we were setting it up, and
that's what Bill Cowen was saying, we intended that number to be
very flexible.  We say 96 hours because we found from experience
that that's about the average; that number can vary anywhere from
24 hours if they're extremely lucky, on up, but it's there.  The
second and third samples on day three are analyzed concurrently,
simply  to show that when the sample is spiked and analyzed that we
haven't lost anything.  This is done on each of the matrices to
show — we may find in a plant, for example — that there's degra-
dation because of one matrix, and not because of another.

     Mr. Johnson:  We're working on this particular protocol and,
basically, the majority of our samples are finished in 24 hours.
Circumstances with samples do happen, however, and some of them
take it out to close to 96 hours, but it's not 8:00 to 5:00.

     Mr. Fahrenthold:  There is one additional point that I failed
to make.  That is, that this day one, day two, day three protocol
applies to every sample location.  A number of these plants have
between 12 and 25 sample locations; so, there is a considerable
amount of effort expended in the analysis of the samples taken,
depending upon the size of the facility.  There have been some
                                39

-------
additional revisions.  They are in the context of flexible cri-
teria, so I'm not going to go into the specific details of these,
but I will tell you about what has been done.

     First of all, the spiking levels that you see up here have
been revised from the original ones that were used, which was 100
parts per billion for every priority pollutant, and every sample.
There is a procedure that Priscilla Holtzclaw and Bill Cowen have
put together that calls for a varied spiking level depending upon
the value found in the unspiked sample.  We have also made some
revisions in the dictate that every sample has to be run by every
analytical method.  There are ways to eliminate the use of a
method at certain sample locations if, for example, the pollutant
was found in screening and it was not found in the grab samples
taken on the first visit to the plant; or, for example, it was
not found in the grab samples taken at the plant and it was not
found in the first day's composites of the plant, and we don't
anticipate that this particular priority pollutant could be
produced by any of the process chemistry in use at that site.
That situation would frequently arise if we found something on
screening, and it happens to be a one-time occurrence and we
never saw it again.  We would look for it again during the grab
samples on the first day's composites.  If it doesn't show up in
either of those two and we don't have any reason to suspect it
being there, then we forget about it.
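
     The varied spiking rule mentioned above is not spelled out in
this transcript, so the sketch below encodes only one plausible
reading, spiking near the level found in the unspiked sample, with
a floor for non-detects.  It is an assumption for illustration,
not the Holtzclaw/Cowen procedure:

    # Hypothetical varied-spiking rule; the actual procedure is
    # not specified in this transcript.
    def spike_level_ppb(background_ppb, floor_ppb=10.0):
        """Spike near the observed background so recovery is measured
        at a relevant concentration; use a floor for non-detects."""
        if background_ppb is None or background_ppb < floor_ppb:
            return floor_ppb
        return background_ppb

    assert spike_level_ppb(None) == 10.0    # non-detect: spike at floor
    assert spike_level_ppb(250.0) == 250.0  # spike near the found value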

     There is an additional consideration that we need to make.
Our previous constraint as to the limits of the use of GC/MS is
weakening somewhat.  During the period of time between December
of 1977 and today there have been, as you will hear later on today,
some considerations given to quantitative GC/MS.  Under certain
circumstances, because of cost-effectiveness and the relative in-
effectiveness of using six GC procedures to analyze for six single
components, it could be much more cost-effective to go to GC/MS.
That's a door that we have not opened entirely.  We would still
prefer to go a few more steps down the road on developing GC pro-
cedures before saying we have a protocol that we think is good
enough to stand the test of any sample matrix we might find, and
that we will now compare GC methods versus GC/MS methods strictly
on a cost-effectiveness basis.

     Mr. Flory:  Don Flory.  One comment there.  It would seem to
me like the best way of getting the analyses done is to have a
certification program to send out standards.  If people get the
right answers, no matter what methods they use, that should be
acceptable.  Have you thought about that approach?

     Mr. Fahrenthold;  Yes.  If you remember back to one of the
comments I made originally, I said it was our intention to develop
a method that could be used for some long term monitoring program,
like 60 days, or 90 days.  It is not our intention to say "This
method was used for that plant.  You may not use any other method."
We intend to say, "If you don't have the capability to develop your
                                40

-------
own method, here is one we have found that works.  If you use
another method, here is the QA/QC procedure you should use to make
that method a valid method."

     Mr. Flory;  Well, the thing that's been bothering me is that
I've seen some of the RFP's that have come out.  They say for con-
tractors to look into alternate methods.  Therefore, I assume that
it meant that a method is going to be pretty well fixed in con-
crete, and in order to be able to use any alternate method, you're
going to have to go through this elaborate procedure.  My labora-
tory, which is a small laboratory, is not going to have the money
to do that in order to get a procedure into the accepted methods.

     Mr. Fahrenthold;  I understand what you're saying.  I'm going
to answer you a different way and hedge it from a different direc-
tion.  The methods that we have are used for the purpose of pro-
viding the organic chemicals branch with engineering data; they are
not necessarily for the same purposes as the RFP's that you have
seen for developing analytical methods.  Now, whether those purposes
will coincide at some point or not, I don't know.

     Mr. Lichtenberg;  I'd like to make a clarifying point on the
methods' acceptability.  The EPA EMSL Laboratory in Cincinnati is
the official EPA method promulgating body.  The methods that Jim
Longbottom was talking about earlier will be the official recom-
mended EPA methods.  Now, what Paul Fahrenthold is talking about,
the methods that he will have there, may or may not be considered
equivalent methods.  In reviewing his methods, if we judge that
they are different methods, then there will be a need for pro-
viding data to show that they are equivalent to the methods that
we will be promulgating, which will have inter-laboratory study
data behind them.  So, just a point of clarification that the
EMSL will be promulgating the official methods -- these others
may very well be acceptable as alternative methods -- but that is
where the situation lies.

     Mr. Fahrenthold;  Yes, I feel that I owe you another comment.
If you'll look at the sheet that Jim Longbottom handed out, we are
now classified as others.  We're hoping to upgrade our efforts.
Before we have more questions, I should mention where the program
is and where it's going.  We have presently done 11 plants under a
scenario either very similar to this,  or in the original six, very
diverse from this, although as close as we could get under the time
constraints and the money constraints that we had at the time.  We
will probably do in the neighborhood of 25 more plants using the
basic GC methods, some of which are identical to the ones that Jim
Longbottom has mentioned in his sheet.  Based on our conversations,
they're very,  very similar.  I suppose that after we get through
with the 25 plants we should have a fairly decent handle on what
our methods and our versions of those methods will be worth.  We
don't always know where they're going to go, because we don't know
what we're going to find when we go to a particular plant and look
at the matrix that we have in that particular raw waste water to
pick the priority pollutants from analytically.  However,  we should
be close in a lot of cases.
                                41

-------
     There is one additional comment along that line.  Since we
have mentioned GC/MS, I think that it's important to note that any
kind of long range sampling by industry should consider the cost
of the technical options that you have.  You cannot know what the
most effective monitoring system is until you have looked at some
screening data rather carefully -- either GC/MS data, or screening
on various process lines with GC, or some other technique.  When
you have done that,  you can fairly readily determine whether GC/MS
is the most viable option for you, or whether some combination of
GC procedures is, in fact, the most viable option for you.  It was
Bruce Colby that mentioned that earlier this morning.  He's right;
this is not necessarily the best option for any single installa-
tion to use when monitoring their own particular facility.  It is
entirely possible that GC/MS, because of the distribution of the
priority pollutants throughout the method scheme that you have,
will in fact be the cheaper alternative, or some other method
could be for that matter.  It depends a lot on the capability of
the individual laboratory and the developments that are coming
down the road.

     There are a couple of things that are not in Jim Longbottom's
sack that he has given us the description of today that we are go-
ing to look at.  We're going to look at head space techniques for
haloforms.  Because we're interested in methods that are fast, the
analysis time is a significant factor when you try to do three
days worth of sampling; so, we're interested in looking at other
kinds of methods.  We're interested in looking at methods where you
use a fairly small amount of sample, and do sequential extractions,
perhaps on the same sample, by adjusting pH, or some other kind of
cleanup procedure.  This tends to minimize sampling problems — the
number of samples that you have to have, the number of large con-
tainers for extraction in the laboratory, and things like that.
There are still problems that persist.  They depend upon the plant,
and the kind of products that are made at the plant.  The two most
common ones are the phenolics and the polynuclear aromatic materials.
Those are still very difficult to analyze with the procedures that
are available at the present time.  When I say difficult, I'm imply-
ing that the method isn't as rapid as what we would, ideally, like
to have, or it's not as specific as what we would like to have, or
the separation technique is not as good unless you go to a capillary
column.  We have reduced a lot of our original restrictions and pro-
hibitions against people using capillary columns.  In some areas, it's
quite apparent after doing ten or eleven plants that some people will
have to upgrade their technology, or they will have to use contract
services, because that really is the most effective way to do it.  You
will either have to train somebody to do it if that happens to be your
discharge problem, or you will have to go outside and have somebody
do it for you.  The only way to get really reliable data is to use
one of these more sophisticated methods, in spite of the fact that we
have tried to use the simplest, most widely available one possible.

     Just one additional thing.  Normally, there are somewhere be-
tween five and ten analytical methods used per plant.  That includes
the analysis for metals.  This applies to organics, and normally,
                                42

-------
the sampling locations will range from 15 to 30 in some very large
plants.  There aren't too many plants, I hope, that have as many
as 30 sampling points.  The majority of them have something in the
neighborhood of 10 to 20 I suppose — something in that range.
That's about our story.

     Mr. Telliard;  Paul, thanks for the presentation.  I think
it's important to look at the needs that the various programs are
being confronted with -- the time constraints, the money -- and I
say when we get this all worked out, we can look back and say, gee,
isn't that wonderful.  Continuing on verification methodology,
Roger Novak from GSRI is going to talk about the methodology that's
presently being employed on the pulp and paper study as it relates
to their verification program.  Again, it's another twist, some-
thing different, and it's trying to meet the needs of developing
engineering data.  Verification has been misinterpreted a lot by
saying, "verifying." People say we're verifying an analytical
method.  It's not what we're saying.  We're verifying the effects
of treatment technology, i.e., what a clarifier will do to handle
these types of pollutants.  So, in some industries we've found
that it's been difficult, not because analytical methods are so
hard, but we can't find some of the things.  Also, Stan Norm, you
should be aware that pulp and paper, unlike the rest of us who
have 129 compounds, added a few.  So, this will be another varia-
tion.  Bruce Colby is going to give one presentation after this,
briefly, in a relationship similar to what GSRI is doing.
                                43

-------
                  VERIFICATION - PULP AND PAPER
                            R. Novak
                              GSRI
     Mr. Novak;  Today, I'd like to discuss the verification
analysis that GSRI is performing for the BAT review of the pulp
and paper industry. What I'm going to describe this afternoon is
something that is fundamentally quite different from what you
heard this morning about the Chemical & Organics program. Primar-
ily, the reason is because of the extremely complex nature of the
waste streams that originate from this particular industry.  I
will over-simplify by saying that the industry is involved in the
conversion of a natural product, wood, into a semi-synthetic final
product, paper and paper products.  Since the raw starting material
is of biological origin, it is chemically undefined.  Most of the
natural products that are found in a tree, i.e., most of the soluble
compounds, are deliberately removed by the industry so that their
presence does not compromise the quality of the end product.  This
industry is also blessed with its own particular set of priority
pollutants, or non-conventional pollutants.  These additional 13
compounds are mainly carboxylic acids that are of historical impor-
tance because of their aquatic toxicity.  These compounds, because
they are carboxylic acids, cannot be analyzed for under the con-
ditions of the regular screening protocol.  Because of problems of
extreme effluent complexity and the additional compounds, it was
decided that GC/MS would be the major analytical tool of the veri-
fication phase analyses.  GSRI was asked to propose an analysis
protocol that also incorporated some measure of quality assurance
above and beyond that normally associated with the screening
protocol.

     Our experience with the screening phase led us to propose the
purge and trap procedure for the 18 compounds of the verification
phase.  To impart a measure of quality assurance to the data, we
spiked with a number of deuterated solvents that in themselves
are priority pollutants.

     Thirty (30) semivolatile organics were identified in screen-
ing including the 13 industrial specifics.  GSRI developed a pro-
cedure to analyze for all of them in a single injection into the
GC/MS.  An acid neutral extraction was used followed by derivatiza-
tion of the phenolics and the carboxylic acids and capillary column
GC/MS.  Quantitation is done by response ratios relative to d10-
anthracene.  Every sample was spiked with a set of standards that
was representative of each class of compound that we had to look
for.  In addition, the data was retained on mag tape and underiva-
tized extracts were saved for repeat analysis, if necessary.
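
     To make the response ratio arithmetic concrete, the Python
sketch below shows one way to compute it; the peak areas and
concentrations are invented for illustration, and the function
names are hypothetical rather than GSRI's actual software.

     def response_ratio(area_cmpd, area_istd, conc_cmpd, conc_istd):
         # Relative response ratio, measured from a standard in which
         # the compound and internal standard (here, d10-anthracene)
         # concentrations are both known.
         return (area_cmpd / area_istd) / (conc_cmpd / conc_istd)

     def quantitate(area_cmpd, area_istd, conc_istd, rr):
         # Concentration of a compound in a sample containing a known
         # amount of the internal standard.
         return (area_cmpd / area_istd) * conc_istd / rr

     rr = response_ratio(8.0e4, 5.0e4, conc_cmpd=200.0, conc_istd=100.0)
     print(quantitate(2.0e4, 5.0e4, conc_istd=100.0, rr=rr))  # 50.0 ppb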

     The main emphasis of the verification sampling scheme is on
the influent to and effluent from treatment, where three 24 hour
composites were taken, essentially sampling each stream in tripli-
cate.  It was not feasible in this industry, because of the extreme
diversity of plant configurations, to routinely sample specific
                                44

-------
process streams, such as the Chemical and Organics Branch does.
However, specialty points were sampled if they could be clearly
identified with any particular pulp mill chemical process.  The
sampling also involved the typical blanks, the volatile organic
blank, the Isco sampler blanks, and GSRI method blanks.  Most of
these blanks were analyzed by GC/MS.

     The daily quality assurance program that an operator uses for
the actual analysis of the samples by GC/MS follows a rigorously
defined schedule.  Samples from a particular plant were analyzed
together as a set in the chronological order in which they were
taken.  Included in each sample set were the analyses of at least
one method blank, one external standard, both volatiles and semi-
volatiles, and at least one field blank.  The remaining field
blanks are analyzed by GC-FID.

     An external calibration standard was run with each machine
tune, or with each sample set, whichever came first.  If we had
to break up a sample set on a couple of different days, and the
machine was tuned on those days, then, one would also have to run
the calibration standard over again.  For every particular set of
analyses that was done on a particular machine on a particular
day, there was a standard that was also analyzed under those same
conditions.  The standard contained all the compounds of interest
so that it could always be referred to for quantitation.  For vola-
tile organics the standard was used as an external standard for
quantification; for semivolatiles, the standard was used for daily
update of the relative response ratios.  After each sample was run,
the operator displayed the extracted ion current for the internal
standards that were spiked into that sample, integrated the peak
areas, checked retention time and peak shape.  This data was then
generated in hard copy and filed for future reference.

     All of the internal standard peak areas were either plotted
or tabulated in a quality assurance log.  If there was any large
variation in the internal standards, the run was flagged,  and very
carefully re-examined,  and re-analyzed if necessary.  This type of
procedure allows the operator to immediately assess at the comple-
tion of each analysis if there are any basic errors in the GC/MS;
chromatographic failure, mass spectrometer drift, or gross proced-
ural errors in processing the sample.
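
     A minimal sketch of such a check, assuming a simple fractional
tolerance around the mean of the logged areas (the talk does not
specify the actual flagging criterion):

     def flag_runs(istd_areas, tolerance=0.5):
         # Flag any run whose internal standard peak area deviates
         # from the mean of the logged areas by more than the given
         # fraction of that mean.
         mean = sum(istd_areas) / len(istd_areas)
         return [i for i, area in enumerate(istd_areas)
                 if abs(area - mean) > tolerance * mean]

     # Run 3 would be flagged for re-examination and, if necessary,
     # re-analysis.
     print(flag_runs([102e3, 97e3, 101e3, 35e3, 99e3]))  # [3]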

     The verification phase volatile organic compounds are listed
in Figure 1.   They were mainly the halogenated ethanes and methanes
and a few aromatics.  All of these compounds had been identified
during the screening phase.  In addition,  an industry specific com-
pound group was xylenes.  It is thought that xylenes can originate
from the breakdown of wood during the chemical processing,  so
the xylene isomers were added as a non-conventional pollutant.

     Volatile organic analysis parameters are summarized in Fig-
ure 2.  The major exception to the screening protocol is that
every sample was spiked with antifoam C.  It is an absolute neces-
sity in the pulp and paper mill effluents to control the foam.  It
                                45

-------
                    VERIFICATION PROGRAM
                 VOLATILE ORGANIC COMPOUNDS
methylene chloride                      1,1,1-trichloroethane
chloroform                              tetrachloroethane
carbon tetrachloride                    trichloroethylene
dichlorobromomethane                    tetrachloroethylene
trichlorofluoromethane                  benzene
dibromochloromethane                    chlorobenzene
bromoform                               toluene
1,1-dichloroethane                      ethylbenzene
1,2-dichloroethane                      xylenes
                             Figure 1
                                45a

-------
      Figure 2     VOLATILE ORGANIC ANALYSIS BY GC/MS


Sparging Conditions

      Sample size:  5.0 ml + Antifoam C (3 ppm) + deuterated internal
        standard spike (50 ppb)*
      Sparge volume:  450 ml (30 ml/min x 15 min)
      Sparge gas:  UHP N2
      Adsorbent medium:  60/80 mesh Tenax GC
      Trap dimensions:  7.5 inches long x 1/4 inch OD stainless steel

Desorption Conditions - Backflushing

      At 200°C with 200 ml/min He backflush for 8 min, GC oven at -30°C

Mass Spectrometer - HP 5980/SI-150 Data System

      amu Range:  20-300
      Integration time:  6 msec/amu
      Source temperature:  150°C
      Silicon membrane separator

Gas Chromatograph

      Start GC/MS scanning at -30°C
      Reset oven to 50°C.  Temperature program of 50° to 220°C at
        8°/min; hold at 220°C for 16 min
      He carrier flow:  20 ml/min
      Column:  0.1% SP1000 on 80/100 mesh Carbopack C

 *benzene-d6         trichloroethane-d3        methylene chloride-d2
  toluene-d8         dichloroethane-d4         xylene-d10
                                  45b

-------
is not at all unreasonable to see the entire sample entrained into
a column of foam during the VOA procedure.   We found,  on the recom-
mendation of the NCASI, that antifoam-C (Sigma Chem. Co.) was very
effective in controlling foam.  It was usually added in the 3 ppm
range; however, sometimes it was necessary to add a great deal
more (100 ppm).  Antifoam-C did not interfere with the priority
pollutant analysis but there is a silicone contaminant from the
antifoam that elutes after xylene.

     Volatile organic analysis was performed on a 5980 HP mass
spec with an SI-150 data system.  The gas chromatograph was held at
-30°C while the sample was being thermally desorbed from the Tenax
trap.  The MS data collection and scanning were started immediately
after desorption.  This allows for very
sharp peaks of early eluting compounds.

     Deuterated internal standards were spiked into every sample,
blank, method blank, and the external standard at 50 parts per
billion.  The standards were bought from the various commercial
suppliers at purities >98 percent.  Chloroform was included origi-
nally, but a large number of the samples had high endogenous levels
of chloroform and it proved to be impractical to try and measure
the deuterated ion, which is only one a.m.u. greater than the
native, when you're dealing with such large concentration differ-
ences.

     The long term internal standard response ratio over approxi-
mately a five-month period  is given in Figure 3.  This data was
taken from the analysis of the external standard that was done 41
times during a five month period.  This corresponds to roughly the
analyses of 400 VOA samples.  Most of the variation probably re-
sults from the inherent mass spec tuning variation, operator judg-
ment in peak integration and the error inherent in actually spiking
the samples.

     Figure 4 is representative of a typical quality control chart
that was constructed during the analysis of the VOA samples.  In
this case, 14 samples were analyzed in a 20 hour period, which
during the course of the study was a normal working day for our
machines.  We plot the integrated peak area versus sample number
for the six deuterated compounds which are used as internal standards.
Samples one and two are the method blank and the external standard
followed by alternating influents to treatment and effluents from
treatment.  This graph shows that there were no matrix effects ob-
served in any of the samples.  These samples were 'typical' of the
foaming problems that we encountered.  Even when  'atypical' samples
were analyzed and the data from those examined, there doesn't ap-
pear to be any matrix effects on purging.

     The recovery of the internal standards from 18 different in-
fluents and effluents from treatment is shown in Figure 5.  The
recoveries are all relative to benzene-d6 and range from 90 to 100
percent.  There was no statistical difference between influent and
effluent from treatment in these samples.
                                46

-------
          [Figure 3:  Long-term internal standard response ratios from
           repeated analyses of the external standard -- graphic not
           reproducible]
                                46a

-------
                             Figure 4
                            PEAK AREAS
          [Internal standard peak areas versus sample number for a
           typical analysis day -- graphic not reproducible]

-------
          [Figure 5:  Recovery of deuterated internal standards from
           influents to and effluents from treatment, relative to
           benzene-d6 -- graphic not reproducible]
                                46c

-------
     The semivolatile pollutants are listed in Figure 6.   They
represent three classes of conventional pollutants,  the phenolics,
the polynuclears, and the phthalates.

     The non-conventional pollutants are the C18 unsaturated fatty
acids (oleic, linoleic, linolenic), the resin acids (pimaric,
isopimaric, abietic, dehydroabietic), the chlorinated derivatives
of these compounds created during the bleaching process, and
two specific phenolics (trichloro- and tetrachloroguaiacol).

     The semivolatile analytical protocol that GSRI  used is given
in Figure 7.  Samples were preserved in the field at the time of
collection by adjusting to pH 10.  This was to prevent the degra-
dation of the phenols and resin acids.  There are data in the
literature, some generated by EPA, to show this is a satisfactory
procedure for sample preservation.  Upon arrival at the lab the
sample was adjusted to pH 2,  spiked with the deuterated internal
standards and serially extracted with dichloromethane.  Emulsions
were a considerable problem with these effluents because of the
presence of naturally occurring soaps; fatty and resin acids.  On
a scale of one to ten in degree of difficulty of extraction, they'd
probably fall at an eight or nine; however, we used separatory
funnels exclusively.  The emulsions were handled
by filtration through glass wool and/or solvent addition.  The
sample was concentrated and spiked with the internal standard,
d10-anthracene, and stored.

     GSRI undertook a method development program to examine a num-
ber of different derivatization techniques.  Some of the on-column
methylating reagents gave rather strong reagent peaks right in the
middle of the chromatogram.  The classical way of derivatizing the
carboxylic acids is with diazomethane.  We decided not to pursue
that, mainly on a toxicological basis; the precursor of diazomethane
is a guaranteed mutagen.  The basic procedure GSRI arrived at for
derivatization is very simple.  100 microliters of the sample is
mixed with 50 microliters of MSTFA, which is a silylating reagent
available from Pierce Chemical and the sample is heated gently for
15 minutes (70°C). Then you're ready to inject.  Normally, we would
prepare enough samples in the morning to be analyzed in the mass
spec during the course of a day.  The derivatives are very stable
as long as you carefully exclude moisture from the sample.

     The actual GC/MS parameters are also given in Figure 7.  GSRI
used the 5985 HP GC/MS and SE-30 SCOT columns for the chromato-
graphic analysis.  The SCOT columns are sold by S.G.E. (Austin,
Texas).  One microliter of the derivatized sample is injected in
the splitless mode.  The capillary column is interfaced directly
with the mass spectrometer, i.e., 100 percent of the column ef-
fluent enters the source.  To overcome the problem of what trime-
thylsilyl derivatives will do to the source of the mass spec-
trometer, we vented the first 11 minutes of the run to atmosphere.
This kept the bulk of the reagent out of the source and we found
that we could analyze between 100 and 200 samples before the source
needed to be cleaned.
                                47

-------
               VERIFICATION PROGRAM
          SEMIVOLATILE ORGANIC COMPOUNDS
phenol                             naphthalene
2-chlorophenol                     anthracene
2,4-dichlorophenol                 di-n-butyl-phthalate
2,4,6-trichlorophenol              di-ethyl-phthalate
pentachlorophenol                  bis-(2-ethylhexyl) phthalate
2,4-dinitrophenol                  di-n-octyl phthalate
                                   butyl benzyl phthalate
               FATTY AND RESIN ACIDS
oleic acid                         dehydroabietic acid
linoleic acid                      monochlorodehydroabietic acid
linolenic acid                     dichlorodehydroabietic acid
pimaric acid                       epoxystearic acid
isopimaric acid                    dichlorostearic acid
abietic acid                       trichloroguaiacol
                                   tetrachloroguaiacol
                         Figure 6
                            47a

-------
      Figure 7     GC/MS SEMIVOLATILE ORGANIC ANALYSIS PARAMETERS

Extraction Conditions

     1 liter sample (pH>10)
     Adjust pH<2
     Spike with deuterated internal standards* (200 to 400 ppb)
     Serial extraction with CH2Cl2 (125 x 50 x 50 ml)
     Break emulsions by glass wool filtration or solvent addition
     Dry CH2Cl2 with sodium sulfate
     Concentrate sample by Kuderna-Danish evaporation to 1.0 ml
     Add 100 µg/vial d10-anthracene
     Store sample in 5.0 ml serum vial with Teflon/rubber septum

Mass Spectrometer - HP 5985

     amu Range:  35-450
     Scan speed:  300 amu/sec
     A/D per 0.1 amu:  3

Gas Chromatograph - HP 5840

     Column:  SE-30 SCOT (SGE D grade, >50,000 Neff)
     Flow rate:  22 cm/sec at 200°C
     Injection volume:  1 µl, splitless injection
     Temperature program:  30° to 260°C at 6°/min

Derivatization Conditions (trimethylsilyl derivatives)

     Place 100 µl sample plus 50 µl MSTFA in conical vial, heat
     for 15 min at 70°C, cool to room temperature prior to injection

*phenol-d5        octadecanoic acid-d35        naphthalene-d8
 diamyl phthalate
                               47b

-------
     Figure 8 shows a typical chromatogram for the acid/neutral
standard.  This is the calibration standard that is run every day.
There are 39 compounds present:  the 34 different priority pollu-
tants plus five internal standards.  This particular procedure and
GC temperature program (Figure 7) allows one to see all of the
neutral and acidic compounds through perylene.  The only priority
pollutants not detectable are the early eluting dichlorobenzenes
and a couple of the ethers which elute during the first eleven
minutes.  The total analysis time is 50 minutes.

     Figure 9 shows the variation in internal standard response
ratio over a one-month period.  The calibration standard was run
11 times and corresponds to the analysis of 120 samples.  The re-
sponse ratio of undeuterated anthracene is included to point out
that when using anthracene-d10 as an internal standard for quan-
tification the smallest variation in the response ratio is seen
between that of deuterated anthracene and undeuterated anthracene,
a nine percent standard deviation.  For stearic acid-d35, which
is a chemically dissimilar structure from anthracene-d10, the
percent variation in the standard deviation triples.  This was not
unexpected; it's related to the factors involved in the chemical
dissimilarity of the two compounds.  One could hypothesize from
this data that roughly one-third of the stearic acid variation is
due to the inherent variation in the mass spectrometer as shown
by the ratio of the anthracenes.  And the other two-thirds of the
stearic acid variation is somehow correlated with TMS derivatives
or other factors which are undefined.  The response ratio of phe-
nol is extremely high as a TMS derivative.  It is roughly ten
times what it is on the 1240DA column where it's run underivatized.
Also, our phthalate response ratios are very high.  Diamylphtha-
late is perhaps median in what one sees in the various phthalates.
It can go as high as three to one.

     The recovery of internal standards is in Figure 10.  The basic
observation is that there seems to be no variation between the
different types of matrices into which the standards were spiked.
The standard deviations are large but less than others presented
at this meeting.  However, these standard deviations represent the
overall error of the entire system, from measuring the sample out
into the separatory funnel until the actual analysis in the mass
spectrometer.  There are many points of error that accumulate and
give rise to these variations.  The confidence limits on these
means are about plus or minus 7 percent.  The standards were spiked
at 200 parts per billion of phenol, naphthalene, and diamylphthalate,
and 400 ppb of stearic acid.

     Figure 11 is a summary of the GC/MS analysis and it's broken
down according to the type of sample that was processed.  Between
30 and 40 percent of the entire GC/MS runs were blanks and cali-
bration standards, and roughly, 50 to 60 percent of the time was
spent actually running a sample on the mass spectrometer.  There
was a 7-8 percent reject rate.  The reject rate was based mainly
on hardware faults in the mass spectrometer or the software.
                                48

-------
          [Figure 8:  Reconstructed chromatogram of the acid/neutral
           calibration standard -- graphic not reproducible; the peak
           key is on the following page]
                                48a1

-------
    Key to Figure 8
 1    TMS-phenol
      TMS-d5-phenol
 2    TMS derivatization artifact
 3    TMS derivatization artifact
 4    TMS derivatization artifact
 5    naphthalene
      naphthalene-d8
 6    TMS-2-chlorophenol
 7    TMS-p-chloro-m-cresol
 8    TMS-dichlorophenol
 9    acenaphthylene
10    acenaphthene
11    TMS-2,4,6-trichlorophenol
12    diethylphthalate
13    anthracene
      anthracene-d10
      TMS-trichloroguaiacol
14    TMS-tetrachloroguaiacol
15    TMS-pentachlorophenol
16    di-n-butyl phthalate
17    fluoranthene
18    pyrene
19    diamyl phthalate
20    TMS-oleic acid
21    TMS-linoleic acid
22    TMS-linolenic acid
      TMS-stearic acid-d35
23    butyl benzyl phthalate
24    TMS-pimaric acid
25    TMS-isopimaric acid
26    TMS-dehydroabietic acid
27    chrysene
      TMS-abietic acid
      TMS-epoxystearic acid
28    bis(ethylhexyl)phthalate
29    TMS-monochlorodehydroabietic acid
30    TMS-dichlorostearic acid
31    di-n-octyl phthalate
32    TMS-dichlorodehydroabietic acid
        Figure 8
        48a2

-------
          [Figure 9:  Internal standard response ratios over a one-month
           period -- graphic not reproducible]
                                48b

-------

          [Figure 10:  Recovery of deuterated internal standards spiked
           into the different sample matrices -- graphic not reproducible]

-------
          [Figure 11:  Summary of GC/MS analyses broken down by sample
           type -- graphic not reproducible]
                    
-------
     In summary, it seems that the purge and trap procedure is
satisfactory for verification type analysis.  We did not use a
Tekmar; this is a manual purge and trap procedure.  The recoveries
of the deuterated standards were very high and acceptable.  There
seemed to be a lack of matrix effects, at least in the samples we
analyzed.  The use of the multiple internal standards is a benefit
because it allows the operator, or the analyst, to assess the data
in a real time mode.  One has a very powerful tool for detection
of GC and mass spec problems, matrix effects, and gross procedural
errors.  The routine use of capillary columns seems to offer no
particular hazards.  There are some tricks to the trade to make
them operate, but once those are learned one can use them more
efficiently than a packed column.  They afford an increased sensi-
tivity to certain classes of compounds, probably because of their
higher degree of deactivation.  Derivatization is an advantage
because it allows the analysis of many different classes of com-
pounds in a single GC/MS run.  GSRI has experienced no problems in
the routine use of the deuterated internal standards.  They give
rise to some unique ions for quantitation.  The ion for quantita-
tion can be selected by purchasing a particular degree of deuterium-
substitution.  There were no interferences and they add a minimal
cost to the sample.  The cost for the deuterated internal standards
was about $2 per sample.

     As a final note, the quality assurance aspects of the program
are buttressed by the fact that throughout the verification program
there was a split sample analysis program with the National Council
for Air and Stream Improvement, Incorporated (NCASI).  Our results
are in good agreement on the various analyses we have done.  NCASI
used GC/MS also, but entirely different types of procedures; a
Tekmar for VOA, packed columns, diazomethane for derivatization,
etc.  While I've concentrated on the actual analysis, one point I
didn't cover that is even more important, is the record keeping,
i.e., various aspects of the record management protocols that have
to be instituted to provide a concise data base.  I think that it's
more difficult to keep the numbers straight, than it is to keep
the mass specs running.  On that note, I'll close.

     Mr. Flory:  I missed the quality assurance.  Did you do tune
with DFTPP?

     Mr. Novak:  No. PFTBA.

     Mr. Flory;  Did the acid/neutral extract mean you extracted
in an acid pH and in a neutral and put them back together?

     Mr. Novak:  No.

     Mr. Flory;  It's somewhere between 2 and 7, then?

     Mr. Novak:  Two.  It extracts both the acids and the neutrals.
It's the flip-flop of base neutral.
                                49

-------
                        LABELED COMPOUNDS
                            B. Colby
                 Systems, Science, and Software


     Mr. Colby;  I would like to discuss a technology which makes
use of stable, labeled compounds as internal standards for their
unlabeled analogs.  The work I'll describe was performed using
several deuterated NMR solvents as internal standards for both
soap solutions and pure standard solutions using the methodology I
described this morning.  If I can just back up a moment, we iden-
tified a problem this morning with recovery from soap solutions.
The recovery from foaming samples was different from pure standard
solutions.  Attempts to evaluate recovery through the analysis of
spiked samples means both additional sample preparation and analysis
time.  If one could spike with stable, isotopically labeled com-
pounds of interest, the spiked sample data could be obtained in the
same sample run as the unspiked and the need to analyze additional
samples would be reduced somewhat.  The data that I'll be present-
ing shortly will verify that this approach has promise.

     There are three potential ways of overcoming the situation
where the internal standards are purged either more or less effi-
ciently than the compounds of interest.  They are (1) to avoid use
of an internal standard, (2) to analyze by the method of standard
addition, or (3) to use a more appropriate internal standard.
In the work I'll discuss here, the solutions were quantitatively
spiked with several deuterated compounds in addition to the normal
internal standards and the data reduced three ways:  without incor-
porating the internal standard data, using the conventional inter-
nal standards and by isotope dilution.  Isotope dilution is the
method which utilizes the deuterated  (i.e. labeled) analog of a
material as the internal standard.  In terms of the data, this
amounts to absolute areas, relative areas, and relative areas using
the isotopically labeled compounds.

     With isotope dilution, pure compound X, which has some iso-
tope ratio, Rx, is diluted or mixed with a labeled compound Y,
which has an isotope ratio Ry different from Rx.  The isotope ratio
of the mixture will change with the mole ratio of x to y.  The
equation at the bottom of the slide is a three isotope, mathemati-
cal description of what the mole ratio will yield in terms of the
three isotope ratios.

     The use of this model strictly as it stands is not totally
correct.  A correction has to be made for the presence of other
isotopes and other ion fragments, within the spectrum.  This can
be readily taken care of by calculating an apparent concentration
of the unlabeled material.
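
     The slide itself is not reproduced in these proceedings.  For
orientation, the sketch below implements the standard two-channel
form of the isotope dilution relation, which should be close in
spirit to the equation described; the ratio definitions and the
numbers in the example are assumptions, not the speaker's slide,
and the apparent-concentration correction just mentioned would be
applied on top of it.

     def mole_ratio(r_x, r_y, r_m):
         # N_x / N_y, the mole ratio of unlabeled X to labeled Y,
         # from the isotope ratios of pure X (r_x), pure spike Y
         # (r_y), and the measured mixture (r_m), where each ratio
         # is (unlabeled ion abundance) / (labeled ion abundance).
         return ((r_m - r_y) * (1.0 + r_x)) / ((r_x - r_m) * (1.0 + r_y))

     # 95 percent pure analyte and spike, mixed in equal moles:
     print(mole_ratio(r_x=19.0, r_y=1.0/19.0, r_m=1.0))  # approx. 1.0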

     The next slide shows the comparison of the three different
measurement approaches applied to a benzene sample.  The labeled
material was D6-benzene with an isotopic purity above 95 percent.
                                50

-------
The equation that's used to reduce the isotope data has implicit
within it a correction to accommodate less than 100% isotopic
purity of the standard.  The solid points in the slide are for
pure standards and the X's are for soap solutions.  The line shown
is a regression line through the pure standard data.  Note that
the absolute areas versus concentration for the pure standards fall
very close to the line, compared to the soap solution data.  Con-
trast this with the relative area data (the screening protocol
approach).  There is a significant improvement in the general
scatter of the data.  This demonstrates that the internal standard
is correcting or making up for some variation in the measurements.
These variations may have been caused by any number of things, in-
cluding sample aliquot volume and changes in performance.  When
these data are compared to the isotope dilution data an additional
improvement in precision is realized and it appears that all but
one data point fall on the regression line.

     The next slide shows the results for toluene.  The scatter or
precision for the absolute area approach is very severe with the
soap solutions even though it is not for the pure standards.  The
correlation coefficient for the pure standards is .999.  An improve-
ment in precision for soap solutions is realized by going to rela-
tive areas.  However, an offset in the curves is also encountered.
This offset results in a low indicated concentration for toluene
in soap solutions when the pure standards are used for calibration.
This offset is apparently due to a nonproportional change in purg-
ing efficiency for toluene and the internal standard.

     The isotope dilution data for toluene solves the offset prob-
lem totally.  In addition there is less scatter with the isotope
dilution data than with the other two data sets.  Isotope dilution
corrects for the difference in purging efficiencies for internal
standard versus toluene because the isotope dilution internal stan-
dard is, in fact, toluene, and deuterated toluene purges just as
effectively as naturally abundant toluene.

     The next slide shows the three sets of data for chloroform.
The absolute area data for chloroform is extremely poor for the
four pure standard data points in the upper right.  These form a
fairly reasonable straight line.  The only pure standard data
point which does not fall on this line is the 10 ug/l value.  The
regression line was drawn using all data even though a much better
fit would have been achieved if only one value had been excluded.
That data point, however, seems to be real and consequently was
not excluded.

     In the relative area graph, it can be seen that the use of
an internal standard again reduces scatter.  The lower concentra-
tion response, however, is still a problem and now appears to be a
real systematic error.  However, when the isotope dilution data is
inspected, the non-linear relationship problem is missing and all
the data fall on a straight line.  Because this data is from the
same set of acquisitions as the other two, it seems that the meth-
ods and not the measurements are the reasons for non-linearity.
                                51

-------
     The next set of curves again demonstrates a scatter problem
for soap solutions.  The relative area approach helps to correct
some of this, but isotope dilution is much more successful.

     The table in the next slide begins with a precision compari-
son of the calibration curves created from the standards which
were doped into pure water solutions.  Note that the correlation
coefficients for those are really quite good, except for the
chloroform.  Relative area values produce an improvement in most
cases.  The one case where this does not provide an improvement
is for the chloroform, where a systematic non-linearity rather
than a random error is the real problem.  The isotope dilution
approach improves precision and also solves the nonlinearity
problem.  The correlation coefficient goes from .913 to .999.

     The fourth line of data (across the bottom) shows the errors
which isotope dilution data produces without the use of a calibra-
tion curve.  It assumes that the theory is absolutely correct,
an assumption which appears valid for these four compounds.  This
approach is a parallel to standard additions except that the added
material is distinguishable from that originally present.

     The preceding data has clearly pointed out the technical
advantages to using isotope dilution for quantitation.  It must be
pointed out, however, that only four compounds were investigated
and this does not constitute an adequate sampling from which to
draw general conclusions.  Questions with respect to isotope ex-
change, availability, and additional material and data reduction
costs have not been addressed.  Also unaddressed are the analyti-
cal complexities of applying this type of technology on a large
scale.  I believe this work should be extended to additional com-
pounds and some cost estimates should be prepared.

     When I make an estimate of the material costs for 114 labeled
compounds, based on catalog prices for deuterated compounds  (the
least expensive labeled compounds), it comes to $1.98 per sample.
If an estimate of materials based on the most expensive deuterated
priority pollutant is made, it comes out to be about $5.  If the
estimate is based on 13C compounds, then it starts to get a bit
more expensive; $35.  Even at $35 per sample, the reliable quanti-
tation that isotope dilution produces may well be worth the cost.
It seems like it would be worth taking a more thorough look.

     Mr. Flory:  We've noted in our D10-anthracene that, after
it's been around a while -- it depends on our sample load how
long it's around -- we begin getting exchange within the deuter-
ated compound if it gets around water.  Have you noted those
kinds of problems in running blanks of just deuterated compounds?

     Mr. Colby;  I've done a lot more with deuterated compounds
than I've shown here, primarily associated with pharmacological
studies of body fluids and tissue extracts.  It is critical to
establish in the beginning that the labeled sites being dealt
with are not labile.  Of course, any site is labile, but the
kinetics as well as the thermodynamics have to be considered.


                                52

-------
     Mr. Flory;  So that had to be considered in selecting these
standards later on, perhaps.

     Mr. Colby;  That's right.

     Mr. Taylor;  Bruce, have you looked at doing some of the same
kind of calculations using your deuterated benzene versus regular
toluene, doing some of these ancillary studies, very similar to
what you did, but looking for priority pollutants that you don't
have on the deuterated level?

     Mr. Colby;  I have not, I could clearly do that very easily
with the data that I have.  I think that some of that was reported
to us just before I got up here, really, and I would anticipate
that what I would get for data in that area would not be much dif-
ferent.

     Mr. Taylor;  Now, everybody who has done this analysis can
take their own data and do something similar to what Bruce has
done.  By simply looking at the phenanthrene anthracene data in
their standards versus D10-anthracene, they will find that the
precision is truly remarkable, like around, it should be around
two to three percent.

     Mr. Colby;  On non-scanning data, no.

     Mr. Taylor;  You'd be surprised, I was surprised how good it
was.

     Mr. Colby:  There are some theoretical considerations which
indicate that it shouldn't be that good.

     Mr. Taylor;  One rationale that I have for why it's so good
is they coelute and the data points are so close that you tend to
ameliorate the problem of statistical sampling.  One other thing
about cost, if this were implemented, you should find that the
cost per compound should go down because there's going to be a
market.

     Mr. Colby;  In fact, I talked to the people at Merck yester-
day about costs and they estimated that the prices would go down
as much as 75 percent beyond what we see in the catalogs, right
now, so even $1.98 might be high.

     Mr. Taylor;  Another compound that should be of interest is
a D2-Acrylonitrile.

     Mr. Colby;  It's too expensive.

     Mr. Taylor;  It's $95 to do them.

     Mr. Colby;  In a test situation, yes, it is.  These are NMR
solvents.  You could buy them by the gallon and it wouldn't hurt
your pocket very badly, and so then it would be interesting to try
that.
                                53

-------
     Mr. Taylor;  Did you pay for this out of your own pocket?

     Mr. Colby;   Some of them, in fact, I did.

     Mr. Dudenbostal;  I realize we're just talking analyses here,
but a lot of us  get involved in the analysis of sediments for or-
ganics and I just want to warn people that if they're trying this
approach and using deuterated anthracene as an internal standard,
don't, because it just disappears in the sediment extract.

     Mr. Rushneck;  We're using isotopes, deuterium labeled com-
pounds in this case, in another way, but let me present the problem
and explain the solution that we have found for it.  This works for
about anything.   The analysis of low levels of certain of the polar
compounds by GC  or GC/MS is difficult and at times impossible be-
cause these polar compounds adsorb on glassware, on the column, and
on other surfaces, and anybody that's familiar with GC has been ex-
posed to this problem.  For example, on the base neutral column,
the present solution is to condition the column with large quanti-
ties of base, as it's called in the protocol, in an attempt to
transmit these compounds through the column.  The drawback is that
it's non-reproducible.  If you lubricate the column, as I c-
-------
of the isotopic carriers, and the other, Biemann calls chaser com-
pounds.  You can call them chaser, or lubricants, I like lubri-
cants, because we send the one we use through the column first.
This is sort of the bottom line.  The curve with the circles was
obtained by taking a freshly prepared SP-2250 DB column, putting
it in the chromatograph and then running successively larger quan-
tities, starting three-tenths to one nanogram of material in methy-
lene chloride with D10-anthracene as an internal standard and run-
ning these in successively larger quantities, then developing the
calibration curve.  The one with the circles is for the benzidine,
and benzidine appears on the order of 50 to 100 nanograms; below
that, you can't see it.  The hexagons are benzidine after we treat
the column by analyzing five micrograms of benzidine; we will get
small signals which are often quantitated as ghosts from this
treatment with benzidine.  The squares are the paraphenylenediamine
runs, which aid in pushing the benzidine through the column.  Of course,
most significant of all is the use of two to three micrograms of
deuterium labeled benzidine as a carrier to carry the unlabeled
benzidine through; and, under those circumstances you can easily
see five nanograms.  In this case, the sensitivity of the GC/MS was
not souped up in any way, we just used it the way we normally use
it for running priority pollutant analyses.  So, what I'm saying is
that in those instances in which you might not be able to deriva-
tize a compound in order to analyze it at low levels, or find some
other technique which allows you to analyze it at low levels, the
addition of a large quantity of the deuterated analog will let you
analyze that compound to these very low levels.  Questions?

     Mr. Moberg:  What did you lubricate the column with?

     Mr. Rushneck;  The one we used as a lubricant was the para-
phenylenediamine which goes through the column very rapidly and
coats some of the active sites.

     Mr. Moberg:  But you have to do that each time?

     Mr. Rushneck;  Yes, you'd have to put some paraphenylenediamine
in each of the samples in order to help the benzidine along in this
case.

     Mr. Moberg;  Has anyone else tried any kind of base condition-
ing of columns that last more than one time?  I'm interested.

     Mr. Flory;  Yes, we tried making up a column, just plain
SP-2250, and we did some base treating of the glassware and the
glass wool, and the first time we used that column and put in 40
nanograms of benzidine, I think we got an area on an Incos data
system of about 150,000 counts, but we made six or seven columns
in a row before we got one that would do that well.  Its perform-
ance got poorer, and we removed the column.  We cleaned the front
end.  One of the problems, of course, with the on-column injections,
is that a lot of material builds up in the front end of the column,
you know, the charred material.  We took the column out after about
three weeks of use, removed the glass wool, cleaned the front end,
KOH treated the front end of the column again, and while it was not


                                55

-------
as good as it was in the beginning, it was greatly improved.  We
used a three percent KOH solution.

     Mr. Henderson;   I'm not sure the concept of your lubricant is
a good one; in fact, it seems to me that the concept of the lubri-
cant almost invalidates the concept of quantitation by chromatog-
raphy, by gas chromatography.  That is, if you get a special effect
in this case, what about many other possible combinations where ef-
fects might occur?

     Mr. Rushneck;  Well, from the calibration curve I showed it's
obvious what the preferred solution to the problem is.  But we
wanted to test the lubricant concept to see if, indeed, it did aid
in order to avoid the expense of the deuterated compound.  Deuter-
ated benzidine cost us, for 50 milligrams, $500.  In larger quan-
tities, of course, as was mentioned before the price would come
down, but we wanted to find some cheaper means if we could.

     Mr. Henderson;   My worry is that the quantitation might suffer
because of the interaction of every compound that elutes before
every other priority pollutant -- is there always an interaction?

     Mr. Rushneck;  No, I think it's highly compound-dependent and
it's really the grossly polar things that you need to worry about.
There's another good reason for not using a column lubricant.  If
at some later date,  you want to find out what other things are in
samples that have been analyzed for priority pollutants, and you're
adding a lot of them, yourself, it can confuse that issue.

     Mr. Henderson;   That's only on screening samples, though.

     Mr. Telliard;  It wouldn't be a big come-down for verification.

     Mr. Colby;  I think that it may not be at all unreasonable to
think of a tremendous number of possible matrix effects that might
exist that will affect relative areas from one sample to the next.
The likelihood that they're large seems to be small.  The chance
that you bump into one now and then seems to be real.  The fact,
from my angle right now, is that I've seen retention times change
as a result of the contents of a sample.  If you get a sample (and
I think there are people here that will confirm this), that has a
high concentration of an alcohol in it and you get that on the head
of the column, some of it is going to act as part of the stationary
phase of the column for a little while and the retention times will
change accordingly.  Maybe I could get someone to
back me up on that,  but I've seen that happen.  I'm sure.

     Mr. Rushneck;  Well, I agree with you, Bruce, I think the
thing that we've all seen is that when you run a small quantity of
one of these highly polar compounds and then run a large quantity
of it, you see the retention time move forward, almost certainly;
and, I think, for example, if you want to deal with specifics, a
                                56

-------
specific example of a solution would be in the verification meth-
odology, when you're looking for a specific compound, to use this
deuterated carrier to eliminate all of these extraneous effects.
This, I think, would be of great value.

     Mr. Pfost;  If I understand you right, you've tried this on
packed columns.  Does it have the same effect on capillary columns?

     Mr. Rushneck;  I haven't tried it; I suspect it's been done.
One of the things that happens when you use highly deuterated mate-
rials, of course, is that the more you get the column resolution
up, the better the possibility is that you'll separate the deuter-
ated from the undeuterated compound.  Then I don't know if it's
going to work as well.  My suspicion is it will work nearly as
well.

     Mr. Telliard;  Our final speaker for today is Jim Lichtenberg.
For those of you who don't know, Jim is with the Environmental Moni-
toring and Support Lab, out of Cincinnati, and he's going to talk
to you about GC/MS quantification methods.
                                57

-------
                  GC/MS QUANTIFICATION METHODS
                        JIM LICHTENBERG
                       EPA, CINCINNATI, OHIO


     Mr. Lichtenberg;   Well, it's not very often I get the oppor-
tunity to come to a meeting to make a presentation and get geared
up to do it three times and then sit back down again.   Well, I'll
try to remember what I was supposed to say and carry on.  Before I
get started, I'd like  to let everyone here know that what I am go-
ing to present is not  something strictly from the EMSL laboratory,
but it's an evolution  of something from a document that EMSL pre-
pared for Effluent Guidelines some time ago, and with the cooper-
ation of the Effluent  Guidelines administrative personnel, the
regional people, some  of the effluent guidelines contractors, and
our EMSL laboratory, we put this package together.  I don't believe
there's anyone here in the room who will deny that quality control,
quality assurance, however you wish to refer to it, is a very im-
portant aspect of any  analytical program, and this is particularly
true of a program that has the dimensions and impact,  both for the
present and the future, of the one that we're discussing.

     I'm going to present something of an outline of the document
that we have put together and I have some copies of that document
here for distribution.  I won't pass it out ahead of time, because
as Dude says, I've got guts bringing it, but I don't have enough
guts to pass it out before I outline it to you.  At any rate, I
will pass it out, time permitting.

     You all know that the priority pollutant protocol, initially
prepared for screening analysis, was intended as a qualitative and,
at best, a semi-quantitative tool.  What we're trying to do now is
take that document and make it, as best we can, into a quantitative
tool and that's what this presentation is all about.  A quality con-
trol program, any quality control program, but especially the one
we're talking about here, has to be well documented; it has to be
integrated into field  and laboratory procedures; and,  in our case,
it has to be developed around each subcategory of the sample type.

There are a couple of phases that must be considered:  an initial
validation phase and on-going quality control.  We're trying to
present a program for doing that here.  Another item to keep in
mind:  in the verification phase, as we're considering it, not all
of the 114 compounds are going to be searched for in every sample.
A relatively small portion of that total will be of concern
in given samples, and  so the quality assurance program will be
oriented around that number of compounds involved with each subcate-
gory of the sample type.  The sample sources within the subcategory
of primary interest are those which have the largest impact on the
programs of Effluent Guidelines.  Those are supply water, the influ-
ent to treatment water, and the effluent from treatment water; and,
our program emphasizes the validation of the method on the effluent
from treatment.  The reason for this is, it is considered the most
                                58

-------
important water type in that it has the largest impact on the stand-
ards that will be set.   It's not to say that the others aren't im-
portant, but of primary importance in the validation is the effluent
from treatment.  Effluent from treatment method validation samples
have priority over all others.  The quality control thus developed
will be applied to the accuracy and precision data supplied for the
supply water and the influent to treatment water as a part of the
ongoing program.

     One thing that every laboratory must do before it gets into
the actual analytical program is to determine just how well it can
perform with a given method; so we have incorporated into the pro-
cedure, as a preliminary step to actual sample analysis, a deter-
mination of accuracy and precision on clean water samples, using
spiked distilled water, or clean background water.  The laboratory
then establishes what it can do with this method under those cir-
cumstances, telling you that under ideal situations this is what
the laboratory is capable of, this is what the method is capable
of.

     One other thing which you'll note if you read carefully.  May-
be you wouldn't note it if I didn't bring it up.  I left out one
very important aspect of the ongoing quality control program and
that is the periodic analysis of performance evaluation samples
and quality assurance samples such as those that are provided by
John Winter's group in the EMSL Laboratory.  That's the brief intro-
duction of the program.  One other thing you'll note, as you read
through this document, is the need for a definitions section.  I
didn't have time to prepare that, but I do intend to put it together,
because there are phrases and words developed in here that maybe
we're not all accustomed to.  Those of us who have been working
pretty closely with it know what they are, but those who haven't,
do not, so, I do intend to add that to the document and I'm sure
you'll recognize the need for that as soon as you have a chance to
look at it.

     Routine quality assurance, or quality control, is developed
around field blanks, method blanks.  These things are defined in
the original protocol and we do provide for the analysis of the
blanks within the protocol, within the quality assurance package.

Further, we provide for instrument calibration checks with the
GC/MS, and also with the GC if, in fact, the GC/MS combination
might also be used for the GC performance itself.  I think you're
all familiar with the compounds that are used for GC/MS calibra-
tion:  it's decafluorotriphenylphosphine for the base neutrals and
acids, and for the purgeables, it's pentafluorobromobenzene.  The
characteristic ions and criteria for these are given in the out-
line of the protocol.

     One of the criteria, or one of the conditions, for providing
the up-front validation data is that if a separatory funnel is used
for the extraction, then a separatory funnel must be used for the
                                59

-------
validation of the method.  If a continuous extractor is required
for the extraction of the sample, then a continuous extractor must
be used for the validation studies.  The entire package revolves
around the use of two spiking materials; not two compounds, but
two broad classes of materials.  The first class we have defined
as surrogate standards.  We call them surrogate standard spikes,
or surrogate spikes.  Secondly, samples are spiked with priority
pollutants, the specific priority pollutants which are of particu-
lar interest to you in your particular wastewater.  The final cal-
culations are developed around the use of internal standards, that
is, assuming that the proper internal standards can be selected
and used.  We have also allowed for calculations based on external
standards, if, in fact, we cannot find a proper internal standard
for selective compounds, or classes of compounds.

     I'll briefly outline the rest of the package that we have,
and then I have three overhead slides to show you.  If you were
confused with Paul Fahrenthold's little presentation of his QC
program on the slide, you might be a little more confused with
the slides that I have.  It is somewhat complicated, and the per-
mutations and combinations of duplicates and replicates are fairly
extensive.  They can be a little confusing, particularly the first
time through.  I will do my best to make it clear, and if it isn't
clear, you will be doubly confounded by reading the document, I
suppose.

     We provide spiking procedures, recommended procedures for
spiking the samples, both for the liquid extractables and the
purgeables.  We suggest in the protocol that, in the validation
study, we will spike four replicates at each of three different
levels.  Spiking procedures at one level will include spiking two
of the replicates with surrogate
standards only, and two of the replicates with surrogate standards
and the priority pollutant standards, so we'll have four and I'll
show this in a slide shortly.  We will do that for each of three
levels, so we'll have three concentration levels.  Now, the concen-
tration levels that we are proposing are two times the background
level, ten times the background level, and 100 times the background
level.  Now, we realize that there is a certain hazard in selecting
those factors, and individual samples may dictate that some varia-
tion in those concentrations will be required.
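
     As a minimal sketch of that arithmetic, assuming nothing beyond
the factors just stated, the spike levels are simple multiples of the
measured background; the Python below is illustrative, and the
function name and example value are ours.

     SPIKE_FACTORS = (2, 10, 100)   # levels one, two, and three

     def spiking_levels(background_ug_per_l):
         """Return the three proposed spike concentrations, in ug/L."""
         return [f * background_ug_per_l for f in SPIKE_FACTORS]

     # A background of 3 ug/L gives spikes of 6, 30, and 300 ug/L.
     print(spiking_levels(3.0))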


     In the protocol we provide equations for calculating both the
accuracy and the precision of the recoveries.  We provide those
for the surrogate spikes, for the background priority pollutants,
for the spiked priority pollutants, and in combination with the
surrogate spikes.  I hope that this becomes clearer as we take a
look at the slide in a few minutes.  After the validation study
is completed, which, according to plan, is to be done by a single
laboratory, using the criteria that I mentioned, we will have the
data on accuracy and precision and we will subsequently use that
data to judge the acceptability of succeeding data, both from that
laboratory, assuming that it remains in the program, and from other
laboratories that are analyzing samples from the same subcategory.


                                60

-------
After the  initial validation studies, our first day ongoing quality
control will incorporate three replicates, or three aliquots, I
should say, of a sample.  Two will be spiked with surrogate para-
meters only, and one with surrogate parameter spikes and priority
pollutants.  With that data, as  it is accumulated, we will derive
the accuracy and precision statements for that laboratory, or more
specifically, that particular subcategory in that laboratory.  I
think the  calculations of mean percent recoveries, their ranges,
the critical range values, and so on, which are described, or dis-
cussed in  the package are all in common use in the statistical
analysis of data.  They are defined and the use of them is ex-
plained in Chapter 6 of the EMSL Quality Control Manual.  The for-
mer edition is out of print, but the current edition will be avail-
able soon; it is in press right now, it's been in press for about
three weeks.  I have a copy of that here for the perusal of anyone
who happens to be interested.  So we have not included, in the QC
package itself, for this protocol, a lot of detail on how to use
the data,  or how to manipulate the data.  We have given the basic
equations  for determining the figures that need to be used in pre-
paring the accuracy precision data and the control limits.  I
think with that very brief presentation I'd like to take these
three overheads and try to make a little clearer what I have said.
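
     The mean percent recoveries, ranges, and critical range values
mentioned here follow standard control-chart practice.  A minimal
sketch for duplicate (n = 2) subgroups; the constant D4 = 3.267 is
the standard control-chart factor for subgroups of two, and the
names and data below are ours, not quoted from the EMSL manual.

     D4 = 3.267   # control-chart constant for subgroups of two

     def critical_range(duplicate_pairs):
         """duplicate_pairs:  list of (x1, x2) percent recoveries."""
         ranges = [abs(a - b) for a, b in duplicate_pairs]
         return D4 * (sum(ranges) / len(ranges))

     # Any pair whose range exceeds this value is suspect.
     print(critical_range([(82.0, 85.0), (78.0, 74.0), (91.0, 88.0)]))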

     This  is the flow diagram of the initial validation studies
that we're proposing to carry out on each subcategory of samples.

     Here's the situation:  We are collecting composite samples
with the automatic samplers.  It may take up to three of the two-
and-a-half-gallon composite samplers to provide enough sample to
carry out  this program.  We intend to carry this out on the base
neutral acids as a group.  We have the replicate composite and
I've shown the three as one here.  This is not necessarily so, but
if we were using a single two-and-a-half-gallon composite, it would
be one; if we were using three, as circumstances might dictate, we
would have three of these.  What we will do is take one aliquot,
one liter of that sample; we will analyze it immediately on receipt
for background.  We will use that background data to select the
spiking levels, spiking levels one, two, and three, 2X, 10X, and
100X.   The duplicate pair A in the first case, one liter spiked
with surrogates only, and the duplicate pair B spiked with surro-
gates and priority pollutants, based on the 2X, would constitute
concentration level one.   At spiking level two, we have the same
situation repeated with surrogates only in C, and with surrogates
plus priority pollutants in D at 10X; and spiking level three, we
have at E the similar situation of A and C, where we have one liter
with surrogates only spiked, and in F we have level three of surro-
gates, plus level three of priority pollutants.  Each of the repli-
cates, spiked with surrogates only, are spiked at the same level,
and we have selected, at least for the interim, 100 micrograms per
liter as the spike level for each of the A, C, and E pairs; and the
surrogate level in B, D, and F will also be 100 micrograms per liter.
I neglected to mention that after selection of the one liter sample
for background analysis,  we will hold the remainder sample at 4 de-
grees C until that analysis is complete.  Then we will analyze the
                                61

-------
samples, extracted from each of these aliquots by the Protocol pro-
cedure, by GC/MS.  From the data derived here we will calculate the
precision of the priority pollutant background, in A, C, and E.
We will have actually six replicates, three pairs, on which we have
the data for doing that.  The same situation will hold for the sur-
rogate spikes in A, C, and E, so we will have six pieces of data,
three pairs of replicates.  We'll be calculating the precision of
the surrogates and the precision of the background priority pollu-
tants at that point.  In B, D, and F, we will be calculating the
accuracy of the recovery data for the surrogate spikes and of the
priority pollutants.  So, we will have one pair of replicates at
spiking level one, one pair at spiking level two, and one pair at
spiking level three, which will be used for calculating the accu-
racy of the recovery data for the surrogates and the priority pol-
lutants.
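
     To keep the pairs straight, the design just described can be
restated compactly; the Python table below is our own summary, with
the levels and spiking factors taken from the talk.

     SURROGATE_UG_PER_L = 100   # same surrogate level in every aliquot

     design = {
         "A": {"level": 1, "pp_factor": None},  # surrogates only -> precision
         "B": {"level": 1, "pp_factor": 2},     # surrogates + PP -> accuracy
         "C": {"level": 2, "pp_factor": None},
         "D": {"level": 2, "pp_factor": 10},
         "E": {"level": 3, "pp_factor": None},
         "F": {"level": 3, "pp_factor": 100},
     }

     print("surrogates at %d ug/L in every aliquot" % SURROGATE_UG_PER_L)
     for pair, spec in design.items():
         if spec["pp_factor"] is None:
             kind = "surrogates only"
         else:
             kind = "surrogates + PP at %dx background" % spec["pp_factor"]
         print("pair %s: level %d, %s" % (pair, spec["level"], kind))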

     If the priority pollutant levels in B, D, and F produce no
significant effect on the recovery of the surrogate spikes, you
will also be able to include the pair of surrogates spiked B, D,
and F in the A, C, and E grouping, so that you then have 12 sets,
or pieces of data, to use there.  That is a pretty quick rundown.
I'm not sure I covered all of the points that can be made, but
maybe I should entertain some questions right here.

     Mr. Northington;  What are surrogates?

     Mr. Lichtenberg;  Surrogates are compounds which are struc-
turally similar to the compounds of interest, but which are not
expected to be found in the samples.  Fluorinated compounds con-
stitute an example of a compound type that might be used.

     Mr. Northington;  Don't you need a duplicate analysis on your
background sample?

     Mr. Lichtenberg;  Yes, we do, we actually have it.  We have
the duplicate background on A, C, and E; in fact, we have six
replicates there.

     Mr. Flory;  What's the purpose of the surrogates, Jim?

     Mr. Lichtenberg;  The surrogates are really the basis of the
whole ongoing QC program.  We're using those to get a fundamental
accuracy and precision with which to carry on the remainder of the
program.  We establish the initial validation of the method with
the surrogates, as well as the priority pollutants, and then, in
subsequent operations we will be using only the surrogates.

     Mr. Northington;  What's the one-liter aliquot for?

     Mr. Lichtenberg;  The one-liter aliquot is for initial analy-
sis for background so that we can select a reasonable spiking level.
We're spiking B at two times the background; we're spiking D at ten
times the background.


                                62

-------
     Mr. Northington;   Can't you get that information from A, C,
and E?

     Mr. Lichtenberg;   Yes, but not ahead of time; we'd be spiking
blindly if we did that.   We're proposing to do at least spiking level
one after we have that information, that is, when we know what 2X,
two times the background, is.  Paul tried blind spiking earlier and
came up with no data,  after having consumed the analytical expense.

     Mr. Colby;  Jim,  has your organization had an opportunity to
go through this procedure with an effluent yet, and, if so, what
are your estimates of time involved in terms of man hours?

     Mr. Lichtenberg;   My own laboratory has not.  This is a docu-
ment that was put together by several of our laboratories and con-
tractors.  Some of our people are involved now with the actual im-
plementation of it.  Bob Kloepfer and others have been working with
the Protocol, but I can't tell you how much time they have actually
spent with the procedures.

     Mr. Colby;  Are there any people here that can?

     Mr. Lichtenberg;   Before you answer that, and before you go
any further with that line of thinking, consider this.  This is not
something that is proposed for every day and this is not something
that everybody's going to have to do.  This is the initial up-front
validation which selected contractors are going to do before the
program begins.

     Mr. Colby;  It makes the GC method look very inexpensive.

     Mr. Lichtenberg;   Except that it won't work with straight GC.

     Mr. Novak;  I went through the proposed program for days two,
three, and four sampling, and if we had been required to do that
type of procedure for the pulp and paper verification, it would
have taken 14 mass spectrometers, or four years; it's a seven to
ten-fold increase in sample load.

     Mr. Lichtenberg;  I hope that your calculation is in error.
I believe it is.  I hope that you misinterpreted what you saw
there.  We'll find out, okay?


     If we ignore the treatment of the samples at the various spik-
ing levels, the principles of spiking are the same in all cases and
should be applied to both the liquid extractables and the purgeables.
This is an outline of what is written in the document on preparation
of the purgeable samples before they're selected for aliquots A, B,
C, and D, and so on.  This is the way Tom Bellar prepares his sam-
ples.  It deviates a little from the current protocol in that it
requires more sample, but if you're out there collecting grab sam-
ples, compositing these things, it is not that consequential, as
far as I can tell.  At any rate, the grab samples are shipped to
the laboratory, they're composited according to the procedure that
is provided in the write-up; and at that time, an aliquot is taken

                                63

-------
out for background analysis.  Twelve clean 40-ml vials are then
filled with the remainder of the composited samples and sealed
according to normal sampling procedures.  Three clean 120-ml vials
plus three 40-ml vials may be used as an alternative here.  These
samples are held at 4 degrees Centigrade until the background analy-
sis is complete.  After a determination of the intended spiking
concentrations from the completed background analysis, two 5-ml
aliquots are prepared from one of the 40 ml vials.  At this point,
according to the Protocol, the samples are contained in two 5-ml
syringes.  The spikes, 5 microliters of surrogate in each case,
are added through the syringe valves, then we have the duplicate
pair A, two replicates spiked at the same level with surrogates
only.  To spike the sample at level one with surrogates and prior-
ity pollutants we take one of the 120-milliliter vials or three of
the 40-milliliter vials, fill a 100-milliliter volumetric flask
with these to the mark.  Then we spike with 20 microliters of
methanol standard priority pollutant spike.  Then, you take two
5 milliliter aliquots from that and you go through this operation
that you did with A, spiking the surrogate standards into that;
so that now, you have surrogate standards and priority pollutant
standards in the B duplicates.  I think it's clear, here, are there
any questions on this?
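
     The spiking arithmetic above can be checked directly:  5 micro-
liters into a 5-ml syringe is a 1:1000 dilution, and 20 microliters
into a 100-ml flask is a 1:5000 dilution.  A minimal sketch; the
stock concentrations below are hypothetical, chosen only to land at
the 100 micrograms per liter level mentioned earlier.

     def spiked_concentration(stock_ug_per_ml, spike_ul, final_ml):
         """Concentration added to the sample, in ug/L."""
         micrograms = stock_ug_per_ml * spike_ul / 1000.0   # uL -> mL
         return micrograms / (final_ml / 1000.0)            # mL -> L

     # 5 uL of a 100 ug/mL surrogate stock into 5 mL adds 100 ug/L:
     print(spiked_concentration(100.0, 5.0, 5.0))
     # 20 uL of a 500 ug/mL standard into 100 mL adds 100 ug/L:
     print(spiked_concentration(500.0, 20.0, 100.0))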

     Mr. Flory;  I have one minor one.  I never did understand why
you made up a standard in 100 milliliters of water and then took 5
milliliters out and put it in a syringe, when you could just as
well put these standards into the syringe without ever having the
100-milliliter volumetric flask involved.

     Mr. Lichtenberg;  Well, I can tell you this, that the proced-
ure described here has produced the best accuracy and precision
that Tom Bellar has been able to achieve in our laboratory.  This
is the way he does it, and this is the way we feel it should be
done.  There could be compromises made to that, but any compromise
will provide less accurate, less precise data.  We can make com-
promises, but with a sacrifice in data precision.

     Mr. Colby;  Is there any real reason to want to make a measure-
ment which is significantly greater in precision and accuracy than
the estimate of flow rate from the effluent source?

     Mr. Lichtenberg;  All I'll say is that you always strive to
get the best number you can from the analytical point of view; at
least we feel that way.

     Mr. Flory;  I object to that statement.  You have got to do
the analysis as well as it needs to be done for the job at hand;
otherwise, you waste money.

     Mr. Lichtenberg;  We now consider day one in the ongoing
quality control phase.  In this phase, the proposal is to take
three aliquots of the sample, spike two of those aliquots with
surrogate spikes only, and one of those aliquots with surrogate
spikes and priority pollutant spikes.  The pair with surrogates
only will be used to calculate the precision of the background
and the precision of the surrogate spikes.  The single sample


                                64

-------
spiked with priority pollutants and surrogates will be used to
determine the accuracy of the surrogate spikes and the priority
pollutant spikes.  Any comment there?

     Mr. Taylor;   What, again, is the purpose of the surrogate?

     Mr. Lichtenberg;  The surrogate is to give you a complete
picture of the entire method, from start to finish, spiked in
at extraction and carried through the process.  Data gathered
this way will be  compared to the validation data that was provided
initially.  Every sample that is run will be spiked with the same
level of surrogate spike.

     Mr. Telliard;  We're going to run each of the additional ones
in duplicate and  only with internal standards, correct?

     Mr. Lichtenberg;  We're not going to run them in duplicate,
only singly.

     Mr. Hammer;   When you're only doing duplicate extractions on
a single sample on that first day, how do you get precision any
more than a definite maybe?

     Mr. Lichtenberg;  That's just one day's data.  As you accumu-
late data on succeeding days, you will be able to provide a better
statement on the  precision and the accuracy.

     Mr. Hammer;   I understand what you're saying there, so, on the
first day, really, you're just getting a handle on whether or not
you're even in the ball park?

     Mr. Lichtenberg;  That's right, so we will have to accumulate
data over a period of time before we can prepare a significant
statement for accuracy and precision.

     Mr. Johnson;  I didn't understand on this.  Do you follow
these procedures  at all of the sample points, or just two of them?

     Mr. Lichtenberg;  You do this on the first day samples, from
each subcategory, at each of the three sampling points that are
outlined:  influent to treatment, effluent from treatment, and
supply water.


     Mr. Johnson;  Just for verification.  You still have a lot of
other points.  You would not do it at the other points?

     Mr. Lichtenberg;  I won't say that you would not do that, but
I think that would be up to the project officer.  We're proposing
that the minimum  validation of the method should be done on these
three sample sources.  The project officer could very well require
you to do additional work.  We consider this to be the minimum and
the effluent guidelines are constructed based on the assumption
that these are the most important sample source types.  Dude?
                                65

-------
     Mr. Dudenbostal;  I'm sure you mentioned this, but before you
hand this out, I want to say that this is only the seventh draft
and there are still a few changes to be made in it.  You people
will probably spot a few errors and whatnot.  They will be cor-
rected.  The second point is that the identification of compounds,
qualitative identification, will not be done through a search for
just a single mass spec peak; the entire mass spectrum has to be
used to tell you whether or not that compound is present.  We
realize that we still have a big problem with false
positives, but this will, at least, be of some assistance to us.

     Mr. Lichtenberg;  That's correct.  That statement is not in
there now, but it will be, to use the full mass spectrum for quali-
tative identification.

     Mr. Flory;  Do you know, now, how many injections this makes
per day, and how many mass spectrometers will be needed to get this
done in a reasonable length  of time?

     Mr. Lichtenberg;  There was a number there several drafts ago.
All of that was there.  I think I can give you an idea of what the
figure is.

     Mr. Flory;  Also, would it be true that in some categories
you would not be doing all extracts, or is that not right?

     Mr. Lichtenberg;  I'm not quite sure what your question is.

     Mr. Telliard;  Yes, the answer to your question is yes.  Assum-
ing that you have more than  one extract in a case where only one is
required.

     Mr. Flory;  And the compound list may be such that you won't
have to go for 45 minutes in the run, that you might be able to
stop at 25, or so?

     Mr. Lichtenberg;  The question is, will the whole chromatog-
raphic run be required if the compounds of interest elute before
the end of the run?  That's  a question for Shackelford and the
project officers and the effluent guidelines people.  I suppose
so, but one thing you do have to do is run it long enough to clear
the column.  Your actual analytical time might stop short of that,
but you're still going to have to run it through the program to
clear the column.

     Mr. Flory:  But that would save you time, data management and
so on.

     Mr. Lichtenberg;  Yes.

     Mr. Flory;  It doesn't  stop with acquisition?

     Mr. Lichtenberg;  Right.  I will pass out the documents that
I have and before I do that, I will indicate at least three typo-
graphical omissions, or errors, that are in it.  There may be more,
but I haven't found any more yet.  I will pass these out now.

                                66

-------
Priscilla, do you want to help, or Paul?

     Mr. Flory;   Jim, are you going to provide us with the other
table, the next table, the day two continuing QA?

     Mr. Lichtenberg;  No, the day two continuing QA is simply one
of those.

     Mr. Flory;   So, that's just, that's no more work than we're
already doing, that's just one...

     Mr. Lichtenberg;  That's right.

     Mr. Flory;   ...That's sort of a one man, one vote, or one-
tenth of one analysis.

     Mr. Lichtenberg;  Right, that's no additional workload at all.

     Mr. Flory;   So, that there is almost no difference, almost no
difference whatsoever.

     Mr. Telliard;   It is approximately 4:30.  I would like for you
to take the methodology that Jim is passing out.  I would also like
for us to start earlier tomorrow morning.  You will have a chance
this evening to look at this material, and, I hope, comment on it
tomorrow.  How about 7:30 in the morning?

     Mr. Lichtenberg;  When you all have a copy, I'll give you the
three corrections that I have, if anyone notes anything between now
and tomorrow, let me know, and I'll try to give those to you in the
morning.

     Does everybody have a copy in hand?  All right.  On page six,
equation two, the equation should read:  P bar equals 100 times the
summation of O sub i, over N times T.  The T was left off.

     On page 11, on the line immediately following equation four,
the X should have a bar over it.

     On page 12, equation five, the C and the N in parenthesis
should have a line between them, C over N.  There may be more; I
have not detected any others at this point.  That's all I have
right now, then.


     Mr. Chian;   What's the meaning of capital T?

     Mr. Lichtenberg;  It's on the next page, following that equa-
tion, true value of the spike.
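
     Putting the corrected equation together with that definition,
mean percent recovery is P bar = 100 times the sum of the observed
values O sub i, divided by N times T.  A minimal sketch, with our
own variable names:

     def mean_percent_recovery(observed, true_value):
         """P-bar = 100 * sum(O_i) / (N * T)."""
         return 100.0 * sum(observed) / (len(observed) * true_value)

     # Four replicates of a 100 ug/L spike, observed in ug/L:
     print(mean_percent_recovery([82.0, 91.0, 78.0, 85.0], 100.0))  # 84.0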

     Mr. Telliard;   They're going to tear down this room for the
dance; so take everything with you when you leave tonight, anything
you don't wish to give up to the junkman; and be here tomorrow
morning, 7:30, same stadium.


                                67

-------
                            PANEL DISCUSSION

                             MARCH 9, 1979


     Mr. Telliard;  Leaving off from yesterday, we were going to have
a panel discussion covering the EMSL adaptation of, or variance to,
the screening protocol, and since everyone did their homework
last night and diligently ran out and read all of this material,
we're all set for the quiz this morning.  So, I'll turn it over to
Jim, who is at the end and in the wrong place, again.   Bob Kloepfer
will also be participating.

     Mr. Lichtenberg;  I didn't know we had seat assignments, Bill,
but at any rate, I'm here on the end.  Yesterday most of the people
obviously didn't have a chance to read it ahead of time.  That was
not by design, but, in fact, I wasn't able to get the thing ready
in time for everybody to have a good, strong look at it.  But I'm
here now if the 50 percent that showed up this morning are still
interested in discussing the thing.  I guess we can start by answer-
ing any questions that you might have.  Hopefully, we can answer
them.

     Just to be sure that there's no misunderstanding about how
this document was derived, I again acknowledge the organizations
involved with helping to put this package together.  They included,
besides our own laboratory, the effluent guidelines people in Wash-
ington; and a number of our regional laboratories, I think all of
the regional laboratories that are involved in the actual analyti-
cal work in support of Effluent Guidelines; and a fair number of
the contractors who have been involved with contract analysis for
the Effluent Guidelines.  The package that we put together results
from a concurrence among us.  It is our considered judgment that
this is a reasonable approach, one that effluent guidelines can
afford, and one that we're going to implement as soon as we pos-
sibly can after this meeting.  Is that a correct statement?  So,
then, I'll let it go at that and ask if anyone else wants to make
some opening remarks.

     Mr. Novak;  I'd like to make a comment to Jim.  Yesterday I
said his protocol would take seven to ten times the effort, and I
had based my estimate on a precursor to this document that I had
seen.  Now I say that it would not take seven to ten times the
effort to redo the pulp and paper verification by this protocol.
It would take about four times.

     Mr. Kloepfer;  There were several questions that came up
yesterday about the surrogates, and I think that the most innova-
tive part of this verification QC program is probably the use of
the surrogates.  I see the value in several respects.  Number one,
the surrogate adds no workload, or virtually no workload, to the
analysis.  It's something that's put into each and every sample,
and the amount of time involved is virtually none.  So that's a
                                68

-------
good point in its favor.  Number two, it allows you to rapidly
determine control limits, at least for the surrogate.  And the
biggest value that I see is that it allows you to monitor for
gross processing errors.  For example, after you've run your first
ten or fifteen samples you can determine the limits for the surro-
gate.  If you're running, let's say, pentafluorophenol in the acid
fraction, and the median recovery established for that particular
surrogate is 78 percent, with a variation of plus or minus 20 per-
cent, then with this type of control you know that 99 percent of
the time you should get a recovery for that surrogate of between
58 and 98 percent.  Again, using this example, if you're running a
bunch of samples and you run one where, perhaps, you only have 12
percent recovery, you have immediate feedback and you know that
something went drastically wrong on
that sample.  I can think of only two things that could have gone
wrong; number one, there was some weird, strange, matrix effect,
or, two, you blew it, the Snyder column blew up, you dropped the extract,
you know, something drastic went wrong.  It's the type of QC that's
included in every sample and I think it's a very innovative approach
to quality control.
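
     A minimal sketch of that control-limit idea, using his example
numbers (median recovery 78 percent, limits of plus or minus 20 per-
cent); the recovery data and flagging below are illustrative, not
from the document.

     def control_limits(recoveries, half_width=20.0):
         """Median recovery +/- a fixed half-width, per the example."""
         ordered = sorted(recoveries)
         n = len(ordered)
         median = (ordered[n // 2] if n % 2
                   else (ordered[n // 2 - 1] + ordered[n // 2]) / 2.0)
         return median - half_width, median + half_width

     first_runs = [75.0, 80.0, 78.0, 82.0, 76.0, 79.0, 77.0, 81.0,
                   74.0, 78.0]
     low, high = control_limits(first_runs)   # about (58.0, 98.0)
     if not (low <= 12.0 <= high):            # the 12 percent example
         print("out of control: gross error or matrix effect")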

     I'd like to present some data that Dr. Fairless and I have
collected over the past year on recoveries of priority pollutants.
First, let us consider examples from priority pollutants spiked
into organic-free water.  This is a relatively easy matrix, a clean
matrix.  I'll comment on real samples later, but this is what three
laboratories were able to do working with a clean matrix.

     Dr. Novak;  What kind of laboratories were involved?  Con-
tracts, regionals, or what?

     Mr. Kloepfer;  Well, I'm trying not to give away the identi-
ties of the three laboratories.  We're talking about one regional
laboratory, and two contract laboratories, but I won't say which
is which.

     Here is a table that contains data for four of the base-neutral
compounds that were run.  We have data for most of the base neutrals,
but I have just pulled out four at random to give you an idea of
the types of recoveries that we were looking at.  For example, the
overall recovery for lab one was 86 percent with one standard devi-
ation of plus or minus 75 percent.   Bear in mind that on the base
neutrals, we're talking about the addition of 20 to 100 parts per
billion of the standard to the clean water; and although the stand-
ard deviation was quite a bit higher in lab one compared to the
other two; as you can see,  the recoveries were respectable in all
three cases.  For the acid fraction, in this case, laboratories
one and two did much better in percent recovery; again, laboratory
three had better precision.

     The next slide shows the data for the pesticide fraction.

     Mr. Flory;  Are these the same labs -- one, two, and three --
each time?
                                69

-------
     Mr. Kloepfer;   Right, the same labs each time.   Here is some
much more consistent data.  Overall, the percent recoveries were
all in the 70's with not a whole lot of difference in the preci-
sion.   One thing I  noticed in accumulating this data was that the
recoveries were pretty consistent regardless of what amount of
spike was added, whether it was down at the bottom,  at two-tenths
of a part per billion, or whether it was at the top range, which,
in this case, was 33 parts per billion.

     On the volatiles, laboratory number three had recoveries of 100
percent.  They were measuring precision; accuracy was not measured
in that particular case.  They were running standards and simply
determining how much precision there was in that.  You can see that
for the volatiles the overall precision was really good.

     Now, I would like to make some general statements about real
samples.  The data I've given you concerns standard additions to a
clean matrix.  We also have a lot of data on the same type of thing
where the priority pollutants have been added to real influents and
effluents that I didn't have time to prepare, so I'm just going to
make some general statements.  For the extractables, data that I've
seen for the three extractable fractions, in general, recoveries
are 10 to 20 percentage points lower.  For the volatiles, the per-
centage recoveries that I've seen range from 0 to 5 percent lower
than what we see here for the clean matrix.  So, in general, it
appears that this is the type of data that our quality assurance
program will generate; it is a preview of the types of numbers
that we will be looking at.

     Mr. Flory;  Is this GC/MS data?

     Mr. Kloepfer;   Yes, this is all done by GC/MS.

     Mr. Neptune;  Does anybody have any other questions for Bob?

     I only want to say a couple of things here.  Usually when you
pick up something and start to read it, you thumb right past the
introduction.  If you do that with this document, you're going to
miss a heck of a lot.  The introduction contains the assumptions on
which the document is based.  If you miss those, then it's going
to be very difficult to piece together the rationale for some of
the decisions that were made.
     I want to select a couple of those assumptions and focus on
them because they are extremely important.  All of them were dis-
cussed, to a certain extent, yesterday, and I just want to call
attention to several points for special emphasis.  First of all,
when you consider the amount of work involved, you think, in the
sense that you thought about a sample in screening:  "Why do that
when it will apply to so few cases?"  By the same token, if some-
body were to ask me how many priority pollutants there were, and
thus, how many potential fractions might I have to run in verifi-
cation, I couldn't answer the question until some start had been
made in segregating the data along those lines.  But in most of
                                70

-------
the cases that we have looked at so far, it's going to be fewer
than the four that are being run now, plus the associated injec-
tions.  We are using that screening data to focus our efforts
in verification.  Consequently, we're not going to be looking
for 114.  That's why we went to such a great deal of effort to
assure that we didn't have false negatives; we accepted false
positives.  That's particularly important.

     I want to make sure that several other points are not glossed
over.  Because we have no solid information to the contrary, the
waste stream at the subcategory level is considered unique.  What
does that mean?  It means the initial quality assurance and qual-
ity control parameters are established at that level, on that
waste matrix, on a subcategory.  There is another thing which, if
you didn't listen closely to Jim yesterday, you might have missed.
The whole program is broken down into
two major portions, initial and ongoing.  The initial is that which
is done the first time in that industrial category, and done once,
one time; that establishes the control parameters for the rest of
the work being done in that lab, and by other laboratories handling
that same industrial subcategory.  If we do the initial in one
facility in the industrial subcategory, the amount of QA on a per-
centage basis is going to be a lot.  If we do the ongoing program
in a number of facilities, which, in most cases, of course, is what
we'll be doing, the overall amount is diluted by the number of fa-
cilities that are involved in the continuing program.  This is a consensus,
a determined judgment by a number of people, that this is the mini-
mum, and the best approach to provide us with data consistent with
our objectives.

     One last thing:  the control parameters in the initial are
based on the effluent from treatment, and that is particularly im-
portant.  Why was that chosen?  The information that we want most
to be as accurate as possible is that which might ultimately be
useful in some sort of regulatory-type posture, and that's the
effluent from treatment.  That's all that I have.  Any questions?

     Mr. Fahrenthold;  I only have one comment, which is a follow-
up on one of the points that Dean made.  The assumption
that this can be used one time per industrial category, for the
Organic Chemicals Branch, fits only those categories where we've
already finished the work.  So we will not be using that particu-
lar protocol in those categories since we're done with most of the
work.  In the other two categories, which are the pesticides and
organic chemicals categories, the basic assumption that you can go
out and characterize a wastewater matrix with one set of QA/QC pro-
cedures has never been proven.  We will not operate with that basic
assumption until either we have proved it to be true, or learned
that it is false.  We are making the assumption that it, in fact,
is not true and the program is proceeding more or less on that
basis.  This is probably a worthy topic for discussion, so, I'll
bring it up now.  I don't know what the difference would be between
using our kind of protocol for QA/QC with GC/MS versus the way we're
doing it right now.  Dean says that he feels like it would probably
                                71

-------
be much too burdensome, in terms of machine time, to use that kind
of an approach.   So, we have not done that; perhaps the answer to
that is more obvious to you, who are familiar with the techniques,
than it is to me, but it is a major instance of possible difference
and I think a brief comment is justified.   If there are any ques-
tions, maybe we can explore those or investigate the differences
if there's any controversy.

     Mr. Novak;   Yes, the question that I  had posed to Dean, which
I'll pose again so that everybody can hear it, is, what degree of
analytical flexibility is allowed under this protocol?  Is this a
backup to the screening protocol where we  have to use SP 2250, and
1240 DA, or can we, like we did for verification, pick and choose
the best?  Well, the best in my opinion, but okay.

     Mr. Neptune;  Maybe I ought to let Jim answer this question.
The answer is not going to be an easy and  a straightforward one
since this, as the title says, is an addendum to the screening
protocol.  The operating procedures that will be used should, of
course, be based upon the screening protocol.  By the same token,
if those procedures will not allow you to successfully analyze a
given sample, and you have to make a modification or variation to
what is recommended in order to run the sample, and that is the
only way that you can run it, then, of course, that's what you're
going to have to do, utilizing this.  But the thing that is exceed-
ingly important here is that whatever is done, by whoever runs the
initial, has to be disseminated to all of the people that are doing
the continuing program, so that they can use the same operational
parameters and run the sample in a way that gives us a chance of
getting information that is comparable.  Jim, do you want to add
something to that?

     Mr. Lichtenberg;  One of the objectives...yes, Bill.

     Mr. Telliard;  To answer your question, also, some of us don't
know if we can afford the program.  Speaking as a branch chief, I
may choose for verification not to use this.  I may call up Jim
Ryan and say, send me a copy of that mess  you guys did, if I can
afford it.  I don't think the Cadillac option is open to all of us,
particularly in relation to programs that we've already spent X
amount of dollars on.  I don't think Paul  is going to take his
whole program and drop it, turn around and run in a different direc-
tion.  Nor am I, nor probably is John Riley, or any of the other
branch chiefs.  For those three industries that we haven't started,
this will be hell on wheels to get.  But,  no, I think this is an
option.  I don't know how many of us can afford it; what is it going
to cost, what are the trade-offs against other programs?  Bob, what
are you guys doing for QA right now?  Is it all right to ask?  I
asked Athens for their quality assurance data and they told me to
stick it in my ear, so I was just...

     Mr. Kloepfer;  I already showed mine, Bill.

     Mr. Telliard;  That was the whole thing?
                                72

-------
     Mr. Kloepfer;  No, currently what we're doing generally is fol-
lowing the original document that came out of Bellinger's shop where
you run one standard addition, once out of every 20 samples.  We
have not yet implemented this current verification protocol.  There
was a question about procedure awhile ago, and I'd like to bring up
a question and that is, when will the updated screening protocol
be available?

     Mr. Lichtenberg;  As soon as possible.  We are in the process
of updating the GC/MS protocol.  I have made and have typed the
changes that we, in our own shop, feel should be made on the purge
and trap.  They are not very significant.  There's an addition of
a column which will help to do some things better.  The trap was
changed a little bit to retain some of the very volatile compounds,
such as freon and some of the monohalogenated methanes.  So, we're
working on that and we're trying to get a little more feedback,
particularly on the liquid-liquid extraction end of things before
we finalize that.  I told Dean that anything that happens after
this meeting will be after Easter as far as I'm concerned, but,
hopefully, we can do things better than that; somebody else in
the shop can get on it.

     To answer the question that came before, about changing the
columns and so on, what we're trying to do with this program is
to compare data between laboratories as best we can with what we
have provided, and any time you change any of the conditions, or
any of the features or the methods, you're going to create the
possibility of providing poor, non-comparable data.  So, if you
change a gas chromatographic column, you can't take that lightly,
the way I see it.  I'm not saying that you can't change it, but
if you're going to change it, you've got to demonstrate that that
column is going to be at least somewhat comparable to the one for
which you're substituting it.  It's got to have a certain effi-
ciency, or whatever, but just because you've got another column
in the system, you just can't arbitrarily go ahead and change it.
You've got to have some logical basis for making the change.
You've got to maintain the conditions for making a decent inter-
laboratory comparison of your data.  The use of the surrogates
spiked at the same level into every sample, we feel, is going to
give us a pretty good comparison of data between laboratories,
even though the samples may not be exactly the same samples.  They
will be from the same type.  We will be able to judge whether
they're at relatively the same concentration levels, based on the
data that we get back, and so on.  So the possibility of changing
the analytical approach is there, and it's been there from the
beginning.  If you want to change the GC/MS protocol, the option
has been there to contact the EPA laboratory in Cincinnati, pre-
sent your proposed change, and our laboratory will make a judgment
on whether or not it is feasible or possible or desirable.  So the
mechanism for doing this is there, it's been there from the begin-
ning.  There have been relatively few inquiries to us, directly,
or indirectly, on making such changes.  So, where I sit, without
those inquiries, I'm assuming that the method is working pretty
well.
                                73

-------
     Mr. Flory;  Can I make one right now?

     Mr. Lichtenberg;  Yes, maybe it's a little late, but...

     Mr. Flory;  Well, I didn't know it was possible to get them
changed that easily before you made that statement.

     Mr.  Lichtenberg;  I think the provision is stated somewhere
in the original document that if you have such problems, to contact
our laboratory,  and the possibility for changing the protocol is
there, but we don't want people indiscriminately going around chang-
ing it just because they prefer to do it this way, instead of the
way that's in the book.

     Mr.  Flory;   Okay,  now, I'd like to make some comments about
the tuning of the mass spectrometer, if I may.  One of the things
that I noticed in the data that's been presented on precision and
accuracy is, of course, if you put in a labeled compound, the same
one you're quantitating, the results get much better; the lines fit
the regression curves better.  Now, in my mind there are two reasons
for this.  One is, if you take the material through all of the pro-
cedures,  then, that labeled standard duplicates the recovery for
the compound you're handling.  Now, the other reason is that, when
you use a labeled standard like that, you're using a mass in the
mass spectrum that's very close to the compound you're quantitating,
much closer than that mass may be to the mass of the internal stand-
ard, if it's, say, D10-anthracene, or bromochloromethane, et cetera.

     Now, I think one of the things that could greatly increase our
precision in the relative response factors is a different method of
tuning.  The method now says that you inject 20 nanograms of
DFTPP, and then you must get a spectrum out of that GC peak that
matches a reference spectrum that we have.  Now, as anyone knows
who has tried that, if you have eight spectra in your GC peak, you
can find those spectra that vary in relative abundance for a 442
peak from 10 to 200.  Also, if anyone has spent much time tuning,
say, a 3200 mass spectrometer, he knows that by moving pots in-
side, changing ion energies, changing offset programming, and
such things, he can bend the mass spectrum around and make it
look like anything he wants.  So, I contend that by checking
the tuning of the mass spectrometer by looking at a spectrum on a
GC peak,  one can be very far off in what the relative abundance of
the mass spectrum looks like for a reference compound.  The best
way to do this tuning, and the way that it's always been done be-
fore this procedure came out, is with the batch inlet; and one
should have a batch inlet, one should put the compound in at a
constant concentration, tune the mass spectrometer, and record
the spectrum, and I'm sure relative response data will get better
if that is done.

     Now, one other thing you can consider doing in addition to
that to make the quantitation better  is to put the reference com-
pound into every analysis and make mass discrimination corrections
before you do the quantitation.
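
     The key-ion test itself amounts to a table-driven check of
relative abundances against the m/z 198 base peak.  A minimal
sketch; the two criteria entries shown are illustrative only, not a
transcription of the protocol's table, and the 443 criterion is
simplified (officially it is judged against mass 442).

     # m/z -> (minimum %, maximum %) relative to the m/z 198 base peak.
     CRITERIA = {
         442: (40.0, None),   # illustrative:  greater than 40% of 198
         443: (17.0, 23.0),   # simplified; officially relative to 442
     }

     def check_dftpp(spectrum):
         """spectrum:  dict of m/z -> abundance.  Returns failed ions."""
         base = spectrum[198]
         failures = []
         for mz, (low, high) in CRITERIA.items():
             rel = 100.0 * spectrum.get(mz, 0.0) / base
             if (low is not None and rel < low) or \
                (high is not None and rel > high):
                 failures.append((mz, round(rel, 1)))
         return failures

     print(check_dftpp({198: 1000.0, 442: 350.0, 443: 80.0}))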
                                74

-------
     Mr. Lichtenberg;  Bob, will you comment?

     Mr. Haile;   I have a comment concerning the gentleman's com-
ment.  We do a couple of things to ensure that we're within the
spectrum quality requirements as published in the protocol.  Num-
ber one, we don't like to waste runs, so we don't normally run our
DFTPP until we're fairly sure we have the balance fairly well in
tune.  We do our primary tuning with FC 43 from a reference
inlet, and we have gained enough experience over the years that
we can tune on the FC 43 and be fairly well assured that we're
going to get the good DFTPP spectrum.

     There are a couple of hazards with looking only at samples
from a batch inlet, in that your source pressure may be slightly
different, and that can perturb the balance of the spectrum
also.  In terms of actually looking at the GC peaks, we prefer,
since we realize that a mass spectrometer is not an instantaneous
detector, to average the four, five, or six scans, or whatever
it takes to enclose the DFTPP peak, and get a better look at the
balance that we're getting on the DFTPP.

     Mr. Flory;  Let me answer the first objection that you brought
forth there.  When you tune with DFTPP, or anything else in a batch
inlet, you should set the levels of pressure in the ion source to
somewhere in the middle of the quantitation region that you're
going to do, and then you won't have that problem.  When we tune
with the batch inlet, if we want to be doing 10 to 100 parts per
billion, then we set the 198 peak to be about what it is,
and when a 50-part-per-billion sample comes through the GC, we
don't have that problem.

     Mr. Haile;  I think that may be difficult to achieve
in practice.  It's fairly easy for us to make the correlation
between the spectrum we see of the FC 43 and the spectrum that we
expect to see from the DFTPP.  We're using the DFTPP simply as a
check.  My only real objection is that I think you should be care-
ful about tuning purely on a batch inlet without looking at the spec-
trum you get off of a GC run.

     Mr. Flory:  I don't understand that because the best way to
tell how the mass spectrometer is tuned is with a batch inlet, and
I don't think you can question that.


     Mr. Haile:  It may be the best way to tell how the mass spec-
trometer is tuned for the batch inlet, but it may not be telling
you what's coming off your GC.

     Mr. Flory;  You can't change the relative abundance in the
spectrum by hooking a GC to the mass spectrometer.

     Mr. Kloepfer;  I want to point out one thing.  In this adden-
dum, the requirement is for 50 nanograms of DFTPP to meet the re-
quirements, rather than 20.  So that has been increased to make the
task somewhat easier; it's now 50 nanograms, rather than 20.
                                75

-------
     Mr. Flory;  Is that a change we can recognize in screening
work, we now have to get the spectrum at 50 nanograms?

     Mr. Kloepfer;  It's acceptable for verification, if you follow
this protocol.

     Mr. Flory;  No, what I meant is, from now on in screening,
will that be a sufficient amount to put into...

     Mr. Kloepfer;  Yes.

     Mr. Colby;  I'd just like to make another comment about DFTPP
in general.  We have, incidentally, a magnetic instrument and so
we don't have quite as many tuning parameters to modify the spec-
trum, let's say, to what is required.  When one  thinks about any
mass spectrum in general, there are a lot of different things
which can get involved in modifying the abundance pattern that is
observed.  One of the most sensitive turns out to be the ion source
temperature.  DFTPP is tremendously sensitive to this parameter,
and we can produce practically any kind of spectrum we want from
that compound, in terms of the fragments that are involved in the
protocol, just by increasing or decreasing the ion source tempera-
ture.  And we find with molecules like FC 43 that that's not nearly
so much the case.  We find just as little sensitivity to it in some
other cases.  If we use what we feel is a reasonable temperature
in the ion source, the 442 peak, in our instrument, is just almost
always 200 percent of 198, and the only way we can get it down is
to jack up the ion source temperature so that we can kill all of
those nice high mass ions that are so useful to  identify things.

     Mr. Flory;  That is another problem; the DFTPP spectrum that
we're tuning to is, in my opinion, not a real mass spectrum.

     Mr. Kloepfer;  I don't think we care how you do it, just so
you meet the criteria.

     Mr. Stauffer;  I just want to say regarding tuning from a
batch compound that if you have a Finnigan 4000, the instrument is
very sensitive to the direction from which the sample is coming
into the ion source, and if you tune on a 4000 with FC 43 and then
look at the peak shapes of the column effluent,  you see that they're
grossly distorted.  You really have to tune on material coming from
the column, and what we've done for meeting the  tuning criteria
is to jack the column temperature up, tune on column bleedoff,
then shoot the DFTPP to see what comes out.  We  can usually judge
ahead of time whether or not the pattern is going to match, but
we are thinking about using the equivalent of an FC 43 inlet in
the GC line, controlled by a valve or something  like that, and
putting some DFTPP in that, where we can tune directly on it, but
in a steady-state flow, and with the right direc-
tion into the source for that particular mass spectrometer.

     Mr. Flory;  All I'm trying to say is the best way to define
how the mass spectrometer is performing is by a  batch inlet if you
                                76

-------
want to get it the same every day, and it doesn't make any differ-
ence whether you come in the right or left side as long as you
always do it the same way.

     Mr. Stauffer;  It does on the 4000.

     Mr. Flory;  No, no, what I'm saying is if you want to get the
mass spectrometer set up, okay, today, just like yesterday, next
year, just like last year, okay.  We're putting it in a batch
inlet at a constant concentration instead of through a GC.  This
is the best way to define how the mass spectrometer is set up.  I
don't care if the mass spectrum is then different when it comes
in through the GC.

     Mr. Stauffer;  I agree with you, for most machines, but on a
4000 you can tune to what appears to be an identical spectrum on
one day versus another; then look at column effluent coming off
and find that you may have distorted triangular-shaped peaks in
which you're measuring the abundance with the data system on that
sloped side, rather than on the top of the...

     Mr. Flory:  You're talking about the actual peak shape.

     Mr. Stauffer:  The actual peak shape will vary substantially,
depending on where the sample is coming from.

     Mr. Lichtenberg:  That's what he's saying.

     Mr. Flory:  That is right.

     Mr. Lichtenberg:  We could talk about this all day.

     Mr. Stauffer:  I'm just saying this as a caution.

     Mr. Lichtenberg:  Well, thank you very much.  One more comment
on this particular subject matter, if that's what you have, and then
we'll cut it.

     Mr. Henderson:  I'd like to point out that I think the most
serious drawback to the use of DFTPP, as described in the protocol,
is something we've touched on indirectly already here, and that is
the selection of the spectrum.  There are no requirements outlined
in the protocol.  One of the problems is, we have some people like
Midwest and, in fact, our own laboratory, who take an average,
some people who select the apical spectrum, and a number of people
who will try first one, and then the other; and if that doesn't
work, as we've heard here, some people will select the spectrum of
the leading edge; if that doesn't work, they'll try one on the
trailing edge.  The idea is to introduce some sort of uniformity
in this data for future work.  That's one of the most important
reasons for using DFTPP.  Now, in the future, as analysts look for
new compounds, almost invariably they are going to be selecting
the apical spectrum, subtracting the background, the leading back-
ground spectrum, and I seriously recommend that you consider some
sort of requirement, some sort of additional requirement along
those lines.  That is, perhaps, selecting a single spectrum, not
an average, within plus or minus 1 mass spectrum of the apex of
the 198 peak.   I think that would help in the uniformity because,
as I say, I know that some analysts will try first this spectrum,
and then that spectrum, maybe half-way down the peak.  The idea
is simply to meet the requirements and not to actually meet the
spirit of the requirements.
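
     (For illustration:  a minimal sketch, in Python, of the
selection rule being suggested:  a single scan within plus or minus
one scan of the apex of the m/z 198 profile, less a leading
background scan.  The function and data layout are hypothetical.)

    def select_dftpp_spectrum(scans, offset=0):
        """scans: list of dicts (m/z -> abundance) across the DFTPP peak;
        offset: -1, 0, or +1 scans from the apex of the m/z 198 trace."""
        assert abs(offset) <= 1, "stay within one scan of the apex"
        apex = max(range(len(scans)), key=lambda i: scans[i].get(198, 0.0))
        chosen = scans[min(max(apex + offset, 0), len(scans) - 1)]
        background = scans[0]          # a leading, pre-peak background scan
        return {mz: max(0.0, ab - background.get(mz, 0.0))
                for mz, ab in chosen.items()}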

     Mr. Lichtenberg:  Thank you for that comment.  I would appre-
ciate it if you would get your feelings to Bill Budde.  He's the
man responsible for introducing this, and he has some pretty
opinionated ideas on the use of it himself.  I think that maybe
some of the questions that are being raised here might be solved
by looking through the reference and applying that.  We did not
include all of that in the initial analytical protocol.  We only
provided the table with the reference cited and it's not cited
in this particular document that we're talking about today, but
it is cited in the original protocol.

     Mr. Kagel:  One more quick one, Jim.

     Mr. Lichtenberg:  Okay.

     Mr. Kagel:  I'd like to suggest a couple of additions to the
validation protocol for purposes of clarification.  I'd like to
see some definition of limits of detection for the methodology, in
terms of signal to noise, either RMS or peak to peak, and some
definition of limits of determination, and also some guidelines
for reporting data, such as "not detected," with a parenthetical
detection limit, versus "less than."  One other question:  did you
investigate spikes near one X, rather than two X?
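
     (For illustration:  a minimal sketch, in Python, of the kind of
definition being asked for.  The factor of three is a conventional
choice and an assumption here; the protocol does not specify one.)

    import math

    def detection_limit(blank_signal, response_per_ppb, factor=3.0, rms=True):
        """blank_signal: list of baseline readings near the analyte's
        retention time; response_per_ppb: calibration slope."""
        mean = sum(blank_signal) / len(blank_signal)
        if rms:
            noise = math.sqrt(sum((x - mean) ** 2 for x in blank_signal)
                              / len(blank_signal))
        else:
            noise = max(blank_signal) - min(blank_signal)   # peak to peak
        return factor * noise / response_per_ppb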

     Mr. Lichtenberg:  I, personally, have not; I'm not sure about
some of the other people here that have been working directly with
the method.

     Mr. Kagel:  I'm looking for a spike near the detection limit.
I'm still concerned about determinations near detection limits,
where the errors get horrendous; a spike near one X may reflect that
a little better than two X.

     Mr. Lichtenberg:  Does anybody here have a comment to his
last remark?

     Mr. Colby:  I'd just like to mention that Bob so freely stated
that he didn't care how one got the spectrum for DFTPP, and then we
find that there are lots of different ways to get the spectrum, like
taking off the front or the back side, or whatever you need to do.
I don't think that kind of a statement is at all justified.

     Mr. Kloepfer:  Well, I don't think you should take the spectrum
off of the leading edge, or the tail-end edge of the peak, I mean,
certainly that isn't the way to do it, and, as Jim said,  if you  go
back to the original document that appeared in Analytical Chemistry,
I don't think that was the intention of that particular compound.

     Mr. Neptune:  Ron, in answer to a portion of your last question
there:  the materials that we have been observing most closely to
date, the presence of which we would like to verify, and upon which
we would probably be most interested in getting good quantitation
information are primarily not those priority pollutants that are
hovering around the limit of detection, and consequently, a great
deal of concern about the larger variation that is possible operat-
ing in that region, while it is a valid concern, is not going to be
appropriate for most of the things we're looking at.  If it is, if
we should be operating down in that area, of course, we'll have to
give further consideration to looking more carefully and trying to
approach, very carefully, those particularly low quantitations with
appropriate procedures that take into account the wider variation
that one can expect.

     One of the other things I want to mention before we leave is in
reference to something that Bill Telliard said here a few minutes
ago about what is affordable.  Now, I want to put things in a proper
perspective.  Just as I indicated to you, one of our assumptions was
that the waste stream is unique at the subcategory level, and with
very good reason, I still feel that depending upon the regulatory
strategy and the approach that is being considered by each project
officer, the needs and requirements are going to vary.  Obviously,
if, through screening, you have found very little indication of
priority pollutants being present, or, if present, only so indi-
cating at what appears to be extremely low levels, and your regu-
latory approach is paragraph eight, the utility of this program
or any of the other programs that you have heard described here is
going to be very limited.

     So, it really depends on what your objectives are, what you
need to do, what you're trying to do, what your regulatory strategy
is going to be.  And, that's going to vary, it's going to vary
based upon the uniqueness of that industrial subcategory.

     Mr. Haile:  If I may make one final comment concerning that
particular question.  It may be that the organic chemicals people,
indeed, have a special case.  A protocol as you've designed it here
assumes some degree of similarity from plant to plant throughout
the industrial category, which does not appear to be the case in
such a mixed bag of organic chemicals.  Also, I'm not sure, but
I understand that the organic chemicals plant may be the only
industrial category that's going up the stream into the process
lines, but certainly it gives you a mixed bag from plant to plant.
If you're trying to do a lot of front-end validation work, it may
be less applicable to the sort of methods, the mumbo-jumbo that
sometimes you have to get into, to handle some of the extremes
in that particular branch.  Am I correct, Paul?

     Mr. Colby:  Thank you, Clarence.  Dean is going to respond to
you, okay, with some more detailed comments than I can make.
     Dr. Neptune:  Just as I was describing here a few minutes ago,
whatever is being done, is being done based upon a strategy, a plan.
That's exactly why what the Organic Chemicals Branch is doing and
what Paul described to us yesterday is somewhat different from that
which is being done in some of the other areas.  It's not to say
that what Paul is doing is not being picked up and used by someone
else, but it happens to be the particular regulatory strategy that
best meets the objectives for the Organic Chemicals Branch, and
exactly as you pointed out, the potential variability, at a level
even lower than the industrial subcategory, is why Paul needs the
flexibility he has tried to bring into his program.  Do you want
to add anything to that, Paul?

     Mr. Fahrenthold:  I think the basic difference is not only
in strategy, which you see, but it's inherent in the nature of the
industry.  I don't know about all industries.  For example, I don't
know how different paper mills are, okay, but Roger's protocol is
developed for that.  It may very well be that to a certain extent,
all the waste streams are different and we're just arguing about
matters of degree, which is probably the most likely circumstance.
In pesticides, I know that their waste streams are much different.
In organics, because of the structure of the industry, you don't
have one plant, one product, one discharge, one treatment plant
type scenarios, and for that reason we have had to build in the
kind of flexibility that doesn't seem to fit with the GC/MS type
protocol.

     Mr. Dellinger:  We've heard in discussing this particular
QA/QC program that the minimum requirement is that it's going to
have to be implemented once per subcategory.  Well, an opposite
approach to that would be that the maximum is going to mean it's
going to have to be done once per plant and as project officers
we keep saying that we have to have a plan and a strategy and this
type of thing; well, a lot of that strategy and plan is dependent
upon cost.  What we need, from the project officer's standpoint, is
the most probable number that it's going to be, so we can base our
budgets and our decisions on that.  It becomes a very difficult
situation if you make your original budget on a once per category
basis and get halfway down the road to suddenly find out that the
entire procedure is going to have to be followed for each plant.
I would just suggest that as information is developed on how often
it is required to, let's say, increase the amount of QC/QA, that
this most current information is passed immediately to the Project
Officers so that we can use it in making up our budgets and doing
these plans that we keep talking about.


     Mr. Telliard:  Thank you.

     Mr. Henderson:  I have one more comment.  I think one thing,
it seems to me, that we are taking lightly is the one-time valida-
tion study at the beginning of the project.  But it's for each sub-
category, so you might have to do it 20-odd times, isn't that right?

     Mr. Telliard:  387, I believe, subcategories.

     Mr. Henderson:  So, Walt's is significant, but 20 times is
something that really needs to be considered.  I'm afraid it's
being passed over here; that's my problem.

     Mr. Kloepfer:  Yes, but remember that would be 12 samples
analyzed for only a selected number of priority pollutants and
not necessarily all fractions.

     Mr. Henderson:  Well, that's 247 on a contract; that's before
you start a contract maybe of 200 samples.  That's not going to be
that bad, but it could be, in the worst case.

     Mr. Lichtenberg:  All I wanted to do was to restate the fact
that we are preparing a revision of the GC/MS protocol.  It's in-
tended for use in the verification phase here, and it's also in-
tended for application in the NPDES program at a later time, as an
option to the individual methods that Jim Longbottom was referring
to yesterday.  So we're not shutting the door on mass spec just
because we're providing individual methods that will very easily,
and at less cost, meet specific needs.  We're not shutting the door
on the overall GC/MS approach.

     Mr. Telliard:  Thanks, folks.  When all is said and done, we're
still multiplying a flow of plus or minus 20 percent and a little
sampling error of 100 percent, and it's what we're going to do with
this data that counts.  That's why everyone will have, quote, his
approach, unquote, to handling the individual industrial categories.
I think a lot of the data is still lost.  The fact that we're still
trying to generate engineering data for evaluation and determination
of treatment efficiencies is an indication that we still have the
tail wagging the dog.  So, it's our purpose to come up with a defini-
tion for engineering design and treatment, and looking for a femto-
gram when you're multiplying by plus or minus 20 percent is not
always cost-effective.

     Next on the program, just to show you that we're not cost-
effective, is a study that we're doing to disprove everything I
just said.  Gail Goldberg is going to talk about our precision and
accuracy work.  Gail, why don't you come up here and I'll get out
of your way?
                       PRECISION AND ACCURACY
                            G.  Goldberg
                               USEPA
     Ms. Goldberg:  The study that we're undertaking now deals
with the data we've already generated using the screening proto-
col and screening methods for most of the industrial categories.
As you know, time didn't permit us to implement precision and
accuracy studies in many of the first screening studies done on
BAT industries.  The question then comes up, "How reliable is
the screening data on organics determined by the GC/MS protocol?".
With that question in mind, we developed a study to answer the
question.  Keep in mind, the purpose of the study is that we're
trying to evaluate the data developed by the screening proto-
col.  By that I mean we're using the same method, the same col-
umns, the same amount of time to collect the sample that we did
when most people did the screening phase work.   We're going to
obtain data indicating the precision and accuracy capabilities
per industrial category in the matrix of that waste water sample.
Another purpose is to support the eventual defense of the data
when we're trying to make decisions, such as Paragraph Eight type
decisions where you take screening data and you decide whether or
not you're going to continue with the verification phase.

     The other thing to keep in mind is what this small study is
doing.  It's not a total method validation, it's just an indica-
tion of what we can do in a single lab, an indication of the level
of confidence in the data that we've developed so far.  The actual
outline of the work is that samples from each of the 21 primary
industries and POTW's are being collected.  We're having teams
go out and collect one set of samples per industrial site.  When
they're out in the field they're taking influent to treatment, and
effluent to treatment.  That is all we're looking at in this study.
As I said, when they take the sample, the intention is for them to
take the sample in the same way, and as much as possible to dupli-
cate whatever effort we used initially to collect the samples.  So
if the samples for the coal industry were mostly grab samples, when
we collected a precision and accuracy sample, they'll take, again,
a grab sample.  If they were taken at 24, 48, 72 hours, whatever
was taken originally, they would tend to duplicate the collection
of the sample.  Only this time, we're collecting ten gallons in-
fluent to treatment, ten gallons effluent to treatment, plus VOA's
and associated blanks on equipment if they're used.  One other
thing we're going to be measuring on these samples is TOC and COD.
Each of the ten gallon samples will be divided into ten aliquots.
Each aliquot will be analyzed only for those fractions that we're
interested in.  In other words, it may be that there were no pesti-
cides found in most of the screening data for a particular industry,
so that when we look at these samples for precision and accuracy,
we're not looking at the pesticide fraction.  Of the ten aliquots
that we're looking at, we're spiking at four different levels and
we're doing each analysis and extraction in duplicate.
     Two of those aliquots are not spiked with priority pollutants,
and two each are spiked at four different levels,  and we're trying
to cover the range of concentration that we found in the original
screening data.  All of the samples are being spiked with sur-
rogates to get some indication of quality control; so that even
the unspiked sample, with priority pollutants, will have surro-
gates in it.

     In all cases, we're analyzing for up to 50 priority pollu-
tants.  In most cases, we haven't found much more than 50.  The
decision on what 50 priority pollutants we'll look at in the pre-
cision and accuracy study will be based on the screening data that
we have available and discussions with the project officer as to
what seems to be of concern and what we want to check out.  There's
some flexibility and discussion on what the needs are in selecting
those 50 compounds.  Then, EPA selects the compounds and the levels
at which the lab will be directed to spike.  As I said,  we're using
the surrogate spikes as a quality control measure, not for quanti-
tation in this case.  We're planning to do up to five surrogate
spikes per fraction at 100 parts per billion.
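
     (For illustration:  a minimal sketch, in Python, of the aliquot
plan just described.  The spike levels shown are placeholders, since
EPA selects the actual compounds and concentrations.)

    def build_aliquot_plan(spike_levels_ppb=(5, 50, 250, 1000)):
        assert len(spike_levels_ppb) == 4
        plan = [{"aliquot": 1, "spike_ppb": None},    # unspiked
                {"aliquot": 2, "spike_ppb": None}]    # unspiked duplicate
        n = 3
        for level in spike_levels_ppb:
            for _ in range(2):              # extraction/analysis in duplicate
                plan.append({"aliquot": n, "spike_ppb": level})
                n += 1
        for aliquot in plan:                # surrogates in every aliquot,
            aliquot["surrogates_ppb"] = 100 # up to five per fraction
        return plan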

      As for the time schedule on this study, we've already started
with two industries to date.  We started at the end of last month
(February, 1979).  If we can get all of the sampling sites sched-
uled, we can do up to three industries a month.  This is our goal,
so that this kind of information will be available to the project
officers when they make decisions.  I know that many of them are
making their decisions now.  Another thing that the study provides
is fairly good turn-around time.  Once a sample set is received
by the lab, we should receive the data within 30 days, plus 10
mailing days.  The data is reported on a per industry basis and
the data will include -- on the data sheets we've developed -- the
measured values, the percent recovery (to give us an indication
of accuracy) and values obtained from duplicate analyses  (to give
us an indication of the reproducibility of the method and the lab
precision).  The study will provide all of this information --
the precision and accuracy, per parameter, per industrial category
-- and it will also have additional data on each fraction for the
surrogates that are added to the samples.
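
     (For illustration:  a minimal sketch, in Python, of the two
quantities the data sheets report.  Treating the unspiked aliquots
as the background estimate is an assumption of this sketch.)

    def percent_recovery(measured_ppb, background_ppb, spiked_ppb):
        """Recovery of the spike, an indication of accuracy."""
        return 100.0 * (measured_ppb - background_ppb) / spiked_ppb

    def duplicate_difference(a_ppb, b_ppb):
        """Relative percent difference of a duplicate pair, an
        indication of single-laboratory precision."""
        return 100.0 * abs(a_ppb - b_ppb) / ((a_ppb + b_ppb) / 2.0)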

     One of the things that might not be clear, because of the prev-
ious conversation, is how the study is different
from the verification work that's planned.  In this precision and
accuracy study, we're not changing the column conditions.  We're
using the SP 2250 and the Tenax that were used to produce most of
the screening data for the Effluent Guidelines Division.  As I said
before, we're using the same methods to collect the sample, and
duplicate the kind of conditions that were used to collect the sam-
ple and analyze the sample.  This includes such things as whether
a separatory funnel was used to do the extraction, or a continuous
liquid-liquid extractor, whatever was used in the screening
phase.

     Another major difference between the precision and accuracy
study and the verification procedures that were discussed is that


we're not doing a background analysis before we spike.  We're spik-
ing immediately (before extraction) and we're spiking based on the
best available data that a project officer has from his screening
data.  In other words, I discuss with the project officer the range
that these particular priority pollutants were found in (high/low)
and what the most frequent concentration level was.

     Another question of interest is why do we need to know the
reliability of the screening data if we're going to be doing a
verification phase.  It is important to keep in mind that the
screening phase focuses the work in the verification phase.  If
data in the screening phase is not clear, or there are any false
negatives, then it may not be possible to check it in verification.
This is an attempt to correct that situation.  Also, the timing on
this, hopefully, will be such that this information will be avail-
able before a lot of the verification work is done.  Are there any
questions?

     Mr. Haile:  I'd like to start off by saying I feel this sort
of program is something that could be very beneficial to the out-
come of these studies.  I do have a couple of reservations, how-
ever.  We've been involved in getting ready to replicate some of
the analyses for the tanning industry, which is one of the two in-
dustrial categories that Gail has mentioned.  What really bothers
me about the program as it's progressed thus far is the possibil-
ity, and the probability, actually, of doing blind spiking.  In
the particular situation we have in the tanning industry, the vali-
dation sampling is being conducted on a plant that we have not had
a chance to look at before; it's not one of the plants that we
screened or verified.  Although our tanneries are not quite all
alike, we really don't know what the wastewaters are going to be
like out of this particular tannery.  I'm somewhat concerned that
we get at least two or three good data points out of the four
levels of spikes,  since, for many of the compounds we found in the
tannery industry,  the concentrations ranged somewhat drastically.
In particular,  concentrations of phenol in the tanning waters that
we looked at ranged from about 25 parts per billion -- I think that was
the lowest detected level -- up to about 25 parts per million.  We
really don't know what we're going to find in these tannery waters
and we may end up with no good data points for that particular
compound.

     Ms. Goldberg:  Clarence, I know we've talked about this, and
I recognize your concerns.  One of the things I'd like to clarify
is that it's not entirely a blind spike.  I think that the effort
was made to use the data produced before, and although this partic-
ular tanning industry wasn't part of that data base, we used a
wide enough range when we did the spiking that I think we can en-
compass what we're most likely to find.  For example, we spiked on
the lower level from 5 to 250 parts per billion, and for some of
the phenols, we were spiking in the 1000 parts per billion range,
which should probably cover what exists at that facility.  The
other thing is, there is a limit in this study.  We're only doing
one sample per industry, which is limiting in and of itself.  All
I can say is that when we come out with this data, it should be used
carefully just to say that we're getting an indication of how the
screening method worked for a particular sampling matrix.

     Mr. Haile:  I feel the communication that we are getting, or
would hope to get, out of this is that the importance of that
indication should not be diminished.  It's not to reflect on how
the data for that entire industrial category is received.  It's
very laudable to try to obtain that information, although if we're
trying to make sure we're getting an indication, we should get as
accurate an indication as possible, especially with the tanning
industry, since we're looking at a plant that we have not looked
at before.  It's good to look at the data from the 18, or whatever
plants that we've seen.  I'm not sure that the average, or the
range, adequately represents what we might find in the particular
tanning plant that we are doing the verification work on.  I would
feel and would have felt a lot more comfortable, had at least we
gone back to a plant that we had looked at before.  There are a
lot of dangers there and that certainly should be recognized.

     Mr. Cowen:  I noticed that you mentioned that you are going
to be analyzing TOC and COD.  I wanted to address a question to
Jim Lichtenberg.  When you start getting data back in relation to
your protocol, when you said something about how you were going to
check between the different plants for a given industrial category,
I was wondering if there was any thought to putting some sort of a
measurement on the organic matrix, or how high the organic strength
is in the waste when you're relating that.  Are you just going to
be looking at what are the priority pollutant concentrations and
the pieces of data that you're going to be comparing?  Maybe I
should restate the question, but, Gail has mentioned that she's
going to be looking at TOC, COD, to give her some idea of compara-
bility between the various plants.

     Mr. Lichtenberg:  Well, the concentration levels and their
ranges involved are going to be defined by the procedures that
we're following.  Based on that, we will be comparing data of
comparable concentrations from one place, or location, to another.

     Mr. Cowen:  That's the concentrations of priority pollutants,
then, you're talking about?

     Mr. Lichtenberg:  And the surrogates.  The concentration of
surrogate will be the same throughout, in every sample.

     Mr. Cowen:  What I was wondering is whether or not you are
going to do any measurement on the wastewaters that were used —
the effluent from treatment, now, we're talking about — as far
as gross organic content?

     Mr. Lichtenberg:  You mean the TOC and so on?

     Mr. Cowen:  Yes.

     Mr. Lichtenberg:  No, we weren't planning to do that, no.
     Mr. Cowen:  I was just wondering.  It was a thought when I
heard that she was going to do that.

     Mr. Lichtenberg:  I'm sorry, I guess I missed the first point
of your question.  No, we're not planning to do that in this pro-
gram.

     Mr. Telliard:  I think you're getting at the fact that we
might find that maybe all biological effluents look alike,  or
something like that.

     Mr. Cowen:  I was wondering if there will be any reason to
measure gross organic content in your program?

     Mr. Telliard:  In Lichtenberg's program?

     Mr. Cowen:  Right, I'm really addressing his program now.
I'm sorry to go back, but it's just a thought.  You said you were
going to be comparing data in various places and that maybe it
would be nice to have some measure of the difference in the efflu-
ent from an organic standpoint.

     Mr. Lichtenberg:  If that's directly correlatable to the
specific questions we have, perhaps that would be true.  Maybe it
would be.  There is a real potential for so-called surrogate
parameters like TOC, TOCl, or whatever, becoming implemented into
the overall program.  If that is the case, and procedures are
accepted and implemented into the program, then we would be doing
accuracy/precision on those parameters as well.  Right now, for
the purpose that we're discussing this whole thing, I don't see it.

     Mr. Henderson:  I must say I'm a little pessimistic this morn-
ing about the data from this study after seeing Bob Kloepfer's
data on good water.  He's getting recoveries of 50 percent, plus
or minus 50 percent; 100 percent, plus or minus 60 percent.  It
makes the possibility of any kind of meaningful correlations remote.

     Mr. Telliard:  That's why we're doing it.  We'd like to know.

     Mr. Kloepfer:  I would like to make another point about the
data that I presented.  Keep in mind that that data is from three
different laboratories with varying amounts of experience doing
that type of work.

     Mr. Hall:  Some of those samples were not analyzed for a con-
siderable amount of time.  Is that being considered here, because
you say there is going to be sample turnaround of 30 days?  Do you
think that's going to have an effect on correlating that precision
and accuracy?

     The other thing is that, as an analytical chemist, one of the
problems in precision and accuracy is operator to operator varia-
bility.  Is this going to be single operator precision and accuracy
and how is that assured in the contract?
     Mr. Telliard:  The answer to your question is multiple opera-
tor; and, no, if the sample originally was really done early on
in some of the industries where the samples didn't get analyzed
for five to eight weeks,  we're not going to hold them for five to
eight weeks and run them.  We're trying to find out, basically,
in all of the industries  that we have run, or will run,  what the
general precision/accuracy of the screening protocol is; not the
verification methodology, but just the screening protocol.  This
is something that everybody was going to do, in their individual
studies.  We did it in petroleum, but in some of the industries,
we didn't do it.  This is just to have one shot at covering all
of them.

     Mr. Bastian:  I'm Bruce Bastian with Shell.  I'd like to fol-
low up on a comment that  was made just a little bit earlier about
the different qualities of the laboratories that would participate.
I think that it is very important in regard to your on-going study
that the data that has been accumulated already has been accumu-
lated by a series of laboratories of different abilities, back-
grounds, and qualities.  That data is being used in developing ef-
fluent guidelines, and so an indication of the existing data base,
and the quality of that, would be more important and to the point
for the on-going process of developing guidelines than an after-
the-fact definition of what one particular laboratory is doing.  In
order to determine this,  it would seem that inter-laboratory var-
iability, and reproducibility, would be a most important thing to
determine.  My question is, to what extent are you addressing that
problem?

     Mr. Telliard:  On screening, or on verification?

     Mr. Bastian:  Bill,  I'm referring right now to the current
study that Gail described.  If it is to be related back to any
previous quality of data, then one would have to be looking at
different laboratories.

     Mr. Telliard:  Right, but we're not.  That's all we have, one
lab.

     Ms. Goldberg:  Yes, but the one lab that was chosen was the
lab that produced the greatest proportion of the data for the Divis-
ion.  In other words, this particular laboratory had done over 800
samples, and, we felt, had the most extensive experience of any
of the  labs that we have under contract.

     Mr. Telliard:  Thank you.  We have an addition to the program that
I didn't pencil in.  George Stanko, from Shell Development, is go-
ing to have a few words.
                           G. Stanko
                       Shell Development


     Mr. Stanko:  Before I start, I'd like to say it's very refresh-
ing to hear some EPA project officers are starting to sound like
industrial management and are concerned with the particular costs
of these analyses.

     This morning I'd like to share with you some of the observa-
tions we made on inter-laboratory reproducibility of GC/MS analyses
with respect to priority pollutants.  We feel that it is extremely
important in the development of numerical effluent guidelines,
since it is the reliability of this data base which relates to a
given industrial process, or category.  Stated another way, the
numerical effluent guidelines cannot be more restrictive than the
reliability of the analytical methods.  Our concern in this area
was prompted when data obtained by our own laboratory did not
resemble the data that was generated by the Agency.

     These data showed that the agency found 47 priority pollutants
in one particular location.  Our analyses showed 9 priority pollu-
tants to be present.  If one compares the data, there was only
agreement between the two sets of data on one particular priority
pollutant quantitatively.  That seems to be a considerable number
of false positives.

     Mr. Flory:  Was that a split sample, George?

     Mr. Stanko:  Yes.  Not being able to explain these differences
to our management, we felt it was necessary to have samples of ac-
tual wastewater streams analyzed by a number of labs so that we
could evaluate the data from a statistical point of view.  We
wanted to see where agreement between labs, reputable labs, I
should say, was good enough to warrant its use quantitatively.  We
also wanted to see where its use could not be justified.  I would
like to caution you that the data that will be discussed is only
for the organic chemicals industrial category.  It may not apply
to other categories.  I would also like to caution you that the
study is incomplete at this time, and that the data base that
we have is very limited.  We hope to expand this.

     At this time, we have data from two particular studies, and
these studies are complete.  One study involved three laboratories
and two sampling points.  The second study involved two laboratories
and two sampling points.  In these studies, we controlled the samp-
ling variable as much as possible.  Only grab samples were used,
and both studies included spiking of the samples with deuterated
compounds in the field.  On one of these studies we elected to
spike, in the field, with priority pollutants.  There was a limited
number involved.  In one study, all of the samples were analyzed
within 14 days; therefore, sample degradation variables were re-
duced.  In the second study, the analysis schedule was precisely
identical for both laboratories down to an hour.  Two different
                                88

-------
chemical plants were involved in study one and study two.  I will
not waste your time showing you the raw data.  It would look like
little spots on the wall.  Instead I'll show you our statistical
evaluation of these data.

     The data that we generated was evaluated statistically fol-
lowing a manual by Youden and Steiner, called Statistical Manual
of the Association of Official Analytical Chemists.  The specific
procedure that we followed was called Analysis of Variance, and
it's described on Pages 77 and 78 of this particular manual.
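
     (For illustration:  a minimal sketch, in Python, of the paired-
laboratory statistics being described.  This condensed two-laboratory
form is an assumption, not a transcription of the Youden and Steiner
procedure.)

    import math

    def pair_statistics(lab1_ppb, lab2_ppb):
        """Matched lists of results for the same compounds.  Returns the
        average concentration and the one-sigma between-lab deviation."""
        n = len(lab1_ppb)
        diffs = [a - b for a, b in zip(lab1_ppb, lab2_ppb)]
        mean_conc = (sum(lab1_ppb) + sum(lab2_ppb)) / (2.0 * n)
        mean_diff = sum(diffs) / n
        sd_diff = math.sqrt(sum((d - mean_diff) ** 2 for d in diffs)
                            / (n - 1))
        return mean_conc, sd_diff / math.sqrt(2.0)  # per-measurement sigma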

     This table describes the statistical evaluation of the two
studies that I just alluded to.  Not having done this previously,
we had no guide to tell us where we should draw the lines.  We
elected to draw lines from zero to 100 parts per billion, from
100 to 1000 parts per billion, and then a range that we define as
greater than 1000.  The reason for doing this was that we thought
that the analytical variability would probably be different in
each one of these ranges, and that as the concentration got higher,
analytical reliability would probably be much better.  We go to
the first set, the greater than 1000.  In the first study, we had
four data pairs, the average concentration of the four data pairs
was 4050 parts per billion, the one sigma, standard deviation for
the 4 pairs was plus or minus 354 parts per billion.  In the 100
to 1000 range, we had eight data pairs, when we compared labora-
tory one with laboratory two.  In this set, the average concentra-
tion of all the values that were reported is 394 parts per billion.
The one sigma was plus or minus 87 parts per billion.  When we
included the third laboratory, we were only able to use five data
points.  Obviously one laboratory did not find some of the com-
pounds and we can only statistically analyze real numbers, not un-
detected parameters.  In this study, the average concentration for
the entire data set was 246 parts per billion and the one sigma
standard deviation was plus or minus 257 parts per billion.
When you drop to the less than 100 parts per billion range, a lot
of interesting things start to show up.  The first consider-
ation we had included 14 data pairs.  The average concentration
of all the data reported was 27.6 parts per billion, and if one
compares laboratory one with laboratory two, the standard devi-
ation is 19.1 parts per billion.  If you do the same comparison
with laboratory one and three, the average concentration of all
the data pairs, on the 15 data sets, is 63.6, and the one sigma
is 96.5.  If you compare laboratories two and three, it's 75.4;
one sigma is 134.  If one takes the data from all three labora-
tories, and there were 11 compounds where everyone had some num-
bers to report, the average concentration in this particular
study was 51.5 parts per billion; the standard deviation, or one
sigma, was 86.4 parts per billion.  I'll go down to the second
study.  We do not list
for the greater than 1000, or from 100 to 1000 parts, simply be-
cause we just didn't have enough data points.  Below 100 parts
per billion, the average concentration of all the compounds re-
ported was 33.9 parts per billion.  The standard deviation be-
tween laboratory one and laboratory two, was 18.9 parts per bil-
lion.

     [Table:  statistical evaluation of studies one and two, giving,
for each concentration range, the number of data pairs, the average
concentration, and the one-sigma standard deviation.  Figure 1,
Laboratory 2 Repeatability, plots the second analysis against the
first analysis, in micrograms per liter; the companion repeatability
and inter-laboratory plots are not legible in this copy.]

     In this particular study, we did replicate analysis and
we were able to get intra-laboratory information.  With laboratory
one, adding up all the data points from their 14, the average con-
centration was 36.2, and their within-laboratory repeatability,
the one-sigma standard deviation, was plus or minus 9.7 parts
per billion.  For laboratory two, the average concentration was 31.5
and the standard deviation, one sigma, was 4.2 parts per billion.
If we go back to the inter-laboratory study on this last program,
we see that the average concentration was 31.5 and the standard
deviation is 18.9.  If you use the 95 percent confidence limit,
or level, you could say that if the component is there at 34 parts
per billion, the result is 34 parts per billion, plus or
minus 38 parts per billion.  This data shows the standard devia-
tion with respect to absolute concentration.  You can also take
this information and display it using the coefficient of variation,
for those who would like to use that procedure.  The story is the
same, it's just showing the data in a different format.  If you
take the information that is included in this table, you can also
display it in a graphic form.  You also learn something about the
laboratories that you study.  This particular laboratory was lab-
oratory two, and this shows the repeatability data.  All of the
points represent the first value plotted against the second value;
the scatter along the 45-degree diagonal line is some indication of
the random error inherent in the procedure.  We have included the
two sigma, or the 95 percent confidence level, lines also on this
plot.  It shows the data that we previously presented, and this
particular laboratory has a repeatability of plus or minus 9
parts per billion, when the average concentration of all of the
components is at the approximately 30 parts per billion level.
We have plotted the data for laboratory one in a similar fas-
hion.  In this case it shows the random error around
the 45-degree diagonal, and the two sigma, or 95 percent confidence
limit, in this particular case is very near 20, where all the
components average out at about 35 parts per billion.  These are
the intra-laboratory repeatabilities.
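
     (For illustration:  a minimal sketch, in Python, of the
alternative displays just described -- the coefficient-of-variation
form, and the two-sigma, 95 percent confidence, lines drawn about the
45-degree diagonal of a first-versus-second-analysis plot.  Names
here are illustrative.)

    def coefficient_of_variation(sigma_ppb, mean_ppb):
        """Sigma expressed as a percentage of the mean concentration."""
        return 100.0 * sigma_ppb / mean_ppb

    def two_sigma_band(first_values_ppb, sigma_ppb):
        """(x, lower, upper) points for the confidence lines on the plot."""
        return [(x, x - 2.0 * sigma_ppb, x + 2.0 * sigma_ppb)
                for x in first_values_ppb]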

     If one takes the data, and compares one laboratory with another,
laboratory one and laboratory two, one can see that the standard de-
viation has increased.  The two sigma, or the 95 percent confidence
level, at the 34 parts per billion range, is now plus or minus 38
parts per billion.  Inter-laboratory repeatability is approximately
two to three times that of intra-laboratory.  The data also shows
that in addition to random error with the GC/MS methods, there is
also systematic error in certain laboratories.  The first study in-
cluded three laboratories.  If one plots the data of laboratory
three against laboratory two (and it doesn't make any difference,
you can also plot laboratory three against laboratory one and you
get a very similar display), one can see that the standard devia-
tion, or the 95 percent confidence limit, two sigma, for the 34 parts
per billion range is now plus or minus 273 parts per billion.  One
of the most important things shown on this kind of display
is that it is very obvious that laboratory three has a high bias
with respect to laboratory two or laboratory one.
In this presentation I have not identified laboratory one, two,
or three.  In all three cases, the laboratories involved knew
that they were being studied.  I should clarify that we were
not studying the laboratories, we were really studying the GC/MS
method.  When you do a collaborative study, it should be pointed
out, as it was, that you are really studying the method, not the
analyst, not his equipment, and not the particular laboratory
involved.

     I'd like to go back to my original problem, when some plant
manager comes up to me and says, look, you have reported that my
effluent contains 34 parts per billion of 1,2-Dichloroethane.
What does this mean, where is the guideline going to set the num-
erical limit?  What I can report back to him is that, with two
laboratories using over a half million dollars' worth of equip-
ment, 95 percent of the time it is possible to say that there
were 34 parts per billion of that compound in his effluent, plus
or minus 38 parts per billion.  I can also tell him that if the
guidelines choose to set the limits for that particular compound
at 34 parts per billion, and indeed his effluent contained, abso-
lutely, 34 parts per billion, the two laboratories, 50 per-
cent of the time, will yield data stating he is out of compliance.
The example I give you is real life; the names of the compounds,
and the people involved, have been changed to protect the cur-
rently employed.  I do not intend to be sarcastic, and I would
like to go on the record as stating that I believe the GC/MS
methodology is the current state of the art.  It also may be the
most cost-effective way to handle the problems that we face.  I
also think it should be recognized that there are limitations to
the GC/MS methodology; these limitations need to be recognized
and they have to be addressed in a reasonable fashion by people
who will be using these particular procedures.  Thank you.
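
     (For illustration:  a minimal sketch, in Python, of the
arithmetic behind that closing example, assuming a roughly normal
measurement error with the one sigma of about 19 parts per billion
implied by the quoted 95 percent band of plus or minus 38.)

    import math

    def prob_reported_over_limit(true_ppb, limit_ppb, sigma_ppb):
        """P(measured value exceeds the limit) under a normal error model."""
        z = (limit_ppb - true_ppb) / sigma_ppb
        return 0.5 * (1.0 - math.erf(z / math.sqrt(2.0)))

    # A true concentration exactly at the limit is reported out of
    # compliance half the time:
    print(prob_reported_over_limit(34.0, 34.0, 19.0))   # prints 0.5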

     Mr. Telliard:  Questions?  You're not going to let him off;
you can't just take this lying down.  He's not going to sit there;
come get him.  Thanks, Bob, I was worried.

     Mr. Kloepfer:  George, you made one statement about the study:
you said that you were studying the method and not the laboratory.
I don't see how you can possibly separate the two.  In the data
that I showed awhile ago, presumably all three labs were using the
same method, yet the precision was quite different in the three
laboratories, and I don't see how the distinction can be made be-
tween studying only the method and not taking into account the
differences in laboratory experience.

     Mr. Stanko:  To answer this question, when you initially set
up the study, you don't know that one laboratory is any better
than another laboratory; you can only determine that after you
obtain the data and start plotting it in the format that we have.
We did not know one laboratory was biased high, one was biased
low; we did not know that, in addition to random error, there was
systematic error, at least at two locations.  I should point out
that between laboratory one and laboratory two, the system-
atic error is probably reasonable.  It's not much, but it is
present.  You can calculate it.  Another way, a simple one that
we've been using as a rule of thumb, is to take the difference
between all of the data points, between the two laboratories.  If
it is strictly random error, the summation of this, the plus or
minus, should be near zero.  If you have an absolute value, either
a plus, or a minus, then one of the laboratories is either biased
high or low with respect to the other laboratory.  It's a simple
way of doing it.  The treatment we have given this data is very
rigorous.  It's very strict, and we've stuck right to the book.
There are shorter procedures that yield very similar statistics
and they're quicker to use.
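
     (For illustration:  a minimal sketch, in Python, of that rule
of thumb.  A signed sum near zero is consistent with purely random
error; a clearly positive or negative total points to one laboratory
running high or low relative to the other.)

    def bias_check(lab1_ppb, lab2_ppb):
        """Sum the signed differences between paired results."""
        diffs = [a - b for a, b in zip(lab1_ppb, lab2_ppb)]
        return {"signed_sum": sum(diffs),
                "mean_bias": sum(diffs) / len(diffs)}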

     Mr. Neptune:  George, I have one thing I'd like to say.  You
make a couple of assumptions here which are more than a little
dangerous, at least on your part.  Number one is that the infor-
mation you're providing us is based on the screening protocol, is
that correct?

     Mr. Stanko:  Dean, 60 percent of the data was prepared by GC/MS,
using the VOA procedure.  I'd like to point out that this procedure
is in draft six at ASTM, which is a consensus standard, so you can-
not say that the data are not reliable, or valid; the method will
be a consensus standard very shortly.

     Mr. Neptune:  Does that mean you didn't use the screening
protocol, George?

     Mr. Stanko:  The screening protocol and ASTM procedure are
practically identical.

     Mr. Neptune:  Yes, or no, you did, or you didn't?

     Mr. Stanko:  Yes, we used the screening protocol.

     Mr. Neptune:  Okay, fine, and the screening protocol was in-
tended, as it says in the foreword, to be a guide, and, thus, being
a guide, it allows for individual interpretation between labora-
tories, that's one point.

     The next point is that the screening protocol was also in-
tended for qualitative determinations of presence or absence, and
never intended to be a quantitative tool for the very reason that
there was a great deal of flexibility being allowed to the given
laboratory interpreting how it should be used so that we could
obtain basic samples on which we had very little information to
start with.  Because so little work had been done in this area,
the analysts needed as much flexibility as they could possibly
have in order to run a large number of samples from different
wastewater matrices, so, that's an important point.  If it had
been a strict, very finitely outlined procedure, many of these
samples would never have been run.  They just would have been im-
possible to run....  That's why a certain degree of professional
judgment, I think, on the part of the analyists had to be allowed
in screening, and that was consistent with the objectives of
screening, to make a qualitative determination and a quantitative
estimate.

     Lastly, in regard to your concern about using this particular
type data and defining a guideline, and, hence, probably even a
regulation, you took another quantum jump and assumed that the
screening data would be used to set numerical limits.  I've said
it a number of times and I'll say it again.  To set numerical
limits per se, the screening data was never intended for that pur-
pose, and, so, some of your concerns, based on your assumptions
if they were valid, would be of great concern to me, too.  But
when you come back and look at your assumptions you find out that
the assumptions aren't valid and that makes a great deal of differ-
ence in the conclusions that can be drawn from the information you
presented.

     Mr. Stanko:  Dean, the information and the data that I pre-
sented here did not address the phase, the screening phase, the
verification phase, or anything.  I said we evaluated the GC/MS
procedure as it's spelled out in the protocol.  We were given an
appendix to that protocol yesterday, which allows the use of those
particular methods for the verification phase, I don't want to get
into that.  In addition to following the EPA protocol procedure, we
incorporated in this study, and I explained it, some of the things
that are outlined in the document that just came out of Cincinnati.
I don't know if we took a page out of their book, or they took a
page out of our book.  The surrogate addition of spikes in the
field, with priority pollutants, all of these things were included
in this.  So, in addition to what you call screening phase metho-
dology, it was well beyond screening phase methodology.  We had
some QC and QA above and beyond the protocol and something that
borders close to what's being outlined in the new document.  In
relation to the guidelines being set, I made quantum leaps because
I don't know when the numbers are going to come out and I don't
have the slightest idea what they will be.  I'm making the point
that the analytical variability is something that has to be recog-
nized, considered, and addressed, no matter in what phase you use
the methodology.  The cases that I have given you here were very
tight, well-controlled, designed experiments.  We think we've
covered all the angles that were possible in a time frame with the
money involved.  I think what I have presented, and I caution you,
this is limited data, is probably the best situation that you can
expect from that point.  Does that answer everything?

     Mr. Prescott:  I would like to ask the Agency once and for all
to get their act together on what the hell the screening data is.
We have heard in the last six months, that we are not to worry
about the screening data.  We have heard it's only qualitative data,
and now we have a member of the Agency get up and discuss George's
data saying the same thing.  Yet, a young lady was at that podium
earlier today who said, we are doing precision and accuracy studies
on screening protocol.  Now, you can't do precision and accuracy
on qualitative methods.

     Mr. Stanko:  Are there any other questions?
     Mr. Lichtenberg:  George, I would just like to ask you one
question, you talked about the systematic error that was obvious
in here.  I would just like to ask if you did anything to try to
find out what the systematic error was, and was there any attempt
to correct it, or will there be an attempt to correct it in any
further work you're doing?

     Mr. Stanko:  Jim, the only thing I could say is that, given the
data I presented, I may not be currently employed when I get back.

     Mr. Lichtenberg:  Want to work for us?

     Mr. Stanko:  What's your offer?  As I stated earlier, this is
only an incomplete reporting.  We have at least two additional
studies going on in addition to what I've presented here.  This
was hot off the griddle, it was completed approximately a week
ago, less than a week ago, really.

     Mr. Flory:  George, it's not really fair to criticize without
offering something constructive for the future, and I just wondered
if you have any ideas.  Do you think this method can be improved,
do you have some things in mind?  One of the things I've been think-
ing about is that the people who have been doing the work, in the
lab, who are very familiar with it, might get together in a group
and just go through the protocol and see where it can be improved.

     Mr. Stanko:  At this point, I don't have any astounding rev-
elations, or ways of improving it.  I think the method is probably
pretty good.   When you start working in the lower concentration
levels, I think you've got to recognize that errors are important,
not only random error, but systematic error, and to try to pin
down what systematic error is, what is one laboratory doing diff-
erently from another laboratory.  The only thing I can say is that
the method has to be put through something that's called a rugged-
ness test.  That's pretty well outlined in this manual.  I suggest
this is a pretty good Bible, get it and read it.

     Mr. Flory:  Well, now I think you have to take into account
what Dean said, too, and that is that the method is not that rigidly
specified, and...

     Mr. Stanko:  That may be part of the problem.

     Mr. Moseman:  I have a question.  You also implied at the be-
ginning of your talk that there was some variability with the qual-
itative results.  You didn't really present any data on that.  The
screening protocol is supposedly for qualitative results; I'd like
to get some comments about that.

     Mr. Stanko:  The numbers came out.  There were 47 organic pri-
ority pollutants reported, and of these, only 9 were found by our
laboratory, and in this particular case, only one really agreed
quantitatively.  It is a real horror story when you present the data;
management comes by and says, why did I spend $175,000 to buy you
a 4023 and you generate numbers like that; is that the best that
you can produce?  These are serious questions.

     Mr. Moseman:  Well, I was referring to the study with the
data you presented on the three labs in the comparison study.
What were the qualitative results of that study, and how far off
were the few labs qualitatively?

     Mr. Stanko:  In that particular study, there were 20 priority
pollutants in the zero to 100 parts per billion range.  Of the 20
priority pollutants that were found, there was only this much agree-
ment:  everyone reported a number for only four.  At least two
laboratories, one and two, reported as many as 14 values.  In other
words, there were false positives and false negatives.  Now, the data
that we showed didn't include any cases where one laboratory reported
a priority pollutant and the other said, not detected.  In the three-
laboratory study, if two laboratories found a compound at 10 and 20,
and the third laboratory didn't find it, we did not include that
data set.  This is why our data set is rather limited, not because
we didn't try, but because the numbers didn't turn out that way.

     Mr. Flory:  I can certainly tell you the reason for that and
that is the procedure that defines how you search for these com-
pounds.  For example, it doesn't state whether you use background
subtraction, or not use background subtraction, whether you en-
hance or don't enhance.  If you're talking Incos, and if you make
a mass chromatogram plot and there's high background, when you
look over on the left at what the relative abundances are, they
may be way off and you might just look at it and say the compound
is there, and it may not be there.  So, there are many things like
that that can be examined.

     Mr. Stanko:  It's true.

     Mr. Neptune:  Don, to reinforce a comment that Jim Lichtenberg
made this morning:  if you, or anybody else, have any contribution
you could make on making the screening better, get it to Jim.  Jim
Lichtenberg mentioned it today, Bob Booth mentioned it in Savannah,
and it's in the protocol:  if anybody has got a contribution to
make, get it to Jim.  I don't know that you need to get together
and have a meeting.  I do know you're going to have to get it to
Jim, because all the knowledge in the world is useless unless it
gets to somebody who can start to do something with it.  Anybody
that has any input, get it to Jim, get it to EMSL.

     Mr. Stanko:  Are there any other questions or comments?

     Mr. Telliard:  Due to the fact that you guys want to talk
about all of this nonsense so long, I'm not letting you out of
here to eat the strawberries, we'll just keep right on going.
Checkout time is at 1 o'clock, we still have a long program ahead,
so get out and have your strawberry, get your coffee, and get
back in here, ten minutes, no more.
                               95

-------
                             RECESS
     Mr. Telliard:  It's time again, folks.  We'd like to have
Priscilla Holtzclaw talk on what we're doing on asbestos.  This
will not be the three-hour presentation that we had in Denver, but
it will keep you informed of at least what we're doing, and if
George is upset with the GC/MS methodology, we've really got a
good one for him.
                                96

-------
                              ASBESTOS
                            P. Holtzclaw
                               USEPA
     Ms. Holtzclaw:  As most of you I'm sure are aware, inhaled
asbestos has been recognized for many years as a definite car-
cinogen.  The health effects data on ingested asbestos do not
as clearly delineate a problem, however.  There are many reasons
for this, but the result has been that asbestos has not received
the top priority in Effluent Guidelines Division that it neces-
sarily would have had, had it been recognized as a specific,
definite carcinogen.  Among the 129 priority pollutants, it is unique
both its definition and in the sampling and analysis procedures
necessary for it.  The term "asbestos" encompasses the fibrous
and in some cases, by definition, the non-fibrous forms of some
six minerals.  Very little health effects data is available at
this time to delineate which, if any, of these morphologies pro-
duce a high degree of carcinogenicity upon ingestion by man, but
current studies are beginning to produce more positive results.
They are on-going, but the problems with them include the determ-
ination of the critical aspects of the fiber; for one thing, does
the size of the fiber make a difference, the flexibility of the
fiber, the morphology of the fiber, the resiliency of the fiber,
et cetera? There has been a problem with development of a pure,
characterized, standard material.  To date, there is no indica-
tion of the accuracy of an asbestos measurement.  There have been
problems with the long period of time needed for tumor induction
in laboratory animals, and in man himself.  Because of the lack
of definition, and because of the cost involved in asbestos analysis,
which gives data only on the one pollutant, many of the project
officers in Effluent Guidelines have more or less avoided the
problem of analyzing samples for asbestos.

     Also, to date there have been few laboratories that have
really been capable of doing the work, or have the equipment
necessary to meet the standards that Effluent Guidelines would
impose.  However, we are currently in the process of deter-
mining a contract award for six analysis lots which make up a
total of 1500 samples for asbestos analysis.  The program has
been set up differently than most of our sampling and analysis
programs have been.  Sampling will be treated as a self-sampling
venture with the industries.  Samples will be taken from one
industry in each industrial subcategory.  In some industries,
there will be a large number of samples taken, in others there
will be a relatively small number.  The facility will be selected
as being representative of the particular subcategory in which
it is included.  Samples will be taken at three spots.  Normally
in the plant, the raw water, the influent to treatment, and the
effluent from treatment will be sampled.  I say normally three,
because in some plants there may be two sources of raw water.
There may be two points at which the raw waste load is dumped into
the treatment plant.  All of the samples will be sent to an IFB
analytical laboratory by the plant.  The influent to treatment,
and effluent from treatment will be prepared and analyzed within


                                97

-------
30 days.  As to the raw water, or the influent to the plant itself,
the filter grid will be prepared and it will be archived.  This
sample will be analyzed only if significant levels of fibers are
found in the effluent from the plant itself.

     The program has been designed to have minimal impact upon the
industries.  When a plant is selected for asbestos analysis, it
will be notified by the project officer and by the Analytical Pro-
grams group.  They will then be asked to reach a mutually agree-
able date for sampling, and will be sent a self-sampling kit which
will include bottles that have been cleaned to avoid any asbestos
contamination.  The kit will also include instructions on how to
sample; it will include a trip report in which they will describe
the sampling location; and it will include shipping labels for
shipment to the analysis laboratory.  There will be very little
cost to the plant, other than several hours, perhaps, involved
in collecting the grab samples.  Effluent Guidelines Division
will be responsible for the cost of the shipment to the labora-
tory, which will be by Federal Express.  We have put an emphasis
on turnaround time in this contract, because of the uncertainty
as to how long a sample will remain intact after it is taken.
There is a question as to whether bacterial growth will occur or
as to whether leaching of the magnesium may occur in certain of
the effluents that we will be sampling.  This is intended as a
screening survey of the industries.  It is simply to tell us
where the asbestos is present, where we may have a problem, and
to allow project officers to eliminate asbestos from their con-
sideration if it does not appear in the plants in which they are
interested.  However, we have built into the contract enough
quality assurance so that we can be assured that the instrument
is operating at its peak capacity and that the personnel are
capable.  This will, hopefully, allow us several years from now to
be able to use the data that we have generated today.  Asbestos
analysis is still developing, but we feel at this time we have a
procedure that will basically be the same procedure that will be
in effect in several more years.  The one that we are using, by
the way, is the interim procedure that has been developed at the
Athens laboratory.

     The program will be controlled in a way similar to that in
which our organics program is presently controlled, in that the
Sample Control Center will eventually take the responsibility
for the sampling and tracking of the samples.   The plants will
be chosen on several bases within each subcategory.  First of
all,  we would like it to be representative of the process for
that subcategory.  Second, it may be chosen if we feel that
there is a treatment process at that plant that will be useful
in the treatment of asbestos fibers.  Then there are several
smaller criteria; one, we are hoping to work closely with the plant
since it is a self-sampling venture so we will be looking for
plants probably that have been sampled before, that the project
officer has some information on and that will cooperate with us.
Then, we will also be looking at plants that are in close prox-
imity to an airport so that they don't have any problem getting
the samples to our laboratories.


                                98

-------
     The data base that we get from this will be not only on the
discharge of asbestos, but we will also be providing other Divi-
sions within the Agency with information on asbestos in the raw
waters, in the ambient waters.  We will also begin to build a
data base on the type of available treatment technologies that we
have today.  As I said, this is a screening venture.  The numbers
are not going to be used explicitly for regulation, but they will
give us an indication as to whether asbestos is a problem and
will help us to focus further efforts in this area.
                                 99

-------
                            METALS ANALYSIS
                               M.  Carter
                                 USEPA
     Mr. Carter:  I'm going to give a brief description of what's
going to be happening with the elemental IFB's -- in other words,
the contracts that will be used to provide us with analytical da-
ta on the metals in the consent decree,  plus a number of others.
The other elements that will be analyzed for under the terms of
this contract include metals that may present interferences in
the analysis of the priority pollutant metals.  So,  as we have
the sample available, we want to go ahead and acquire this data
from one sampling trip.  Some of the other metals are in there
in case a project officer thinks he needs data on them.  There
are a few, like gold and palladium, that we may not have any
determinations for, but they are there in case they are needed
or of interest.  They will be requested specifically by the
project officer, if he's interested.

     The QA/QC procedures that will be used with the ICAP are
basically the ones that have been in use at the Region Five
laboratory in Chicago.  We picked up the program from them, and
the contractors will be required to implement an equivalent QA
program.

     At this point the ICAP results will be considered to be
screening data pending method validation by EMSL, Cincinnati,
to show that the ICAP results are, indeed, equivalent to AA
results.  After we get a certification of the method, we expect
that the ICAP results will become verification-type data.  Until
that time we will use AA as a quantitative confirmation of ICAP
if it is required.  There will be new tracking sheets, traffic
report type sheets, provided through the Sample Control Center.
We'll be able to keep up with the samples -- where they're taken,
which contract lab they will be going to -- and make sure that we
get all of the data reported on time, as it's being done with or-
ganics at the moment.  The contract bids were opened a week ago.
We will be making our site evaluations next week, and we expect
to get the contract awarded, hopefully,  the first week in April.

     Mr. Telliard:  The next part is the part that you all came
for.  Early on we had looked at two sets of programs.  One is
ours, the industrial part, but we're also required to  set pre-
treatment standards in many of the industrial categories.  We
had to  look at the discharges to publicly owned treatment plants
and at the same time we were involved in evaluating as almost an
industrial category, the publicly owned treatment plants.  As
the program developed, we also got into a question of trying to
do materials balances.  We had it in the influent, and it wasn't
in the effluent; where did it go?  Of course, we all knew the
answer; it went in the sludge.  A number of us attempted to ana-
lyze sludge, both municipal and industrial.  For this, there was
no method.  A methods development effort had to be initiated, and that's
                               100

-------
what we're going to talk about today.  Some methods development
programs have been initiated.  The experiences of some of the
analysts in trying to run both POTW and industrial sludges will
address the original sampling question of materials balances.

     First we will look at some of the efforts made to determine
the priority pollutants entering Municipal Treatment Plants from
all sources in order to give you a feel for what is happening in
that area.

     One of the studies that was done early on was by ADL,
looking at the "source assessment".  ADL will tell you what
they were doing.
                               101

-------
                   Source Assessment of Priority
                     Pollutants Entering POTWs
                             K. Thrun
                                ADL
     Ms. Thrun:  We're actually doing a source assessment of
priority pollutants entering POTW's.   What I'd specifically like
to share with you today is the quality control data that we
gathered as part of a program in which we're analyzing for priority
pollutants in sewage.  We felt this would be of general interest,
because of the range of sample matrices that we're analyzing and
spiking priority pollutants into.

     To give you a little background, the Office of Water Planning
and Standards is presently engaged in developing regulatory stra-
tegy for priority pollutants entering publicly owned treatment
works,  or POTW's, formerly known as municipal sewage treatment
works.   ADL has been requested to do the source assessment portion
of that study. In other words, we're determining where the  prior-
ity pollutants are coming from that are entering the POTW's. The
first POTW area that we analyzed samples from was the Muddy Creek
area in Cincinnati, Ohio.  That particular area provided us with
the raw wastewater samples that we spiked priority pollutants
into, and subsequently, the quality control data that I'm going
to be presenting.  We've also gone to St. Louis, and we'll be
going to a couple more cities.  We're not doing the sludge portion
of this.  We are doing the raw wastewater part of it.

     We used the 1977 EPA screening protocol to quantify and iden-
tify the priority pollutants in sewage samples and, in addition,
we incorporated an EPA quality assurance program to document the
reliability of this particular methodology for all of the priority
pollutants.  At the outset of this work -- I should mention this is
the first time we ran the priority pollutant protocol -- we did
have some doubts about the quantitative ability of the EPA screen-
ing protocol.  Based on our Cincinnati quality control data, that
protocol is capable of generating reliable data:  the overall re-
covery for all 111 priority pollutants that we spiked in was 90
percent, plus or minus 25 percent.

     The specific quality control activities that we used in this
program were based on the general recommendations in the March  '78
EMSL document.  That document was designed for the verification
study in effluents.  We needed to modify that since we were essen-
tially doing a screening activity.  In other words, we didn't have
screening data that told us either what priority pollutants were
in a sample, or at what concentration levels they were present.
That meant, for us, the two important quality control differences
from the EMSL document were that number one, all the priority pol-
lutants were spiked into our sample and not just those detected
in our samples; and number two, we had to do blind spiking, since
we didn't have even a ball park figure for sample concentrations.
                               102

-------
     The final QC procedures were based on the EMSL document, and
also developed from discussions with our EPA project officer.  We
had a quality assurance adviser from Region Five, Billy Fairless.

     We collected a total of 69 samples in Cincinnati.  Fifteen of
those samples were chosen to be quality control samples.  The criterion
that we used for that choice was that the sample matrices covered
a range of source types, or sewage types.  We had industrial sew-
age, commercial sewage, and residential sewage.  Those 15 quality
control samples were analyzed interspersed with all of our sample
analyses.  We used the QC results to develop our quality control
limits for all of our future work.

     A quality control sample is a bit prolific; what was one field
sample rapidly becomes five samples to be analyzed in the labora-
tory.  The field sample arrives in the laboratory and is divided
into three aliquots, two of which (A and B) are spiked with prior-
ity pollutant stock solutions, and the third, C, is analyzed as
our unknown sample.  That's a basic difference between us and the
EMSL document.  The duplicate in the EMSL document is done on the
sample unknown.  As mentioned before, we were looking for data on
all of the priority pollutants and not just those in the sample,
therefore all the priority pollutants were spiked into two of the
field samples.  The fourth sample associated with the field sample
aliquots is a clean water spike.  It's known as a method reference
standard, which we've indicated as D, and was ultrapure water
spiked with priority pollutants at the same levels as the two field
samples.  The fifth sample (F) is the method blank, which is merely
ultrapure water to which we've added appropriate internal standards.
Field blanks were also analyzed, and at a minimum, two calibration
standards per day were analyzed.
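
     To make the bookkeeping concrete, the expansion of one field
sample into five laboratory analyses can be sketched as a small data
structure (a minimal illustration in Python; the labels follow the
A-F scheme above, and "E" is not mentioned in the talk, so it is
omitted here):

    # How one field sample expands into five laboratory analyses,
    # following the A-F labeling described in the talk.
    qc_expansion = {
        "A": "field sample aliquot, spiked with all priority pollutants",
        "B": "duplicate field sample aliquot, spiked the same as A",
        "C": "field sample aliquot, analyzed as the unknown",
        "D": "method reference standard (ultrapure water spike)",
        "F": "method blank (ultrapure water plus internal standards)",
    }

    for label, role in sorted(qc_expansion.items()):
        print(label, "-", role)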

     I wanted to indicate by a couple of examples how and where
priority pollutants were spiked into quality control samples.  For
the volatiles, we took a 5 ml sample aliquot, spiked in the inter-
nal standards, and the 30 priority pollutants in the volatile cate-
gory to obtain a concentration level of 20 micrograms per liter.
This aliquot was subsequently analyzed, using the EPA screening
protocol.

     For the acids category, we took a 1 liter sample aliquot,
spiked in the d10-anthracene internal standard, and the 11 prior-
ity pollutants in the acidic category to obtain a concentration
level of 50 micrograms per liter.  This aliquot was subsequently
analyzed, using the EPA screening protocol.  The rest of the cate-
gories -- base neutrals, pesticides, metals, et cetera -- were
done similarly.  As I mentioned before, a total of 111 priority
pollutants were spiked in.  We were looking for a total of 130.
We had manganese added to our list.  Not all the standards were
available then.  The St. Louis QC samples now have 127 priority
pollutants in them.
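
     As a check on the spiking arithmetic above, the mass of each
compound added follows from the target concentration and the
aliquot volume (a minimal sketch in Python; the function name is
illustrative, not part of the protocol):

    # Spike mass needed to reach a target concentration in an aliquot:
    #   mass (ug) = concentration (ug/L) * volume (L)
    def spike_mass_ug(target_ug_per_l, aliquot_ml):
        return target_ug_per_l * (aliquot_ml / 1000.0)

    # Volatiles: a 5 ml aliquot spiked to 20 ug/L needs 0.1 ug
    # (100 ng) of each compound.
    print(spike_mass_ug(20, 5))       # 0.1
    # Acids: a 1 liter aliquot spiked to 50 ug/L needs 50 ug.
    print(spike_mass_ug(50, 1000))    # 50.0
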
                                103

-------
     Once the concentration data was determined for the QC samples,
we did a few very simple calculations.  For our method reference
standard (D),  which was the clean water spike, we calculated a per-
cent recovery, and the standard deviation of the percent recovery.
For the field sample spikes, we performed calculations of the same
type.  The only thing I wanted to indicate here is that all of our
spikes were field sample corrected.  In other words, if we did find
a priority pollutant in the sample, it was subtracted from the
spikes, so we're not getting abnormally high recoveries because of
contributions from the field samples themselves.
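
     The field-sample-corrected calculation just described amounts
to subtracting the unspiked sample result (C) before dividing by
the spiked amount; a minimal sketch in Python, with hypothetical
numbers purely for illustration:

    import statistics

    # Field-sample-corrected percent recovery: the analyte already
    # present in the unspiked aliquot (C) is subtracted first.
    def percent_recovery(measured, background, spiked):
        return 100.0 * (measured - background) / spiked

    # Hypothetical values: spiked aliquot A measured 52 ug/L, the
    # unspiked aliquot C contained 4 ug/L, and 50 ug/L was spiked in.
    print(percent_recovery(52.0, 4.0, 50.0))    # 96.0

    # The reported figures are then the mean and standard deviation
    # of such recoveries over the QC samples.
    recoveries = [96.0, 88.0, 102.0]            # hypothetical
    print(statistics.mean(recoveries))
    print(statistics.stdev(recoveries))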

     This slide shows the averages of all the percent recoveries
and standard deviations for all the priority pollutants within a
category.  The first row is the category of compounds analyzed
for.  The second row, micrograms per liter, is the concentration
level to which the priority pollutants were spiked.  Next, the
average percent recovery and standard deviation for clean water
spikes, and then, the percent recovery and standard deviation for
the raw wastewater spikes,  or field sample spikes.  What is as-
tonishing to me is the consistency of these numbers as you move
from a GC/MS technique, to a GC/ECD technique, to an atomic ab-
sorption spectroscopy technique, to a colorimetric technique.
Apparently, the protocol, except in the case of the metals which
I'll explain in a moment, is uniform for the different kinds of
methodology.  The reason that the metals data was poor was because
the metals were spiked at too low a concentration level, 10 micro-
grams per liter.  That concentration level, for several of the
metals, was below our method detection limit, and, at times, was
insignificant compared to the level of priority pollutant metal
that we found in the Cincinnati samples.  We also did a preliminary
evaluation.  This data is based on clean water spikes, where the
spiking level was 40 micrograms per liter.  I think you can see
the obvious difference in the standard deviation.  At 10 ug/L it
was an average of 115%, plus or minus 71, and at 40 ug/L, it was
112%, plus or minus 7.  We were unable to average out recovery data
in raw wastewater, so that plus or minus 17 percent figure is essen-
tially a measure of precision of the actual sample priority pollu-
tant concentration.  We then calculated the reproducibility of
replicate calibration standards by GC/MS.  The relative standard
deviation of replicate standards was no better than 20 percent;
obviously, a lot of the error in the GC/MS methods is due to the
GC/MS system itself, and not the sample preparation.

     I've extracted a few examples of the priority pollutant pre-
cision and accuracy data from which the averages on the previous
slide were calculated.  I've tried to select good, bad, and average
data.  This slide shows 5 of the 30 volatile priority pollutants
spiked into the QC samples.  The first column is the particular
priority pollutant we were looking at.  The next block is the
average recovery and standard deviation for the clean water spike
(D).  The third block is the raw wastewater data (A and B).  C is
the concentration to which we spiked in the priority pollutants.
RC is a number from the EMSL document; it's a measure of precision
between the duplicate field sample spikes, and, then, the average
recovery and standard deviation.  Most of the data is looking more


                                104

-------
like the last three compounds listed on the slide and that's why
the averages were as good as they were.  Once again the recoveries
always look good, and I think the precision is within what you
would expect the methodology to be capable of producing, and it's
uniform from compound type to compound type.

     This slide shows six of the 11 acids that we spiked into the
QC samples.  We calculated the theoretical recovery based on par-
tition coefficients into chloroform.  The theoretical for 2-
chlorophenol is 95 percent:  The average experimental recovery
from clean water was 86%, and from raw wastewater, 93%.  The theo-
retical recovery for phenol is 54 percent:  the average experi-
mental recovery from clean water was 77%, and raw wastewater 66%.
For 2,4,6-trichlorophenol, the theoretical recovery was 100
percent:  the average experimental recovery from clean water was
98 percent and raw waste water, 108 percent.  Apparently, the pro-
tocol can do as well as theoretically possible.
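
     The theoretical recoveries quoted above follow from the usual
equilibrium-partitioning relation for a batch liquid-liquid
extraction; the sketch below is illustrative only, since the talk
does not give the partition coefficients or phase volumes actually
used:

    # Fraction of solute recovered after n equal batch extractions,
    # assuming equilibrium partitioning with coefficient
    # K = C(organic) / C(aqueous) and phase volumes v_org and v_aq.
    # The fraction left in the water after each pass is
    #   v_aq / (v_aq + K * v_org).
    def recovery_fraction(K, v_org, v_aq, n=1):
        return 1.0 - (v_aq / (v_aq + K * v_org)) ** n

    # Hypothetical example: K = 10, 60 ml of solvent against
    # 1000 ml of water, three extractions -> about 76 percent.
    print(recovery_fraction(10.0, 60.0, 1000.0, n=3))   # 0.7559...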

     Once again, it's the uniformity of the results that amazed
me.  I should also point out that there's not much difference be-
tween the ideal case that's represented by the clean water spikes
as compared with the raw wastewater spikes.

     This slide shows 5 of the 33 base neutral priority pollutants
spiked in.  We had a few bad actors here.  The data for bis
(2-ethylhexyl) phthalate and benzo(a)pyrene is not particularly
good.  Once again, to get the averages that I showed you in the
beginning, most of the base neutral compounds are looking like
the first three there, and not the last two.

     This slide shows 4 out of the 30 pesticides spiked in.  These
were spiked in at approximately 30 micrograms per liter.  For St.
Louis we did lower the total amount spiked to 10 micrograms per
liter, and the results are consistent with what we're seeing at 30
micrograms per liter.

     Now, since the Cincinnati data for metals was looking a little
strange, this slide shows the St. Louis metals data.  The differ-
ence from the Cincinnati spiking was that the metals were spiked
at higher concentration levels.  These are the results of the
atomic absorption analysis.  The recoveries are reasonable and
uniform, again.

     The next slide, please.  Another difference between the
Cincinnati and St. Louis analyses, is that plasma emission spec-
troscopy was used to analyze for 7 out of the 14 metals spiked in.
These results were really astonishing.  Manganese, based on 8 data
points, was 100 percent recovered with a standard deviation of 0.4.
Plasma emission, for us, was the preferred technique to use for
these samples.

     This slide is a list of the priority pollutants which were
spiked into the samples but not detected.  I don't
think this is a surprising list to anyone; I think everybody has
discovered this.  The other problematic compounds were methylene
                                105

-------
chloride, nitrophenols, benzidine, and phthalates.  Out of all
the priority pollutants spiked into the Cincinnati  samples,  only
5 percent were not detected.   Those compounds not detected were
associated with this list and the other compounds I mentioned.  As
I said before, we were uncertain about the quantitative abilities
of the EPA screening protocol.  I think an average  recovery of
90 percent, with a relative standard deviation of 25 percent is
quite good.  Furthermore, the uniformity of the results here from
compound type to compound type,  does suggest that the protocol is
probably appropriate for the  second generation of priority pollu-
tants.  We did use the data generated here to trouble shoot our
analyses from day to day.  Overall the St. Louis quality control
data is looking a good deal better than what you've seen here.  I
don't think it would have been possible to obtain,  or even to
demonstrate, of course, the quality of data achieved if the qual-
ity control program had not been run concurrent with the sample
analyses.  Jim Stauffer, our mass spectroscopist, is also with
me today.  I'll be happy to answer any of your questions.

     Mr. Flory:  I have a question about the organics, primarily.
Did the analysts know they were running QC samples, or were they
mixed in with other samples?

     Ms. Thrun: No, all the quality control data that you've seen
was interspersed throughout the sample analyses.  We've continued
to do that, and over the days of sampling, too.

     Mr. Flory:  My second question is already answered because
that was:  Was there a long period of time in collecting the sets
of data?  That wasn't all on one day, for example?

     Ms. Thrun:  The total amount of time for the St. Louis and
Cincinnati analyses was seven months.

     Mr. Flory:  The last question is:  How were the quantitations
for the organics made?  Were they made by an automatic data system,
or by an operator doing areas of peaks manually?

     Mr. Stauffer:  The first collection from Cincinnati was done
on the Finnigan 6100 data system, so it was an operator identify-
ing peaks.  Most of the second collection was done on an in-lab
computer.  We now have a set-up that produces the mass chromato-
grams for you, and then an operator selects the peak.  It's not a
totally automated system, but an operator interactive one.  When
we used this technique for St. Louis, our results got substan-
tially better than for Cincinnati in that the standard deviations
are much smaller than they are for Cincinnati.

     Mr. Flory:  Is that with less operator involvement?

     Mr. Stauffer:  No, it was all done on a consistent basis, on
the operator interactive data system, if you will.

     Mr. Flory:  These recovery values approach some of the things
I've done, but I did them all in one day.


                                106

-------
     Ms. Thrun:  Yes.  Well, what else is nice  from what I heard
over the past two days is that I think this data is fairly consis-
tent with what we're hearing from the various other QC programs.
It's plus or minus 20 to 30 percent.

     Mr. Kloepfer:  I have one question for you, Kathleen.  What
was the elapsed time between the addition of the standard and the
extraction?

     Ms. Thrun:  We extracted our samples within 48 hours of sam-
ple receipt.  On the average, the lapse  of time was about two to
four hours.

     Mr. Henderson:  I may have misunderstood one of your slides.
It looked like you indicated that the d10-anthracene was added
before the extraction process?

     Ms. Thrun:  That's right.

     Mr. Henderson:  How did you determine recovery percentages
there?  Did you use d10-anthracene as an internal standard?

     Ms. Thrun:  Yes, we did.  These are corrected to some extent,
if that's your question.

     Mr. Henderson:  In the data you showed, is the recovery rela-
tive to d10-anthracene and not the absolute recovery?

     Ms. Thrun:  Correct.  However, we did some preliminary experi-
ments with just d10-anthracene, and the recovery of d10-anthracene
from water was about 100 percent.

     Mr. Rushneck:  What about the acid fraction, then, with the
d10-anthracene?  Did you spike that prior to injection?

     Ms. Thrun:  No, we spiked that into the aqueous aliquot be-
fore extraction.  We did have problems with the exchange of d10,
but corrected by using the isotope peaks.

     Mr. Rushneck:  If you put it into the water, and you do the
base neutral extraction first, then it comes out in a base neutral
fraction and you don't have it in the acid fraction.

     Ms. Thrun:  That was another modification to the EPA screening
protocol.  We did decide to extract for  our acids and base neutrals
separately.

     Mr. Rushneck:  Another question I had was:  Did you use the
separatory funnel method for extractions, or did you employ a con-
tinuous extraction method?

     Ms. Thrun:  We used the separatory  funnel  and we had to
centrifuge our emulsions.
                                107

-------
     Mr. Rushneck:  When using the separatory funnel method, did
you dilute the sludge at all?  What was the solids content?

     Ms. Thrun:  No.  I should make it clear.  This is not from
sludge; this is from sewage, raw wastewater.  We went down the
manhole and collected raw wastewater.

     Mr. Kloepfer:  Could you clarify what you did with the acids
again?  As I understand it, the acids were done by doing a direct
acidification and extraction and you didn't go through the base
neutral extraction first, is that correct?

     Ms. Thrun:  That's right.

     Mr. Ollison:  Could you elaborate a little bit on your ex-
change problems with d10-anthracene?

     Mr. Stauffer:  We did have some rather severe problems on
exchange with d10-anthracene on the first go-around on the acid
fraction.  In fact, we found for some of the early samples we had
to integrate more than one ion; we couldn't just use the 188 ion
for d10-anthracene, but we had to integrate over several mass un-
its to accommodate for the change that had taken place by exchange.
Then, it improved with time as we ran samples.  Subsequently on
our columns we pulled the glass wool that Supelco supplied and put
in plain silanized glass wool at the head of the column, and we no
longer had any exchange problem.  It just didn't exist after that;
it seems to have occurred on the phosphoric acid with which they
treat the glass wool.

     Ms. Thrun:  I had some handouts detailing all of the data
for all 111 priority pollutants.  If anybody is interested,
they're on the front table and you're certainly welcome to them.

     Mr. Telliard:  Thank you.  Next on the program is Fred
Bishop.  Fred's going to talk about the program his people worked
on dealing with methods development for sludge analysis in POTW's.
                                108

-------
                     Sludge Analysis in POTWs
                             F. Bishop
                               USEPA
     Mr. Bishop:  The previous June, in '78, we were given, per-
haps, an unenviable assignment of attempting to develop procedures
for the analysis of the priority organics in municipal sludges.
We were also given a very tight timeframe so that the amount of
work which we were able to do in that time period restricted,
severely, our capability of doing appropriate QA/QC with the work.
As perspective on the work,  if one takes a typical raw domestic
sludge and extracts it with methylene chloride, between 25 and 30
percent of the dry weights of the solids can be extracted.  For
a 5 percent thickened sludge at a municipal plant, therefore,  on
a per liter basis, one would extract into the solvent 18 to 20
grams of solute.  On that same per  liter basis, one is looking
for parts per billion of priority organics.  When analyzing a di-
gested sludge, the percentage by weight of extractable solids is
reduced from 25 to 30 to 12 to 15 percent, or approximately half.
The point is, the analyst has a true organic haystack in which
he is looking for a few needles of toxic substances.

     With that in mind, we put together a research team to attack
the problem.  We had two major assignments.  The first was to
take a quick look at the priority protocol for its application
to raw wastewater.  We did that and indicated that it was probably
applicable for the OWPS planned POTW study which was subsequently
done by A. D. Little.  In that quick assessment, we did some in-
house work and A. D. Little did some confirmation of the pesticides
and PCB procedures in the priority protocol.  We also assembled
a group of people to work on the sludge assignment.  The principal
research firms were Southwest Research for the pesticides and
PCB's, and Battelle Institute of Columbus for the base neutrals
and acid extractables.  We performed a quick look at the analysis
for purgeables in sludges on an in-house basis and then went to
MRI for some additional confirmation work after MRI became involved
in the POTW study.

     As shown in the first slide, the procedure that we suggested
for the purgeables is basically the Bellar procedure.  The proce-
dure was based on a preliminary examination in the laboratory using
GC, since we had limited GC mass spec capability and it was tied up
in other areas.  We tested the Bellar procedure on three types of
sludges:  waste activated sludge, the raw sludge, and digested
sludge.  Our quick assessment with  the GC showed us that the ana-
lyst could strip the materials from the sludge and make a measure-
ment on the waste activated sludge  using the GC procedure, after
spiking into the waste activated sludge.  Attempts to apply the
procedure with GC as the detector, however, to the raw sludges —
that is, the digested and the raw sludges from the treatment plant
-- provided chromatograms too complex for analysis by GC.  We pre-
pared a tentative procedure in which we determined the total solids
in the sludge, and then transferred a liquid volume in which 50
milligrams of dry sludge would be in the sample.  For 5 percent
                                109

-------
sludge -- the volume is typically 1 milliliter.  And, then, we
diluted that volume to 10 milliliters in a modified stripping
apparatus.  We purged the diluted sample with nitrogen or helium
at 22 degrees C.  We adsorbed the purged organics on the classi-
cal trap, the Bellar trap, desorbed and analyzed for organics by
the recommended GC/MS procedure.
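
     The aliquot arithmetic in this tentative procedure can be
checked as follows (a minimal sketch in Python using the figures
quoted above; the helper name is illustrative):

    # Volume of sludge that contains 50 mg of dry solids, given the
    # measured total solids content.  A sludge of S percent solids
    # carries roughly S * 10 mg of dry solids per milliliter.
    def aliquot_ml(dry_mg, solids_percent):
        return dry_mg / (solids_percent * 10.0)

    print(aliquot_ml(50, 5.0))     # 1.0 ml for a 5 percent sludge

    # Diluting 50 mg of dry solids to 10 ml gives the stripping
    # chamber concentration cited below:
    print(50 / (10.0 / 1000.0))    # 5000 mg/L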

     The next slide, please.  The modification that we made  to the
stripping apparatus is shown here.  In our past  experience,  in
some wastewater samples, we encountered some foaming problems.
So, we used a second capillary trap as shown in  the slide in the
apparatus.  This trap solved municipal wastewater foaming prob-
lems, and thus we applied the second capillary trap to the sludge
procedure.  We changed the top of the stripping tube by putting in
a ground glass joint in order to get our 1 milliliter pipette, or
thief, down into the chamber before injecting the sludge sample.
Then we diluted to 10 ml in order to produce a solution which we
could strip relatively easily.  The 5,000 milligram per liter
solids concentration in the stripping chamber, then, gave us satis-
factory sensitivity.

     May I have the next slide, please.  This slide shows typical
results that were obtained with the GC on spiked activated sludge.
Waste activated sludge is a material which has been through a bio-
oxidation procedure and is relatively clean.  By spiking the var-
ious compounds at the levels shown, and, then, applying the Bellar
procedure with the GC, we got the recoveries shown.  These are
results of duplicates.  Clarence Haile will be able to discuss the
GC/MS aspects of this approach in his presentation.

     May I have the next slide, please.  For the pesticides and
PCB studies, we went to Southwest Research and asked them to take
a look at the conventional pesticide technology as applied to
sludges.  To do this, of course, they had to develop an extrac-
tion technique.  We also began a study of extraction techniques
for the acids, neutrals, and bases with Battelle, Columbus.
Battelle experimentally developed a reasonably acceptable extrac-
tion technique, namely, a centrifuge technique.   That same tech-
nique, basically, has been adopted by Southwest Research.  I'll
speak a little bit later about that technique when I get to the
Battelle approach.  In any event, Southwest Research, using a 20
gram wet sample with blanks and spikes, extracted with the class-
ical methylene chloride hexane solvent, cleaned up with florisil,
removed the sulfur with mercury and then quantitated with GC/ECD.
They took a portion of the material and cleaned it up with GPC
biobeads SX2, and collected six fractions which they were then
able to confirm with GC mass spectrometry.

     As shown in the next slide, that work, as you can see,  pro-
duced a relatively lengthy procedure.  Southwest Research tried
another option for us.  That option was to substitute the GPC in
place of the florisil and then proceed with a roughly standard
pesticides procedure.  After sulfur removal with mercury, they
then split the sample, quantitating with GC/ECD, and confirming by
GC/MS.
                                110

-------
     May I have the next slide, please.  This slide shows the re-
sults on the group one portion of the pesticides and PCB's using
the simplified option.  Southwest Research spiked the sludge at
three-tenths of a microgram per 20 gram wet sample and at 1 micro-
gram per 20 grams.  The results showed good replication and good
recoveries.  May I have the next slide, please.  Here's a second
slide showing additional group one pesticides with good recoveries
from the procedure.

     The next slide, please.  This slide contains the results on
group two.  Again there were satisfactory recoveries.  The data is
for digested sludges.  Southwest Research has tested successfully
other sludge types.  They have not, however, tested a large number
of sludges.  The next slide, please.  This slide, again, confirms
the reasonableness of the approach for the group two materials.

     The next slide, please.  This slide shows the results on group
three pesticides.  The recoveries from the sludge samples, again,
were reasonably satisfactory.  You'll note here that in the chlor-
dane test, Southwest Research was looking at the five representa-
tive peaks.  They spiked, of course, at higher levels of 3 micro-
grams per 20 grams, and 10 micrograms per 20 grams of wet sludge.

     The next slide, please.  This slide shows the results for
group four, the toxaphene mixture, and, again, reasonably satis-
factory quantitation occurred.

     The next slide, please.  Here is group five, again showing
satisfactory results for the AR-1260 mixture.  Southwest Research
in their timeframe, were not able to complete the evaluation of
all of the PCB's, but they feel that the procedure would be appli-
cable to at least eight of the others that are on the priority
list.  Now, I should restate that the data that I've just shown
from Southwest Research was for the simplified procedure with GPC
substituted for florisil and with a single cleaned up fraction for
GC/MS analysis.  Similar results, perhaps not quite as good,  were
obtained with the full procedure leaving the florisil in its normal
location.  The key point here is that for the single peak pesti-
cides, Southwest indicated that they could confirm compounds by GC
mass spectrometry with the GPC in the system; they could confirm
single component pesticides at about three-tenths of a milligram
of component in a kilogram of the sludge solids.

     The next slide shows the problem areas.  In trying to do the
base/neutrals and acids, Battelle had difficulties with the enor-
mous amount of interferences and the lack of the selectivity of the
detector for the quantitation.  What gradually evolved in their
four or five months of work was the procedure that you see here.
Basically, Battelle selected three portions of wet sludge, and
handled each one separately.  Because of the problems with the
benzidine recoveries at low concentrations, they elected to go the
HPLC route for analysis of bases (benzidines).  They took a 10
gram amount, buffered it with phosphate solution, extracted it
with chloroform,  re-extracted it with 2N sulfuric acid and then
                                111

-------
analyzed by HPLC, using the electro-chemical detector.   This
HPLC technique is basically the protocol that Battelle  orig-
inally developed for Jim Longbottom in his second generation
verification procedures.

     Battelle also took a separate aliquot of 100 grams and then
went through the following sequence for analysis of the neutrals.
Basically they acidified and extracted with methylene chloride,
then washed with sodium hydroxide; they fractionated the sample,
using biobeads, into two fractions, a GPC 1, and a GPC  2 fraction;
they cleaned up the GPC 1 fraction using activated silica gel;
that fraction, basically, was the phthalates, or the larger of
the priority pollutants; they then combined the GPC 1 and 2 ex-
tract and analyzed by GC/MS using a capillary GC procedure.

     The procedure for the phenols consisted of taking 100 grams of
wet sludge, acidifying it with sulfuric acid, extracting with
methylene chloride, cleaning up with the biobeads to remove some
of the interferences, re-extracting with 2N sodium hydroxide, acid-
ifying the aqueous phase with 6N acid, and re-extracting again with
methylene chloride.  The re-extractions were performed  in order to
reduce the level of the interferences to where they could begin to
analyze the extracts by the GC mass spectrometer.

     May I have the next slide, please.  This is the result of the
work on organic bases in the distilled water.  You can  see that
the bases were spiked at a level of .6 micrograms per 100 ml, which
is about 6 parts per billion.  With their procedure Battelle got
reasonable recovery -- 68 and 83 percent for the two bases.  For
the sludges -- and this happens to be a digested sludge, but they
also tested the raw sludges -- they again spiked at the same level
and got reasonable recoveries -- 117 and 67 for the benzidine and
the dichlorobenzidine respectively.

     We can look at the neutrals in the next slide.  I  have taken
a selected group of the neutrals to cover the various types of
organic materials.  The data are representative of the kind of
result that Battelle saw.  As you can see, recoveries from pure
water ranged from about 46 to about 144 percent.  The data pretty
much brackets the range of recoveries found following the neutral
procedure.  Battelle missed one of the neutrals using their pro-
cedure; otherwise they found them all and they fell within that
kind of concentration range.

     The next slide shows what happens when Battelle applied the
procedure to a digested sludge.  One can see some of the problems
that are occurring.  In the first place, the precision in the data
on the few replicates that were run is very poor.  The procedure
also missed nine neutrals from the priority group; Battelle reported
a lot of "NM's" for averages, meaning not meaningful data, simply
because the precision and the accuracy of the procedure was not
acceptable.  I went through their data and changed a couple of the
NM's to actual average recoveries to give you an idea of the kinds
of recoveries they were seeing.  If you take a look at the repli-
cation, as in the first organic shown -- .7, then 5.2, then 14,
for an average of 6.7 mg/l of organic -- you can see that you would

                                112

-------
not even consider the procedure as a quantitative tool.   Even as
a qualitative tool, the procedure missed nine of the neutrals in
the spiked sludge because of the amounts of interferences that
were still present from the sludge matrix.

     May we go to the next slide, please.  Here's where we came up
against a more serious difficulty.  This slide shows the work done
with the acids, i.e., the phenols.  The data on recoveries is for
about a 50 part per billion spike of each phenol in distilled
water using their multiple extraction and reextraction procedure.
The extraction and reextraction significantly reduced the recoveries
from distilled water, especially of the nitrophenols, so that there
is clearly some more work to do in the area of the phenol analysis.

     The next slide will show the results in the digested sludge.
Here, again, the procedure missed all of the nitrophenols and exhi-
bited spotty results on the remainder of the phenols.

     The Battelle approach was one of the attempts at developing
a procedure for the sludge analysis.  And as you can see, it's a
rather lengthy procedure and quite expensive.  In our in-house
efforts, and also in an effort later joined by Clarence Haile at
Midwest Research, we attempted to do what we call the unified
procedure which featured a single extraction with clean up before
GC/MS analysis of the acids, bases, and neutrals.  I will briefly
go through that procedure in the next slide.

     We took a larger sample, in this case, 320 grams of the sample.
We adjusted to pH2 with HC1 and extracted with methylene chloride.
Then, we used the centrifuge extraction technique that Battelle had
earlier developed which worked quite well.  Basically, in the cen-
trifuge technique, the sludge sample was placed into a centrifuge
tube and mixed with an equal volume of methylene chloride solvent.
In this case, we placed four aliquots (80 ml) in four centrifuge
tubes, with 80 milliliters of solvent per centrifuge tube.   In the
Battelle work it was 100 milliliters of solvent, and 100 grams of
the sample.  We applied a tissuemizer to the centrifuge tube for
one minute which effectively mixed and extracted the material.  We
then centrifuged the mixture, usually for about half an hour.  We
were able to separate the methylene chloride layer from the  sludge
liquid layer with cleanup through glass wool and a drying column
to take out any entrained solids.  The glass wool was returned to
the centrifuge tube, and the extraction was repeated for a total of
three extractions.  In our early screening work, Battelle ex-
tracted five times and found that 95 percent of the material was
extracted from the sludge within the first three extractions.  It
was important that the tissuemizing did not proceed for too  long
a period of time to avoid heating; about a one minute time limit
on the tissuemizing seemed to work very well.  We were then  able
to separate the solids very satisfactorily.  We had no problems
for separation of the solvent and the sludge material.  After the
acid extraction, we performed a basic extraction with the methylene
chloride in a similar manner; and combined the acid and base ex-
tracts.  We performed gel permeation for the removal of the high
                                113

-------
molecular weight compounds; then we performed a solvent exchange
into a mixture of hexane and silica gel to prevent the precipita-
tion of the polar materials during the change to the less polar
solvent.  We added that silica gel with the hexane solvent to a
column of silica gel for silica gel separation to remove aliphatic
hydrocarbon interferences.  After a hexane elution to remove the
aliphatic materials, we then eluted the priority organics with 5
percent methanol in methylene chloride.  Finally,  after KD concen-
tration we analyzed with GC/MS for organics.
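
     The earlier observation that five extractions recovered little
more than the first three is consistent with the usual repeated-
extraction relation; the sketch below is illustrative, with an
assumed per-pass efficiency rather than measured values:

    # If each pass removes the same fraction f of what remains,
    # cumulative recovery after n passes is 1 - (1 - f)**n.
    def cumulative_recovery(f, n):
        return 1.0 - (1.0 - f) ** n

    # An assumed per-pass efficiency of 63 percent reaches about
    # 95 percent recovery by the third pass, after which further
    # passes add very little.
    for n in range(1, 6):
        print(n, round(cumulative_recovery(0.63, n), 3))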

     I'd like to show the next slide,  please.  This slide gives
you an idea of what's coming out of the GPC.  Here's where there
is room for a considerable amount of development work.  This slide
shows the relation of the refractive index versus GPC elution vol-
ume.  It's an ideal representation.  I have one later in the slide
sequence that shows an actual one.  We found that, in our initial
work with the GPC procedure at Region VII in Kansas City, that the
loading on the GPC was very critical.   In their particular system
Region VII was employing about 1 gram of solute in any given sample
through the GPC.  This, therefore, required a couple of aliquots
for the cleanup of each 320 gram sludge sample. We therefore ob-
tained in our laboratory a very large preparative scale, Styragel
GPC, from Waters Association Inc.  We took a look at how some of
the interferences would elute from the GPC.  The first interferences
to elute were triglycerides.  They are heavy molecular weight com-
pounds c  The slide indicates the approximate position of elution
for the tryglicerides in the GPC elution volume.  The fatty acids
elute over the full range of the big hump in the refractive index-
elution volume curve.  The aliphatic hydrocarbons tend to elute
across the full range of the elution volume.

     We don't have really complete information on the elution of
the individual organics -- thus the slide indicates by classes
the elution position of compounds.  The phenols tended to elute
as shown in the slide over the same GPC elution volume as the
phthalates and the small priority organics.  There is, therefore,
a range of overlap for the phthalates, the phenols, and the small
priority organics.  There was also, as you can see, an area in the
index of refraction-elution volume curve where you have a very
large amount of material.  Our analysis indicates that that major
hump in the curve is mostly fatty acids.

     We've been using these kinds of techniques to screen organic
samples.  For instance, we are evaluating leachates and we're find-
ing that the fatty acids in these samples are very, very prevalent.
On leachate samples, for instance, we found 98 or 99 percent of
the interferences were fatty acids.  Here, in the sludge area, the
fatty acids are lower; 75 to 80 percent of the interferences are
fatty acids.  Clearly, then, the fatty acids are one of the prin-
cipal problems of the interferences in the analysis of specific
organics.

     In returning to our discussion of the unified procedure, the
procedure basically failed then, because we were using too high a
loading on the GPC.  We are in the process of reducing the loading


                               114

-------
to the GPC, and re-evaluating whether we can come up with a shortened
procedure to reduce the cost of the analytical work.

     May I go to the next slide, please.  This shows a typical re-
fractive index curve that we're actually getting through the GPC.
We get something similar with UV.  The next slide describes the
extractions we were doing.  Aliquots of the extracts were evapo-
rated to dryness and weighed to determine the amounts of extracted
material.  An acid neutral extraction, as you can see from the 320
gram sample, produced 2.6 grams of extractable material.  The slide
also shows the reduction in the mass of the interferences after
GPC.  We performed a base extraction in the study and you can see
the small amounts of organics in milligrams in the basic extract.
Using this approach as a screening tool, the analyst can get an
idea of how to assemble cleanup techniques, how many techniques
will be needed, and the amounts of materials that are likely to be
removed by each technique.

     The final slide, please.  The final slide shows the Battelle
procedure.  The Battelle procedure is similar to the unified pro-
cedure but with a separate sludge aliquot employed for each frac-
tion.  In the Battelle approach, the neutral fraction could be
obtained by using a simple base extraction.  After the base extrac-
tion, an acid extraction on the sludge residual could also be used
to recover the phenols.  Cleanups for each extract, as in the
Battelle procedure, would then be applied.  The key question that
still remains is how many of the interferences are still going to
be present as you go through the various cleanup procedures.
Thank you.

     Mr. Stanko:  On your GPC curves there, you only showed reten-
tion volume.  Do you have any idea of what the maximum molecular
weight of the components you were observing was?  In the 60 ang-
strom Styragel the exclusion is probably, what, 2,000, 3,000
mole weight?

     Mr. Bishop:  Somewhere in that range.  I don't have the exact
exclusion on that particular column yet.  We are just beginning our work
on evaluating the pure compounds in the GPC.  The loading factor
is going to be very critical in GPC.  I suspect that with these
high solute loadings that we've been experiencing,  we're getting
a lot of spread or overlap,  and the impurities are spilling over
into, perhaps, some of the lower molecular weight elution volumes.
I think that GPC is an appropriate approach and more research is
going to produce a method for cleaning up these high solute ex-
tracts.  We're really at the beginning and haven't done enough work.

     Mr. Stanko:  I'd also like to caution you that the refractive
index detector is not a linear detector.  As the refractive index
of the compounds you're detecting, or trying to determine, ap-
proaches the refractive index of your solvent,  you can absolutely
miss gross quantities of these materials.

     Mr. Bishop:  We use the refractive index and the UV as a
guide;  they were roughly similar in overall shape of the curve.


                                115

-------
In any event, with those as tools and with pure compound work,  we
can begin to assemble a better understanding of how the interfer-
ences are going to interact.

     Mr. Stanko:   I'm just using it as a caution.   We've already
been stung by that.

     Mr. Flory:  If this much cleanup is required with the mass
spec detector, then I guess there's not much hope for other detec-
tors on GC.  If you inject this on a GC it gets completely wiped
out.  We get a straight humpogram, or a pegged output.

     Voice From the Audience:  Humpgram?

     Mr. Flory:  Whatever you want to call it, it's just bad news.

     Mr. Shattuck:  Assuming that you can clear up your base neu-
tral and phenolic fraction problems, from a practical point of view,
how can we apply this as far as turnaround per sample and on a
cost basis?

     Mr. Bishop:  This was the result of work which was done from
June to December, and then our contracts ran out.   Since that time,
Clarence Haile has been working in the area and will be discussing
it in his presentation.  I think that he's done a little more work
with the GPC and with some pure compound work.  Indeed, Battelle
has done some extensive work with the pure compound GPC work.  I
haven't seen those results yet; their final report has not been
delivered to me.   I think at this time it's a little bit early
for me to say how the final protocols for the acid, base, and neu-
tral organics will come out, except that I think that an approach
using GPC and perhaps the silica gel will be required.  The devel-
opment work to date is a good start in the analysis of the sludges;
we are getting effluents, or I should say, extracts from the GPC
which, at least, are complex but partially analyzable spectra.
You can begin to see the priority organics there, although you
can't quantitate well.  There are undoubtedly false positives and
false negatives.   The problem in this area is going to take some
more development work in order to quantitate at low ppb concentra-
tions.

     Mr. Flory:  From the process that you have laid out there,
it looked as if it would be a fairly drawn-out procedure?

     Mr. Bishop:  Yes, well, the procedure really isn't all that
much greater than the base neutral and then an acid extraction of
the original EMSL protocol, although it does have a few more steps
in it.  Clarence Haile has been following a little bit closer to
the EMSL protocol approach, but again, with GPC in it, and he'll
comment on that in the next discussion.  We have asked Southwest
Research to continue with a unified concept to see if it can be
fully developed.   They are now under contract to EMSL in Cincin-
nati, and so is Clarence at MRI.  Both will be pursuing this re-
search. We filled in as stop-gap help, because there  weren't any
                                116

-------
procedures for that sludge area, and we needed procedures within
our municipal environmental research laboratory.  Our laboratory,
now, is going back to its normal support function.  While we'll be
looking at these procedures for our own use, most of the develop-
ment work will be directed through EMSL.

     Mr. Colby:  The use of ultrasonication with any materials
which incorporate a large quantity of membranes, such as bacterial
cell membranes, is going to produce a tremendous number of fatty
acids, since that's what the membranes are composed of.  The use
of sonication is the way biologists, and so on, break up those
membranes so that they can analyze those membranes.  It's a very
effective way to prepare cell membranes for analysis.  Did you
compare anything in terms of using other methods to get rid of
these fatty acids when you didn't incorporate ultrasonication?

     Mr. Bishop:  I don't believe the tissuemizer is an ultrasonic
device, but it does shatter the tissues very much, and you do ex-
tract fatty acids.

     Mr. Colby:  Well, that's what it was made for, and that's
why they sell that.

     Mr. Bishop:  Right, we agree.  The problem, of course, is that
when we tried to extract with conventional extraction techniques,
that is, let's say, continuous extraction, we experienced quite a
bit of difficulty there.  Soxhlet extraction, or any technique
which is strong enough to really extract the priority organics,
then extracts out the fatty acids.  You can use a differential
extraction which would only selectively extract a few compounds,
but I'm not sure this would give the analyst the answer be-
cause some of the priority pollutants may well be in the cellular
material itself, inside the interstices; the analyst must break
up the cellular material and go on inside and extract everything.
That type of extraction, therefore, produces one heck of a forest
of interferences.

     Mr. Colby:  I think it looked, basically, very good.  I like
the looks of it; you did a lot of good work.

     Mr. Bishop:  Well, it's a start.  This was a five-month start,
and it's on-going.

     Mr. Telliard:  We have two speakers left.  Next is Clarence
Haile from Midwest.  As Fred Bishop pointed out, Midwest is involved
in the POTW study.  Clarence is going to tell you everything you
ever wanted to know, and more, about sludge.
                                117

-------
                    SLUDGE ANALYSIS TECHNIQUES
                             C. Haile
                    Mid-West Research Institute
     Mr. Haile:  I want to start by giving you a brief look at
what we've been doing with sludges in general, so that you might
have some perspective as to what this work entails.  We've been
involved in a couple of programs on analysis; they're not really
that divergent.  These include the POTW pilot survey, which in-
cludes the analysis of sludges from two POTW plants,
one in Indianapolis, and one in Cincinnati.  Approximately 39
sludges and scums -- the only word you can use is scum -- have
been analyzed just very recently.  We are getting ready to par-
ticipate in a 40-plant POTW survey, analyzing approximately 320
sludges.  We've also been looking at tannery wastes.  Some of you
may recall that we were involved in both the screening and veri-
fication of the tanning industry.  We've been looking at the par-
titioning of selected compounds in tannery waste between the sol-
ids portion of the waste -- and there is a large solids portion in
the raw waters at least -- and the liquids portion.  We've been
analyzing some solids from tannery waste and we also did a cer-
tain amount of work on acid mine drainage treatment sludges.
These were largely irrigating sludges from the Crown test site.

     As I mentioned, we worked a little bit with acid mine drain-
age, just a few samples in relation to a priority pollutant treat-
ability study being conducted by EPA at the Crown test site.  We
also have a major contract to develop methods for priority pollu-
tants analyses in both sludges and sediments, to develop referee
and screening type GC/MS protocols, and also to adapt that meth-
odology for use with GC detectors.  Maybe it's somewhat analogous
to verification type protocols.

     What I'm going to be talking about today really has more
reference to the development of a cleanup and extraction scheme
that we can use with the determination techniques that are speci-
fied in the screening protocol.  As we're now using it, we've
divided the compounds into classes we're used to dealing with.
We're analyzing the volatiles by a modified purge and trap tech-
nique very similar to what Fred Bishop described to you.  I'll
show you just a couple of other minor variations from the screen-
ing protocol.  We have worked with the acids, bases, neutrals and
those other neutrals, the pesticides, and with a unified approach
as Fred has mentioned, trying to take a fairly simplistic view.
One thing that I try to do when I'm pressed for time is to do
things as simply as possible.  That way I don't get confused, and
I'll talk a bit more about that a little later.
                                118

-------
     The two minor modifications that we're doing to the purge and
trap technique concern the operation of purge and trap for sludges.
As Fred mentioned, we're diluting the samples to less than a half
a percent solids.  Typically that would mean if we had a 5 percent
primary sludge, or a thickened sludge, or whatever, we would dilute
that sample 1 to 10 with our super-duper VOA free water, analyze
it, and do the purging from that point.  This helps us keep down
a lot of the foam problems.  We got a lot fewer foam problems than
we had originally anticipated.  We were able to solve all of our
foaming problems on the two plant POTW pilot survey through the
use of mechanical devices, such as the glass wool plug, or a small
expansion chamber immediately above the sample, to provide a little
bit more head space.  Since we're operating with our own home-grown
VOA system, we're using a chromatographic column, as opposed to a
Tekmar type column.  That facilitates getting in and out of the
purging tube for removing the sample and cleaning out the residue
of the sample.  It facilitates cleaning, and it facilitates intro-
ducing the sample into the purge tube.  For a few of you who may
have seen the protocol published by EMSL for sludges, Tom Bellar
is looking at a new trap composition which seems rather interest-
ing.  It is rather logical.  It's a three component trap with
Tenax and silica gel, as we're used to, and a charcoal addition.
particular charcoal is a variety that's identical to that used
in the nyage gas sampling tubes.  The trap is set up so that the
volatiles are purged from the water into the trap.  The purging
trap sequence is through the Tenax which ties up most of the heavy
organics, into the silica gel which ties up some of the fairly
light organics, and into the charcoal to tie up the lightest of
the organics, the freons and so on.  Then, one flashes back in the
other direction.  Obviously, you wouldn't want to purge through
the charcoal before the Tenax because you'd probably lose most of
your compounds irretrievably on the charcoal.

     I want to show you a little bit of recovery data for volatiles
analyzed in this manner.  The only deviation between the operation
of the system when we were obtaining this data, and the methods I
just showed you, is in the use of a straight Tenax trap.  We prob-
ably have lost a number of the more volatile compounds.  These were
spiked into sludges at a concentration of 250 micrograms per liter.
Basically, what you see from this sort of table is that the com-
pounds that you can see fairly easily in water, you can see fairly
easily in sludge, and conversely, compounds with which you have
trouble in water, you also have some trouble with in sludge.  There
were certain compounds that were present in the sludges at such
high levels that we were not able to obtain any spiking information.
What I really want to talk most about today is the unified approach
that we're using for extractable compounds.  This is our basic ex-
traction analysis scheme.  I'm sure you'll recognize it as being
exceedingly familiar, if you've read the screening protocol at all.
It uses the same basic philosophy.  It might help to recognize at
this point that we were involved in gearing up for the original
two-plant pilot study at the same time the EPA task force for the
development of sludge methods was meeting.
                                119

-------
We were trying to get some analyses rolling.  We did a lot of this
development work in parallel,  but also separately,  from the people
in Cincinnati.  What we're starting with is 320 grams of sludge.
We go through the typical base neutrals followed by acidic extrac-
tion.  The extraction procedure is as Fred Bishop delineated to
you earlier, using a Tekmar tissuemizer.  It's just an inside, out-
side type of high speed blender to homogenize the sample in methy-
lene chloride.  We then centrifuged that whole configuration for
about a half hour at 3,000 rpm.  You come up with a centrifuge tube
with water on top, a sludge cake in the middle, and very dark
methylene chloride on the bottom, so, we can extract that out of
the centrifuge tube.  After doing the basic extraction, of course,
we have a base neutral extract.  Acidifying the sludge and doing
the same thing, we come up with an acidic extract.   From here, we
have to clean up the sample, obviously.  We're using gel permeation
on both extracts independently.  Out of the BN extract, we're pull-
ing out the bases, the neutrals, and the pesticides.  Now, we
haven't done this, but if we wanted to look at the pesticides by
electron capture GC, I think it would probably be a fairly trivial
thing to take that extract.  If one had to remove sulfur, or what-
ever, to operate on an EC, they could get more sensitivity by look-
ing at those pesticides by the classic pesticide procedures, as
published in the protocol.  The base neutrals and pesticides were
run on the GC/MS with the SP-2250 DB column.  The acidics were run
on the GC/MS with the SP-1240 DA column.

     The real key to this whole procedure is the cleanup of the
extracts.  We know that we can handle a fairly dirty extract by
GC/MS, because we're looking at selected ions.  We can play funny
games of background subtraction, et cetera, and could come up with
really some pretty good data.  While sludge extracts are really a
fairly thick organic soup, much of that organic soup consists of
large molecular weight compounds -- fatty acids, and other similar
type compounds.  Really, a classy way to get rid of those compounds is
to use a size fractionation, so, it's just gel permeation.  The
method that we chose for GPC is essentially the same as that used
by many EPA laboratories for the analysis of pesticides in fish.
It was modified slightly by Bob Kloepfer at the EPA Lab  in region
seven to operate with methylene chloride.  We're using an automated
system which  facilitates the analysis of about 23 injections  in an
automated fashion, which is somewhat helpful.  We're using a column
of Bio-Beads S-X3, operating with methylene chloride.  We went to
methylene chloride for a couple of reasons: because our extract is
in methylene  chloride, and, also, we wanted to be able to get the
polar compounds off a little bit easier.  The automated systems,
by the way, are set up so that you have a dump fraction, which is
the garbage coming off first, and then you have a collect fraction.
Hopefully, you get all of your compounds in there, and then there's
wash time to flush the column for the next run.
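
     The dump/collect/wash switching just described can be sketched
as follows.  This is an editorial illustration, not code from the
speakers; the cut times are invented, since the real windows come
from calibrating the GPC column with standards.

    # Route GPC eluate by elution time: waste first, then collect,
    # then a wash period before the next automated injection.
    def gpc_fraction(minutes: float) -> str:
        if minutes < 12.0:       # hypothetical cut: heavy garbage elutes first
            return "dump"
        elif minutes < 25.0:     # hypothetical cut: priority organics collected
            return "collect"
        else:
            return "wash"        # flush the column for the next run

    for t in (5.0, 15.0, 30.0):
        print(t, gpc_fraction(t))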
                                 120

-------
     Sludges are obviously a very difficult problem, but it appears
that with a certain amount of care and cleanup of the extract, you can
do reasonably well and get moderately good recoveries.  A gentleman
asked earlier concerning the cost of this sort of simplified approach
relative to the Battelle-based approach that Fred Bishop showed us
earlier.  We made a calculation a few weeks ago concerning estimates
of man hours, and analysis time, et cetera, for the two options,
and it was fairly clear to us that the Battelle-based approach, as
Fred described, would require about twice as much time and money
per sample to accomplish.  There may be a lot of merit to some of
the steps involved, but we feel that we're getting very adequate
data for our purposes with this sort of approach.  It's certainly
a lot simpler and it's less costly.
                                121

-------
     Mr. Telliard:  We saved the best for last.  Early on in the
program, in petroleum, we found that on the indirect sources, a
number of questions were raised by the assistant administrator.
One of them was, "What about all of these refineries that go into
POTW's?"  I said, "Well, boss, they're just loaded with poisons."
So, to go out and prove that Joe and the boys were really screwing
up all of these POTW's, we went out and sampled a number of refin-
eries in L. A.  After we were done, we had these bottles of sludge,
both from the refineries and from the POTW's.  I called Dale Rush-
neck at PJB and they ran the sludge analysis for us.  They got some
interesting numbers.  Dale's here to tell you a little bit about
it, realizing that these were primarily aimed at the petroleum in-
dustry, and, I believe, it was plus or minus 2 percent variance,
right?
                                122

-------
                   SLUDGE EXTRACTION AND RECOVERY
                           D. Rushneck
                         PJB Laboratories
     Mr. Rushneck:  You just heard the results of some of these
four or six month studies on sludge.  What I'm going to talk about,
because of the timeframe involved in analyzing the "present" we got
from Bill, is the results of an intensive three to five day multi-
decadollar study on the recoveries of some materials from sludge.
I'm going to limit my talk considerably because of the late hour.

     The first subject that I'll talk about is purge and trap re-
covery.  We used a very simple method.  We thought, going into
this, that if we could get by with simplicity, we'd try it and get
only as complex as we needed to.  So we first measured the solids
content of the sludge, which ranges between 5 and 7 percent.  This
is done by putting the sludge in an oven and letting it sit over-
night, or for 24 hours, baking off all of the water, then measuring
what is left.  The original material is diluted to 1 percent solids,
put in the TOC type VOA bottle that's commonly used now with a screw
cap, immersed in an ultrasonic bath for two to three minutes, taken
out of the ultrasonic in that same bottle, and centrifuged.  The
internal standards are then spiked in, and the supernate is
analyzed.  We did this with the sludge itself, and with the full
range of priority pollutants spiked into the sludge, after we had
diluted it to 1 percent in the TOC bottle.  We ran these standards
in replicates at two levels, so the results for these particular
tests are fairly reliable.  The highly halogenated materials dis-
appear at the 20 microgram per liter concentration.  This is inter-
esting to us because we also analyze drinking water, and one of
the bright fellows in the laboratory said, "Gee, that range of con-
centration for those highly halogenated materials is typical of what
you often find in drinking water," and proposed that we use sludge
as a filter matrix for drinking water.  We said that's very bright,
based on the data, but we don't think we'll get the public to accept
the taste and the flavor of that treatment process.  In any event,
we don't understand exactly what the mechanism is, but I noticed in
the Arthur D. Little data, some of the highly halogenated materials
were the ones that didn't have high recoveries at the low levels in
the wastewater.  Maybe there's a similar thing going on there.

     We also looked at extractions and started out, again, with
the simple extractions, using the protocol by diluting again to
1 percent solids in the separatory funnel with methylene chloride.
We got the mess that everybody else who has worked with this stuff
has found.  We analyzed the base neutral fraction that way with
spiked materials in it, and got very poor recoveries.  So after we
had converted the water to acid, extracted the acidic fraction, and
gotten the methylene chloride off -- which was much less difficult
to work with -- we back-extracted the acid, that is, the residual
water in the acid fraction, a second time, and improved the recoveries on
the base neutral compounds.  We cooked this stuff down, lived with
the amount of garbage present, and ran it directly by GC/MS.  We
                               123

-------
got results similar to what Clarence Haile presented,  in that we
did a number of those compounds successfully.   The recoveries
are generally low -- that is, less than 50 or 60 percent -- but
there are a lot of them that can't be recovered at all, at least at
the 100 microgram per liter level.  We immediately arrived at the
point that the simple methods for extraction don't work very well
and if one is going into the sludge business,  the thing to do is
to consider some of these techniques used by Arthur Little which
seem to be fairly successful.  That's about all I have to say
unless somebody has a question.

     Mr. Hochgesang:  Would you please describe the spikes that
you used?

     Mr. Rushneck:  We spiked in all of the volatiles and, in
fact, all of the base neutrals and all of the acids, in this
particular viewgraph.  What I've pointed out is the highly halo-
genated ones that were not recovered: the tetrachloroethylene,
the tetrachloroethane, and dibromochloromethane.

     Mr. Telliard:  I want to thank you for coming.  I'd also like
to thank John Whitescarver who made arrangements here.  John put
this all together.  If you get a chance and he's still out there,
you might want to thank him.  We will attempt to get the proceed-
ings to you.  When we get them, if there are any corrections, com-
ments, or anything, we'd be glad to hear from you.  We can't end
with just three of these meetings; we'll have to have another.
The next time around, we'd like to have input on any  'needs' that
you think we're not addressing.  Thanks for coming.  We appreciate
your attendance and your interest.  Thank you.
                                124

-------
Notes
    126

-------
                 APPENDIX I

                REDUCTION IN
       SAMPLE FOAMING IN PURGE AND TRAP
GAS CHROMATOGRAPHY/MASS SPECTROMETRY ANALYSES

                    BY

         M. E. ROSE AND B. N. COLBY*
      CHEMISTRY AND CHEMICAL ENGINEERING
        SYSTEMS, SCIENCE AND SOFTWARE
              P. O. BOX 1620
         LA JOLLA, CALIFORNIA  92038
                   127

-------
                            BRIEF

       The qualitative and quantitative aspects of using silicone
surfactants and heat dispersion to reduce foaming in purge and
trap GC/MS analyses are presented.
                             128

-------
                           ABSTRACT

       In the determination of volatile organic compounds in
industrial effluent waters and process streams by purge and
trap GC/MS, foaming of the sample has been a serious problem.
The foam tends to enter the transfer line leading to the sorbent
trap and may actually reach the trap itself.  This has several
negative effects on the current and subsequent determinations,
including deactivation of the trap and introduction of thermal
decomposition products from labile, nonvolatile materials.  Two
methods to reduce foaming are evaluated, one employing a silicone
surfactant and a second involving heat dispersion of the foam.
The qualitative and quantitative aspects of these two foam-
reduction methods are described for volatile priority pollutants
in spiked pure water and soap solutions.
                             129

-------
                         INTRODUCTION

     As a result of the "Consent Decree," the United States
Environmental Protection Agency (EPA) has undertaken a major
sampling and analysis effort to determine a group of materials,
called priority pollutants, in industrial waste waters.  In
order to achieve their goals, a screening protocol [1] was estab-
lished for laboratories to use as a guide in carrying out the
investigations.  With this protocol, the volatile organic pri-
ority pollutants are determined using a Purge and Trap (PAT)
sample preparation procedure and combined gas chromatographic/
mass spectrometric (GC/MS) analysis.  The PAT methodology was
developed by EPA personnel [2,3] and has proven extremely effec-
tive for the analysis of drinking water [4,5,6].  With the PAT
technique, volatile, slightly soluble molecules in the water
sample are entrained by a stream of pure inert gas as it is
purged through the sample.  The purge gas is then passed through
a sorbent trap where the entrained molecules are retained.  Once
this purge and trap process is complete, the trap is back-flushed
and the contents thermally desorbed into a GC/MS for analysis.
For most samples, this process is straightforward and determina-
tions proceed with little difficulty.  Occasionally, however,
a series of samples will be encountered which foam excessively
when purged.  If foam is allowed to traverse the gas transfer
lines and enter the sorbent trap, several negative effects on
the current and subsequent analyses can be expected if the trap
is not replaced and the transfer line thoroughly cleaned.  These
                               130

-------
negative effects include deactivation of the trap and intro-
duction of thermal decomposition products from nonvolatile
labile materials.  For samples which do not form a highly per-
sistent foam, it is possible to reduce foaming by reducing the
purge gas flow rate slightly and/or by inserting a mechanical
barrier to the foam, such as a bundle of capillary tubes, just
past the purge tube.  For other samples, these steps are insuf-
ficient and alternative means are required.  One alternative is
the addition of surfactants such as silicone antifoaming agents
and a second is to apply heat to dissipate the foam.  The pur-
pose of the studies presented here was to qualitatively and
quantitatively evaluate these two approaches to foam reduction
for the determination of volatile priority pollutants.

                         EXPERIMENTAL

Apparatus

       A Tekmar Model LSC-1 purge and trap unit, incorporating
the manufacturer's recommended modifications, was used through-
out the study.  The desorbed sample gases were transferred
through a 0.028-in. ID stainless steel line directly onto the
head of the GC column in a duPont Model 321 GC/MS.  The 8 ft
long x 2 mm ID glass column was packed with Carbopak C (60/80
mesh) coated with 0.2 percent Carbowax 1500.  The column oven
was programmed from 50°C to 185°C at 8°C per minute after a
four-minute hold.  The mass spectrometer was scanned from 45 to
300 amu at four seconds per scan; emission current was 500 μA
and electron
                             131

-------
energy was 73 eV.  Data acquisition and processing was done
automatically by a Finnigan/INCOS data system.

Reagents

       Two series of standard solutions were prepared from
unchlorinated distilled water (Arrowhead, Los Angeles, Cali-
fornia) which had been purged until analysis indicated it to
be free of volatile organic compounds.  Both series of solutions
were identically prepared except that the second series of
solutions contained approximately 0.5 percent (v/v) of dish-
washing detergent (Purex Balsom Trend).  Aliquots of commer-
cially prepared (Supelco; Bellefonte, Pennsylvania) priority
pollutant standards were added to the pure water using a
Hamilton microsyringe.  The resulting solutions contained
priority pollutants at concentration levels of 1, 3, 10, 30,
100, 300, and 1000 μg/l.  The samples were analyzed in order
of concentration, starting at 1 μg/l.

       For those soap solutions analyzed using the silicone anti-
foaming agent, 1 μl of the agent (Dow; Antifoam A) was added to
the purge tube immediately following the 5-ml sample.  The sur-
factant was used as supplied; no attempts at purification were
made.

Procedure

       Five-ml sample aliquots were purged with pure nitrogen
at atm cm3 per minute.  After a 12-minute purge interval, a
180°C desorb cycle was activated for three minutes with a
helium carrier gas flow rate of 20 ml per minute.
                             132

-------
      With the heat dispersion method, the purge tube was
placed in a small water bath at ambient temperature (Figure 1).
When purging was initiated, foam would rise in the tube and, as
it neared the top of the tube, a hair dryer-style heat gun was
used to direct a stream of hot air at the top of the sample tube.
The heat causes the foam to break down before it can enter the
transfer line leading to the trap.  A 6-inch section of a 2-mm
ID glass tube was placed just above the purge tube to act as a
condenser for water vapor generated during the heating process.
When droplets could be detected visually in this condenser, it
was replaced.  The condenser helped to reduce the quantity of
water vapor entering the sorbent trap.

      An automatic GC/MS data reduction procedure was employed
throughout the study to identify priority pollutants and to
determine relative response values used for quantitation.
Relative response values were calculated by dividing mass
chromatogram areas for the quantitation masses of the priority
pollutants by those of the internal standards (Table I).  Cali-
bration curves were prepared by carrying out logarithmically
weighted linear regressions of relative response versus concen-
tration.
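
      The calibration arithmetic just described can be sketched as
follows.  This is an editorial reconstruction rather than the
authors' code: the areas and concentrations are invented, and a
least-squares fit in log-log space is assumed as the "logarithmically
weighted" regression.

    import numpy as np

    conc = np.array([1, 3, 10, 30, 100, 300, 1000], dtype=float)  # ug/l
    analyte_area = np.array([210., 640., 2050., 6300., 20800., 61500., 208000.])
    istd_area = np.full(7, 10000.0)            # internal standard areas

    rel_resp = analyte_area / istd_area        # relative response values

    # Fit log(relative response) against log(concentration) so that
    # each concentration decade carries equal weight in the fit.
    m, b = np.polyfit(np.log10(conc), np.log10(rel_resp), deg=1)
    r = np.corrcoef(np.log10(conc), np.log10(rel_resp))[0, 1]

    # Quantitate an unknown from its measured relative response.
    unknown_rr = 0.5
    conc_est = 10 ** ((np.log10(unknown_rr) - b) / m)
    print(m, b, r, conc_est)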







                   RESULTS AND DISCUSSION

      The significance of keeping foam from entering the sorbent
trap is illustrated in Figure 2 for two 100 μg/l samples; one
a pure standard solution and the second a soap solution.
                              133

-------
A small amount of foam from the soap solution sample was allowed
to enter the trap.  Several of the early eluting components
detected in the pure standard solution are not readily detected
when foam has entered the trap.  It is believed that this is
due to a modification of sorbent surfaces which makes it less
effective as a sorbent.  However, it could also be due to an
alteration in thermal desorption characteristics.  Second, note
that several spurious peaks are present in Figure 2b which are
not present in Figure 2a.  These are believed to be decomposition
products of thermally labile nonvolatile materials present in the
foam.  However, it is possible that they could be due to volatile,
water soluble compounds which are not effectively entrained by
the purge gas under normal operating conditions.  Irrespective
of the reasons, when foam enters the trap, both qualitative and
quantitative analyses for priority pollutants are severely inter-
fered with.

      A qualitative comparison of chromatograms obtained using
the two foam reduction methods is given in Figure 3 for 10 μg/l
samples.  Note that the silicone antifoaming agent, although it
eliminates the foaming problem, results in the addition of spur-
ious peaks to the chromatogram.  The severity of these inter-
ferences increases with each additional analysis employing the
silicone antifoaming agent and it ultimately became necessary
to dismantle the PAT system, replace the sorbent trap, and bake
clean the transfer lines and flow directing valve.  This cleaning
process was necessary after approximately every tenth run using
the antifoaming agent if all priority pollutants were to remain
                              134

-------
identifiable at concentrations down to 10 ng/l.  It is possible
that by using less of the antifoaming agent, contamination would
be reduced.  However, all attempts to find a suitable solvent in
which to dilute it were unsuccessful; 1 μl was the minimum
volume which could be handled.

      Contamination did not result when using the heat disper-
sion method; the chromatograms remained qualitatively identical
to those of nonfoaming samples for indefinite periods.  An excess
of 60 foaming samples were analyzed using the heat foam dispersion
method with no noticeable change in detection limits and zero
incidence of interferences.  It is possible, however, that the
use of elevated temperatures could lead to sample degradation
in some instances.  Although this did not appear to be the case
with priority pollutants in soap solutions, it should be con-
sidered a possibility with other compounds and with other matrices.

      Both approaches to foam reduction had some effect on rela-
tive peak areas and, as a consequence, the accuracy of the deter-
minations.  This can be seen in the chromatograms shown in
Figure 3.  In order to evaluate the precision and accuracy pro-
vided by the two foam reduction methods, data obtained for the
three sets of standard solutions were compared.  Table II shows
that the correlation coefficients for the priority pollutant
calibration curves obtained from pure standard solutions and
soap solutions, using either of the foam reduction methods, are
essentially equivalent.  The correlation coefficient for chloro-
form was the only one found to be uniformly less than 0.99.
                             135

-------
The problem with the chloroform data is the result of
nonlinear response; the relative response drops off more
rapidly at low concentration than at high concentration.
In other purge and trap experiments performed in this lab-
oratory, nonlinearity for the relative response of chloro-
form has not been obvious and no reason for the phenomenon
is known.  If chloroform values below 10 μg/l are rejected,
the correlation coefficients for chloroform in all three
sets of data are greater than 0.995.  However, because the
statistical evaluations performed in this study were for the
sole purpose of evaluating the data rather than rejecting
potential outliers, the lower values for correlation coeffi-
cients were included in Table II.

       In screening industrial waste waters and process
streams for priority pollutants using the EPA protocol,
the intent has been to provide data with accuracy sufficient
to place each data point, X, within a window defined by
-50% and +100%.  That is:

                      (1/2)x ≤ X ≤ 2x

or

                      X = 2^(±1) x

That is, the data should be "within a factor of two".  Be-
cause of the geometrical nature of this goal, it is conven-
ient to consider errors in terms of the factor by which a
data point deviates from the calibration curve or regression
                            136

-------
line.  For the purpose of this presentation, error is
defined as:

     ERROR FACTOR = (known concentration / calculated concentration)^(±1) ≥ 1

Thus, a +100% deviation from the calibration curve is num-
erically equivalent to a -50% deviation.
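
       As a minimal editorial illustration of this definition (not
from the original paper), the error factor is the ratio of known to
calculated concentration raised to whichever power, +1 or -1, makes
it at least 1:

    def error_factor(known: float, calculated: float) -> float:
        # (known/calculated) to the +1 or -1 power, whichever is >= 1
        ratio = known / calculated
        return max(ratio, 1.0 / ratio)

    print(error_factor(100.0, 200.0))  # 2.0 for a +100% deviation
    print(error_factor(100.0, 50.0))   # 2.0 for a -50% deviation
    print(error_factor(100.0, 80.0))   # 1.25, well within a factor of two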

       When the error factors for each of the priority
pollutants in each of the three sets of data are determined
using the regression line through that set of data as a
reference or calibration curve, the values shown in Table III
are produced.  The similarity in the mean values is consis-
tent with the correlation coefficients already mentioned.
Of the priority pollutants investigated, only chloroform
failed to meet the "factor of two" criterion; all others
fell well within the window.  It should be pointed out that
this approach to evaluating accuracy involves an implicit
correction for matrix effects which might alter purging and/or
trapping efficiencies.  Consequently, the error factors in
Table III should be considered as best case values.

       If the pure standard data are used as the calibration
or working curves, and the soap solution data are treated as if
they were for unknowns, the error factors shown in Table IV
are obtained.  Even though these are within the factor-of-two
window on the average, there are several compounds which fall
                           137

-------
outside that window.  This is believed to be the result of
matrix effects which alter the relative purging efficiencies
of priority pollutants and internal standards.  These rela-
tive differences could be due to solubility changes or per-
haps alterations in purge gas bubble size which might affect
entrainment.  Figure 4 illustrates examples of these matrix
effects for toluene and 1,1,1-trichloroethane.  Note that
the relative response values for toluene obtained using the
silicone antifoaming agent correspond fairly well with those
of the pure standard, while those acquired using the heat
dispersion method do not.  It seems that soap solution lowers
the purging efficiency of toluene relative to the internal
standard, whereas soap solution plus silicone antifoaming
agent results in approximately equivalent purging efficiencies
for the two.  In fact, all the priority pollutants referenced
to the second internal standard and determined using heat
dispersion gave low relative response values compared to
those of pure standards.  If only those priority pollutants
determined using the first internal standard were used to
calculate the mean error factor for the heat dispersion method,
a value of 1.61 is produced.  This is comparable, on the aver-
age, with that of the silicone antifoaming agent approach.
Essentially, the opposite situation to toluene is encountered
with 1,1,1-trichloroethane, which exhibits an increased rela-
tive response with soap solution treated with silicone
                             138

-------
antifoamant and essentially no change with heat dispersion.
Although matrix effects are believed to be the major reason
for apparent differences in relative purging efficiencies, it
is possible that thermal effects could have some influence on
the data acquired using the heat dispersion method.  Thermally
induced enhancement of purging efficiency has been reported for
several ketones from an aqueous salt solution at 50°C compared
to 23°C [6].  In the studies reported here, however, the majority
of the sample volume was at ambient temperature, surrounded by
the water bath.

       It is interesting to note that there is no clear
relationship between the relative magnitude of matrix effect,
i.e., error factor in Table IV, and structural similarities
of the priority pollutants.  This is clearly illustrated for
the di- and tri-chloroethanes.  1,1- and 1,2-dichloroethane have
fairly small error factors with the silicone antifoaming
agent and soap, while 1,1,1- and 1,1,2-trichloroethane have
larger error factors.  However, with heat dispersion, it is
the 1,2-di- and 1,1,1-trichloroethanes which have small
error factors.  Apparently, compounds which behave similarly
in one instance may not in the next.  This tends to suggest
that a universally good internal standard for any given com-
pound may not exist.  Because of this, it seems likely that
standard additions, rather than the working curve approach
to quantitation, would result in more accurate results.
Standard additions accuracy should approach that represented
by the error factors given in Table III, or about 15 to 25%.
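
       The standard additions quantitation suggested here can be
sketched as follows.  This is an editorial illustration only, not
part of the original study, and the responses are invented: the
sample is spiked with known increments, response is regressed on
spike level, and the fit is extrapolated back to zero response to
recover the native concentration.

    import numpy as np

    spike = np.array([0.0, 50.0, 100.0, 200.0])    # added ug/l (hypothetical)
    response = np.array([0.42, 0.88, 1.33, 2.25])  # measured relative response

    m, b = np.polyfit(spike, response, deg=1)      # response = m*spike + b
    native_conc = b / m                            # magnitude of x-intercept
    print(round(native_conc))                      # about 46 ug/l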





                             140

-------
       In comparing the two foam reduction procedures, the
heat dispersion method is superior qualitatively because it
does not introduce interfering substances into the analyses.
It is also better in a practical sense because it does not
necessitate dismantling and cleaning of the purge and trap
apparatus after several successive runs.  The silicone anti-
foaming agent provided more accurate quantitation when the
soap solution was referenced to pure standards; however,
both approaches to foam reduction exhibited matrix effects.
Because of these matrix effects, standard additions or some
other matrix compensation approach should be employed for
accurate quantitation.  Alternative approaches are currently
being investigated and will be reported on in the future.
                             141

-------
                       LITERATURE CITED

1.     U.S. Environmental Protection Agency, "Sampling and
       Analysis Procedures for Screening of Industrial
       Effluents for Priority Pollutants," Environmental
       Monitoring and Support Laboratory, Cincinnati, Ohio,
       April, 1977.

2.     Bellar, T. A. and J. J. Lichtenberg, J. Am. Water
       Works Assoc., 66, 739 (1974).

3.     Bellar, T. A., J. J. Lichtenberg and R. C. Kroner,
       J. Am. Water Works Assoc., 66, 703 (1974).

4.     Brass, H. J., M. A. Feige, T. Holloran, J. W. Mello,
       D. Munch, and R. F. Thomas, "The National Organic
       Monitoring Survey:  Samplings and Analyses for Purge-
       able Organic Compounds," in "Drinking Water Quality
       Enhancement Through Source Protection," Robert Pojasek,
       Ed., Ann Arbor Science Publishers, Ann Arbor, Michigan,
       1977.

5.     Kopfler, F. C., R. G. Melton, R. D. Lingg and W. E.
       Coleman, "GC/MS Determination of Volatiles for the
       National Organics Reconnaissance Survey (NORS) of
       Drinking Water," in "Identification and Analysis of
       Organic Pollutants in Water," L. H. Keith, Ed., Ann
       Arbor Science Publishers, Ann Arbor, Michigan, 1977.
                             142

-------
6.     Mieure, J. P., G. W. Mappes, E. S. Tucker, and M. W.
       Dietrich, "Separation of Trace Organic Compounds from
       Water," in "Identification and Analysis of Organic
       Pollutants in Water," L. H. Keith, Ed., Ann Arbor
       Science Publishers, Ann Arbor, Michigan, 1977.
                             143

-------
                            Table I

                      QUANTITATION MASSES

                                  Quantitation    Internal
Compound Name                         Mass        Standard

benzene                                78           BCM1
carbon tetrachloride                  117           BCM
chlorobenzene                         112           DCB2
1,2-dichloroethane                     98           BCM
1,1,1-trichloroethane                  97           BCM
1,1-dichloroethane                     63           BCM
1,1,2-trichloroethane                  97           DCB
chloroform                             83           BCM
1,1-dichloroethylene                   96           BCM
1,2-trans-dichloroethylene             96           BCM
1,2-dichloropropane                   112           BCM
ethylbenzene                          106           DCB
methylene chloride                     84           BCM
bromoform                             173           DCB
dichlorobromomethane                  127           BCM
trichlorofluoromethane                101           BCM
chlorodibromomethane                  127           DCB
tetrachloroethylene                   164           DCB
toluene                                92           DCB
trichloroethylene                     130           BCM

1Bromochloromethane (BCM), quantitation mass = 128
21,4-Dichlorobutane (DCB), quantitation mass = 55

-------
                           Table II

        CORRELATION COEFFICIENTS OF CALIBRATION CURVES

                                   Correlation Coefficients

                                            Silicone      Heat
                                  Pure     Antifoaming  Dispersed
Compound Name                   Standard      Agent        Foam

benzene                           .994        .994         .992
carbon tetrachloride              .991        .986         .995
chlorobenzene                     .994        .999         .995
1,2-dichloroethane                .993        .994         .991
1,1,1-trichloroethane             .996        .996         .998
1,1-dichloroethane                .993        .994         .997
1,1,2-trichloroethane             .994        .995         .998
chloroform                        .919        .930         .936
1,1-dichloroethylene              .981        .956         .994
1,2-trans-dichloroethylene        .992        .998         .997
1,2-dichloropropane               .992        .987         .994
ethylbenzene                      .995        .990         .995
methylene chloride                .999        .994         .997
bromoform                         .991        .998         .990
dichlorobromomethane              .990        .989         .983
trichlorofluoromethane            .992        .995         .997
chlorodibromomethane              .994        .998         .999
tetrachloroethylene               .995        .998         .995
toluene                           .999        .999         .999
trichloroethylene                 .993        .993         .995

Mean                              .989        .989         .992

-------
                           Table III

            ERROR FACTORS FOR DATA COMPARED TO THE
           REGRESSION LINE THROUGH THAT SET OF DATA

                                        Error Factor

                                            Silicone      Heat
                                  Pure     Antifoaming  Dispersed
Compound Name                   Standard      Agent        Foam

benzene                           1.24        1.23         1.24
carbon tetrachloride              1.24        1.36         1.16
chlorobenzene                     1.25        1.05         1.23
1,2-dichloroethane                1.21        1.19         1.22
1,1,1-trichloroethane             1.17        1.19         1.13
1,1-dichloroethane                1.23        1.22         1.13
1,1,2-trichloroethane             1.24        1.21         1.13
chloroform                        2.16        1.84         1.79
1,1-dichloroethylene              1.45        1.77         1.25
1,2-trans-dichloroethylene        1.27        1.12         1.14
1,2-dichloropropane               1.27        1.29         1.20
ethylbenzene                      1.23        1.25         1.24
methylene chloride                1.06        1.13         1.17
bromoform                         1.24        1.11         1.34
dichlorobromomethane              1.24        1.25         1.21
trichlorofluoromethane            1.25        1.18         1.16
chlorodibromomethane              1.17        1.10         1.09
tetrachloroethylene               1.22        1.08         1.20
toluene                           1.09        1.08         1.10
trichloroethylene                 1.26        1.22         1.17

Mean                              1.27        1.24         1.22

-------
                           Table IV

             ERROR FACTORS USING PURE STANDARD
           SOLUTION DATA AS A CALIBRATION CURVE

                                       Error Factors

                                   Silicone
                                  Antifoaming      Heat
Compound Name                        Agent      Dispersion

benzene                              1.22          1.51
carbon tetrachloride                 1.49          1.20
chlorobenzene                        1.41          2.59
1,2-dichloroethane                   1.29          1.32
1,1,1-trichloroethane                1.99          1.15
1,1-dichloroethane                   1.29          1.85
1,1,2-trichloroethane                1.71          1.89
chloroform                           1.96          2.40
1,1-dichloroethylene                 2.00          1.75
1,2-trans-dichloroethylene           1.36          1.44
1,2-dichloropropane                  2.21          1.59
ethylbenzene                         1.34          2.34
methylene chloride                   1.15          3.16
bromoform                            2.08          2.25
dichlorobromomethane                 1.45          1.39
trichlorofluoromethane               1.42          1.17
chlorodibromomethane                 1.84          2.69
tetrachloroethylene                  1.34          4.04
toluene                              1.38          2.71
trichloroethylene                    1.22          1.47

Mean                                 1.56          2.00

-------
       Figure 1.  Apparatus configuration for heat dispersion of
                  foam.  (Diagram: pure N2 purge gas through the
                  water sample, heat gun at the top of the purge
                  tube, 2-mm ID glass tube condenser, Tenax trap,
                  valve, He carrier, vent, and line to the
                  GC/MS/DS oven.)

-------
       Figure 2.  Reconstructed gas chromatograms for (a) 100 μg/l
                  pure standard mixture, and (b) 100 μg/l standard
                  mixture containing soap when foam was allowed to
                  enter the sorbent trap.
-------
       Figure 3.  Reconstructed gas chromatograms for 10 μg/l pure
                  standard mixture (a) without soap, (b) using a
                  silicone antifoaming agent, and (c) heat disper-
                  sion to reduce foaming.
-------
       Figure 4.  Comparison of acquired relative response values
                  for (a) toluene and (b) 1,1,1-trichloroethane
                  plotted for samples analyzed (x) without soap
                  and with soap using (•) a silicone antifoaming
                  agent and (•) heat dispersion to reduce foaming.
                  (Axes: relative response versus concentration.)

-------
                        LIST OF FIGURES
Figure 1    Apparatus configuration for heat dispersion
            of foam.

Figure 2    Reconstructed gas chromatograms for (a) 100 μg/l
            pure standard mixture, and (b) 100 μg/l standard
            mixture containing soap when foam was allowed to
            enter the sorbent trap.

Figure 3    Reconstructed gas chromatograms for 10 μg/l pure
            standard mixture (a) without soap, (b) using
            a silicone antifoaming agent, and (c) heat dis-
            persion to reduce foaming.

Figure 4    Comparison of acquired relative response values
            for (a) toluene and (b) 1,1,1-trichloroethane
            plotted for samples analyzed (x) without soap
            and with soap using (•) a silicone antifoaming
            agent and (•) heat dispersion to reduce foaming.

-------
                           APPENDIX II

         THE USE OF ISOTOPIC CARRIERS AS AIDS IN THE ANALYSIS
                  OF LOW LEVELS OF PRIORITY POLLUTANTS
              R. Pavlick, C. Jackson, and D. R. Rushneck
                           PJB Laboratories
                        373 S. Fair Oaks Avenue
                      Pasadena, California 91105
                             ABSTRACT

     Gas chromatographic analysis of the highly polar, semi-volatile
priority pollutants is difficult because of absorption of these com-
pounds on the chromatographic column.  This paper suggests the use
of either an isotope of the pollutant compound, or an isomer or
functionally similar compound to overcome this absorption, thus
permitting gas chromatographic analysis of the compound at detec-
tion levels not achievable to date.  The priority pollutant benzi-
dine is used as an example.

                              INTRODUCTION

     The inability of many columns to transmit highly polar com-
pounds has been recognized as one of the most severe limitations
of the gas chromatographic technique.  This shortcoming can be at-
tributed to absorption of the sample on the tubing surfaces, on the
glass wool used to plug the column ends, or on the support.  In
packed column gas chromatography employing diatomaceous earth sup-
ports  (such as those used for the acid and base-neutral fractions
of the priority pollutants), the support usually dominates the ab-
sorptive process.  A major effort has been directed at "deactivat-
ing" these supports by silanization  (with DMCS, HMDS, and other
agents), and by treatment with materials which align the pH of the
support with that of the compounds being analyzed (H3PO4 for acids;
NaOH for bases).  Ideally, these treatments should permit analysis
of the priority pollutants down to the detection limit of the mass
spectrometer or other detector being used.  But we have all learned
that the detection limits for the nitrosamines, the benzidines, and
the dinitrophenols are much higher than for the other priority pollu-
tants.  Further, the analytical reproducibility at moderate levels
of these compounds is not as good as desired.  This non-reproduci-
bility is attributable to varying degrees of deactivation of the
support, an effect which is enhanced by the analysis of a wide
variety of samples.

     The major objective of this study was to attempt to find a
means by which ultra-trace levels (< 1 μg/l) of the highly polar
priority pollutants could be detected.  As will be seen, this ob-
jective was achieved through the use of an isotope as a carrier to
overcome the absorption of the pollutant on the column.  The tech-
nique was first reported by Samuelson et al. [1], and a variation of
the technique employing a functionally similar compound to overcome
absorption was reported by Kelley et al. [2].  In this study, the test
                                 144

-------
compound was benzidine, the isotopic carrier was D8-benzidine, and the
functionally similar compound was p-phenylenediamine (1,4-PDA).


                             TESTS AND RESULTS

     Gas chromatographic columns 6 ft. x 0.080 in. i.d. were baked
for two hours at 450°C then silanized with 10 percent DMCS in
toluene.  They were blown dry with nitrogen and packed with 3 per-
cent SP2250ODB on 100/120 Supelcoport.  After overnight condition--
ing at 250°C and 20 ml per minute nitrogen, a given column was in--
stalled in the GC/MS, purged for approximately 30 minutes at room
temperature, then temperature programmed to 280°C at 8°C per minute
and held at 280°C for 15 minutes.  In order to prevent the operating
history of the column from influencing test results, a fresh column
was used for each test.

     Benzidine without Carriers.  This test consisted of injecting
5 μl aliquots of solutions containing increasing quantities of ben-
zidine (from 0.1 ng/μl to 1 μg/μl) in methylene chloride containing
a constant amount of an internal standard (D10-anthracene at 100
ng/μl).  The internal standard was used to compensate for small differ-
ences in amounts injected.  A computer program [3] was used to auto-
matically detect and measure the area of the benzidine peak at the
m/e 184 ion.

     The results of this test are shown in Figures 1 and 2.  Figure
1 shows that at low sample quantities (0.1-100 ng), benzidine is
completely absorbed by the column.  At approximately the 170 ng
level, benzidine appears.  In the region between 200 and 5,000 ng,
the area of the benzidine peak increases, but not in proportion
to the quantity of benzidine injected.

     Figure 2 shows the benzidine peak at m/e 184 and the peak
"tailing factor" for the peak resulting from a 167 ng injection.

     Benzidine after "Treatment".  The test described above was re-
peated immediately after the analysis of a sample containing 5 μg
of benzidine.  The "treatment" of columns by injection of  large
quantities of the compounds of interest is a technique widely used
in gas chromatography.  The results of this test  are shown in Fig-
ures 1 and 3, which show the lowering of the detection limit to ap-
proximately 15 ng (hexagons), and peak shape and  the tailing fac-
tor at 167 ng, respectively.

     Benzidine plus 1,4-PDA.  Portions of the solutions used for
the tests above were spiked with a constant amount of 1,4-PDA such
that each injection contained approximately 2.5 μg.  The results
are shown in Figures 1 and 4, which show the response curve
(squares) and tailing  factor for benzidine, respectively.

     Benzidine plus D8-Benzidine.  Solutions containing a  constant
amount of D8-benzidine (2.5 μg per injection) were analyzed in a
manner identical to those above.  Results are shown in Figures 1,
5, and 6 which show the response curve, and tailing factors for
167 ng and 5 ng of benzidine, respectively.

                                 145

-------
                      CONCLUSIONS AND DISCUSSION

     As Figures 1 through 5 show, the addition of a carrier or treat-
ment markedly improves the transmission properties of an SP-2250 DB
on Supelcoport column for benzidine.  Clearly, an isotopic carrier
(in this case D8-benzidine) so successfully overcomes column absorp-
tion that the benzidine behaves much like a hydrocarbon.

     Less obvious are the subtleties inherent in the curves marked
"Benzidine" and "Benzidine after Treatment" in Figure 1.  At first,
it might be thought that the use of a calibration curve such as
that shown for "Benzidine" in Figure 1 would permit rigorous quan-
titation in the 100-5,000 ng range.  But consider what happens when
the datum point on this curve at 5,000 ng is obtained.  The column
has now been "treated" with 5 ug of benzidine, and therefore, the
curve for "Benzidine after Treatment" should be used.  But the ab-
sorption of benzidine by the column is reversible, and after analy-
zing a few samples which contain no benzidine, the curve will return
to that labelled "Benzidine".  Thus a region of uncertainty exists
between these two curves, the characteristics of which are deter-
mined by column history.  An area of 100 units will yield a quan-
tity of benzidine somewhere between 25 and 100 ng; an area of
10,000 units will yield a quantity of benzidine between 250 and
650 ng, errors normally considered large for this type of analysis.
Using logic similar to the above, it can be shown that replicate
calibrations mislead the analyst further, in that the curve for
"Benzidine after Treatment" is always obtained so long as the cali-
bration is performed reproducibly.  But when a series of samples
is analyzed, the curve shifts back to "Benzidine".  Thus the analyst
can demonstrate that the system is precise for standards, but the
fact remains that it is still imprecise for samples.
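
     The bounds quoted above amount to inverting one peak area through
both calibration curves.  A small Python illustration, using straight-
line segments through hypothetical endpoints chosen only to reproduce
the quoted figures (the actual curves of Figure 1 are not linear):

     def interp(x, x0, y0, x1, y1):
         """Linear interpolation between two calibration points."""
         return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

     def quantity_bounds(area):
         """Bracket the benzidine quantity (ng) between the two curves."""
         low = interp(area, 100.0, 25.0, 10000.0, 250.0)    # "after Treatment"
         high = interp(area, 100.0, 100.0, 10000.0, 650.0)  # "Benzidine"
         return low, high

     print(quantity_bounds(100.0))    # -> (25.0, 100.0) ng
     print(quantity_bounds(10000.0))  # -> (250.0, 650.0) ng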

     The use of a carrier compound such as 1,4-PDA makes the cali-
bration curve reproducible when it is added to every sample, but
has the drawback that it contaminates the samples.

     Finally, using D8-benzidine is not without its problems.  Of
great concern must be the hazards associated with the use of such
a compound if the addition of 500 ug per ml of extract is required
for reproducible quantitation.  But the conclusion remains:  If
precise quantitation of benzidine at the one ng per liter level or
below is required, the addition of D8-benzidine to the sample will
assure both the low detection limit and the reproducibility.

References

1.  Samuelson, Hamberg, and Sweeley, Anal. Biochem., 38 (1970), p. 301.

2.  Kelley, Nau, Forster, and Biemann, Biomed. Mass Spectrometry, 2
    (1975), p. 315.

3.  Rushneck, "Search and Quantitation of the EPA Priority Pollutants
    by GC/MS using a Computer Program for Automated Data Reduction".
    Presented at the EPA Analytical Conference, Savannah, GA, June
    1978.


                                 146

-------
                  Appendix III
           POTW Quality Control Study
              Kathleen E. Thrun
              Philip L. Levins

              Arthur D. Little, Inc.
Organic priority pollutant quality control data collected for a
U.S. EPA program analyzing for priority pollutants in sewage
samples.
                        147

-------
                     METHOD REFERENCE STANDARDS (D)

D = Ultrapure water with priority pollutants added at a known
    concentration.  [SPIKE] = true concentration value of the spike.

Percent Recovery

     Pi = 100 x [D] / [SPIKE]

Standard Deviation of Percent Recovery

     Sp = sqrt( sum from i=1 to n of (Pi - P)^2 / (n - 1) )

          where n = number of samples analyzed and P = mean recovery


                        RAW WASTEWATER (A, B, C)

C    = Field sample
A, B = Field sample with priority pollutants added at a known
       concentration.

Percent Recovery

     Pi(A) = 100 x ([A] - [C]) / [SPIKE]

     Pi(B) = 100 x ([B] - [C]) / [SPIKE]

Standard Deviation of Percent Recovery

     Sp = sqrt( sum from i=1 to n of (Pi - P)^2 / (n - 1) )

Mean of Replicate Spikes

     Xi = ( ([A] - [C]) + ([B] - [C]) ) / 2

Range of Replicate Spikes

     Ri = | ([A] - [C]) - ([B] - [C]) |

Critical Range for a Specific Concentration

     3.27 x [SPIKE] x ( sum from i=1 to n of Ri/Xi ) / n
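
     These definitions translate directly into code.  A minimal sketch
in Python, using made-up duplicate-spike values rather than data from
the study:

     import math

     def percent_recoveries(spiked_results, background, spike):
         """Pi = 100 x ([A or B] - [C]) / [SPIKE] for each aliquot."""
         return [100.0 * (x - background) / spike for x in spiked_results]

     def recovery_std_dev(recoveries):
         """Sp = sqrt( sum (Pi - P)^2 / (n - 1) )."""
         n = len(recoveries)
         mean = sum(recoveries) / n
         return math.sqrt(sum((p - mean) ** 2 for p in recoveries) / (n - 1))

     def critical_range(duplicate_pairs, spike):
         """3.27 x [SPIKE] x ( sum of Ri/Xi ) / n for duplicate spikes.

         duplicate_pairs holds (A - C, B - C) recovered amounts.
         """
         ratios = [abs(a - b) / ((a + b) / 2.0) for a, b in duplicate_pairs]
         return 3.27 * spike * sum(ratios) / len(duplicate_pairs)

     # Hypothetical results at a 60 ug/l spike level:
     p = percent_recoveries([87.0, 81.0], background=27.0, spike=60.0)
     print(p, recovery_std_dev(p))  # [100.0, 90.0] 7.07...
     print(critical_range([(55.0, 49.0), (62.0, 58.0)], 60.0))
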
                                    147a

-------





















                      QUALITY CONTROL - VOLATILES

     [Table of recovery and precision data; illegible in this
     reproduction.]

                                 147b

-------
     [Quality control recovery table, continued; illegible in this
     reproduction.]

                                 147c

-------
     [Quality control recovery table, continued; illegible in this
     reproduction.]

                                 147d
                                                                                    
-------



















                    QUALITY CONTROL - BASE/NEUTRALS

     [Table of recovery and precision data for base/neutral compounds
     (anthracene, dichlorobenzenes, hexachloroethane, and others);
     illegible in this reproduction.]

                                 147e

-------
     [Quality control recovery table, continued; illegible in this
     reproduction.]

                                 147f

-------
     [Quality control recovery table, continued; illegible in this
     reproduction.]

                                 147g

-------

                     QUALITY CONTROL - PESTICIDES

     [Table of recovery and precision data for pesticides (heptachlor,
     endosulfan, and others); illegible in this reproduction.]

                                 147h

-------
                    Total Phenols and Total Cyanides
                            Quality Control

                            METHOD REFERENCE          RAW WASTEWATER
                  C    M    n     P    SE   %Sp     n     P    SE   %Sp

Total Phenols     60   27   15   107   13    12     30    93   15    17
Total Cyanides    20    4   15   103    -     -     30   101   12    12

                                   147i

-------
Study:              SOURCES OF PRIORITY POLLUTANTS IN
                     PUBLICLY OWNED TREATMENT WORKS
                                (POTW)

For:                ENVIRONMENTAL PROTECTION AGENCY
                    OFFICE OF WATER PLANNING AND STANDARDS (OWPS)
                    MONITORING AND DATA SUPPORT DIVISION (MDSD)
                    WASHINGTON, D.C.

By:                 ARTHUR D. LITTLE, INC.
                    CAMBRIDGE, MASSACHUSETTS

                                 147j

-------
      SAMPLING AND ANALYSIS PROCEDURES FOR SCREENING OF
        INDUSTRIAL EFFLUENTS FOR PRIORITY POLLUTANTS

           U.S. ENVIRONMENTAL PROTECTION AGENCY
     ENVIRONMENTAL MONITORING AND SUPPORT LABORATORY
                CINCINNATI, OHIO  45268

                      MARCH, 1977
                 REVISED APRIL, 1977

                        147k

-------
        PROCEDURES FOR PRELIMINARY EVALUATION OF
          ANALYTICAL METHODS TO BE USED IN THE
               VERIFICATION PHASE OF THE
       EFFLUENT GUIDELINES DIVISION BAT REVIEW

          U.S. ENVIRONMENTAL PROTECTION AGENCY
    ENVIRONMENTAL MONITORING AND SUPPORT LABORATORY
                CINCINNATI, OHIO  45268

                     MARCH, 1978

                        147l

-------
                      SAMPLE DESCRIPTION

SAMPLE + SPIKE                  A

SAMPLE + SPIKE                  B

SAMPLE                          C

METHOD REFERENCE STANDARD       D

METHOD BLANK                    F

                           FIELD SAMPLE
                               147m

-------

     [Flow diagram of the quality control sample scheme; illegible in
     this reproduction.]

                               147n

-------
             METHOD REFERENCE STANDARDS (D)

PERCENT RECOVERY

     Pi  =  100 x [D] / [SPIKE]

STANDARD DEVIATION OF PERCENT RECOVERY

     Sp  =  sqrt( sum from i=1 to N of (Pi - P)^2 / (N - 1) )

                FIELD SAMPLES (A, B, C)

PERCENT RECOVERY

     Pi(A(B))  =  100 x ([A(B)] - [C]) / [SPIKE]

                        147o

-------
                     RECOVERY AND PRECISION RESULTS

                              METHOD REFERENCE        RAW WASTEWATER
                    ug/L         P        Sp             P        Sp

PURGEABLES (VOA)     20          87     ± 23             80     ± 31
ACIDS                50          89     ± 26             95     ± 22
BASE/NEUTRALS        50         100     ± 33             84     ± 25
PESTICIDES           30          73     ± 11             69     ± 14
METALS             10, 40   115, 112  ± 71, ± 7          NA     ± 17%
TOTAL CYANIDES       20         103     ±  3            101     ± 12
TOTAL PHENOLS        60         107     ± 13             93     ± 15

                                  147p

-------
     [Slide; illegible in this reproduction.]

                                 147q

-------
     [Slide; illegible in this reproduction.]

                                 147r
                                                                                       
-------
     [Slide; illegible in this reproduction.]

                                 147s

-------
     [Slide; illegible in this reproduction.]

                                 147t

-------
     [Table; illegible in this reproduction.]

                                 147u

-------
     [Recovery table for metals; largely illegible in this
     reproduction.]

                                 147v

-------
          STANDARDS ADDED INTO QC SAMPLES
         BUT NOT DETECTED BY THE EPA METHOD

VOLATILES

     CHLOROMETHANE
     BROMOMETHANE
     VINYL CHLORIDE
     DICHLORODIFLUOROMETHANE

BASE/NEUTRALS

     BIS-CHLOROMETHYL ETHER
     HEXACHLOROCYCLOPENTADIENE

PESTICIDES AND PCBs

     CHLORDANE, TOXAPHENE
     AROCHLOR - 1016, 1242, 1248, 1254, 1260

                      147w

-------
                             APPENDIX IV

                          LIST OF ATTENDEES
Bruce N. Bastian
Shell Oil Company
P. 0. Box 4320
One Shell Plaza, Room 1394
Houston, TX  77210

Bob Blaser
Hamilton Standard
Airport Road
Windsor Locks, CT  06096
Mail Stop 1A-2-4

Joseph N. Blazevich
EPA Region X Lab
1555 Alaskan Way South
Seattle, WA  98134

Gary D. Burns
Cyrus WM. Rice Division
NUS Corporation
15 Noble Avenue
Pittsburgh, PA  15205

Francis T. Brezenski
USEPA, Region II
GSA Raritan Depot
Woodbridge Avenue
Edison, NJ  08817

Mike H. Carter
USEPA - EGD
WH-552
401 M St. SW
Washington, D.C.  20460

Edward S.K. Chian
Georgia Inst. of Tech.
Daniel Laboratory
Atlanta, GA  30332

Bruce Colby
Systems, Science, & Software
Box 1620
La Jolla, CA  92121

William F. Cowen
Catalytic, Inc.
1500 Market St.
Centre Square West
Philadelphia, PA  19102
 Paul H. Cramer
 Midwest Research Institute
 425 Volker Blvd.
 Kansas City,  MO  64110

 Robert W. Dellinger
 USEPA - EGD
 WH - 552
 401 M St. SW
 Washington, D.C.  20460

 Robert P. Fisher
 NCASI, Inc.
 P.O. Box 14483
 Gainesville, FL  32604

 J. Richard Florence
 Hydroscience, Inc.
 Executive Park
 Knoxville, Tenn.  37919

 Donald A. Flory
 Spectrix Corporation
 7408 Fannin
 Houston, TX  77054

 Woody Forsht
 USEPA - EGD
 WH - 552
 401 M St. SW
 Washington, D.C.  20460

Gail Goldberg
 USEPA - Enforcement
 EN - 336
 401 M St. SW
 Washington, D.C.  20460

 C. Ellen Gonter
 Cyrus WM. Rice Division
 NUS Corporation
 15 Noble Avenue
 Pittsburgh, PA  15205

 Clarence L. Haile
 Midwest Research Institute
 425 Volker Blvd.
 Kansas City,  MO  64110
                               148

-------
Jack R. Hall
Hydroscience, Inc.
9041 Executive Park Drive
Knoxville, TN  37919
R.C. Hall
Radian Corporation
P.O. Box 9948
Austin, TX  78766

Philip A. Hamlin
ITT Rayonier
Olympic Research Division
409 E. Harvard
Shelton, WA  98584

Frank Hochfesang
Mobil Res. & Dev. Corp.
Billingsport Road
Paulsboro, NJ  08066

David J. Holmberg
Interlake, Inc.
150 West 137th St.
Chicago, IL  60627
Pris Holtzclaw
USEPA - EGD
WH - 552
401 M St. SW
Washington, D.C.  20460
Dean Jarmin
USEPA - EMSL
26 West St. Clair Street
Cincinnati, OH  45268
Richard A. Javick
FMC Corporation
P.O. Box 8
Princeton, NJ  08540

Larry D. Johnson
USEPA
IERL/RTP MD62
Research Triangle Park, NC  27711

L. H. Keith
Radian Corporation
P.O. Box 9948
Austin, TX  78766

Dan Lent
USEPA - MDSD
WH - 553
401 M St. SW
Washington, D.C.  20460
                                149
Deborah L. Leoris
EGD Sample Control Center
Viar  & Co.,  Inc.
P.O. Box 1407
Alexandria, VA  22313

Bernard S. Maccabe
Carborundum Company
P.O. Box 1054
Niagara Falls, NY  14302

David R. Marrs
Standard Oil Company
444 Warrensville Road
Cleveland, OH  44128

John G. Michalovic
Calspan Corporation
P.O. Box 400
4455 Genesee St.
Buffalo, NY  14225

M. L. Moberg
ARL, Inc.
160 Taylor St.
Monrovia, CA  91016

Neil Moseman
Energy Resources Co.
135 Alewife Brook Parkway
Cambridge, Mass 02138

James E. Norris
CIBA - Geigy Corporation
P.O. Box 113
McIntosh, AL  36553

Dewey J. Northington
West Coast Technical Service Co.
17605 Fabrica Way, Suite D
Cerritos, CA  90701

R.G. Oldham
Radian Corporation
P.O. Box 9948
Austin, TX   78766

Ms. Virginia M. Olsavicky
Eastman Kodak Co.
Kodak Park, Building  34
Rochester, NY   14650

W. M. Ollison
American Petroleum  Institute
2101 L St. NW
Washington,  D.C. 20034

-------
William B. Prescott
American Cyanamid Co.
W. Main St.
Bound Brook, NJ  08805

John E. Riley
USEPA - EGD
WH - 552
401 M St. SW
Washington, D.C.  20460

Leon E. Rubin
Polaroid Corporation
750 - Main St. -3D
Cambridge, MA  02139

Dale R. Rushneck
Jacobs Environmental Group
373 S. Fair Oaks Avenue
Pasadena, CA  91105

James F. Ryan
Gulf South Research Institute
P.O. Box 26518
New Orleans, LA  70186

Alexander Schultheis
Hamilton Standard
Airport Road
Windsor Locks, CT  06096
Mail Stop 1A-2-4

Don Sharp
Monsanto Research Corporation
Station B, Box 8
Dayton, OH  45407

Art Shattuck
USEPA - EGD
WH - 552
401 M St. SW
Washington, D.C.  20460

James S. Smith
Allied Chemical Corporation
P.O. Box 1021R
Columbia Road and Park Avenue
Morristown, NJ  07960

George H. Stanko Jr.
Shell Development Co.
Box 1380
Houston, TX  77001
James L. Stauffer
Arthur D. Little, Inc.
Acorn Park (15/323B)
Cambridge, MA  02140

Warren C. Steele
Foremost-Mckesson, Inc.
Foremost Research Center
6363 Clark Avenue
Dublin, CA  94566

Barry A. Stephenson
Stewart Laboratories, Inc.
5815 Middlebrook Pike
Knoxville, TN 37921

Garry E. Stigal
USEPA - EGD
WH - 552
401 M St. SW
Washington, D.C.  20460

Paul Sterch
Burns & Roe
P.O. Box 663
Paramus, NJ  07652

Robert F. Stubbeman
Celanese Chemical Co., Inc.
P.O. Box 9077
Corpus Christi, TX  78408

John H. Taylor
Jacobs Environmental Group
373 S. Fair Oaks Avenue
Pasadena, CA  91105

Kathleen E. Thrun
Arthur D. Little, Inc.
Acorn Park (15/304)
Cambridge, MA  02140

W.F. Tully
Union Carbide Corp. R & D
770-316
P.O. Box 8361
S. Charleston, W. VA  25303

Dallas Wait
Energy Resources Co., Inc.
185 Alewife Brook Pkwy.
Cambridge, Mass.  02138
                                150

-------