EPA-600/3-83-050
PB83-253534
Love Canal Monitoring
Program. Volume I
GCA Corp., Bedford, MA. GCA Technology Div.
Prepared for
Environmental Monitoring Systems Lab.
Research Triangle Park, NC
Jul 83
U.S. Department of Commerce
National Technical Information Service
-------
TECHNICAL REPORT DATA
(Please read instructions on the reverse before completing)
1. REPORT NO.
EPA-600/3-83-050
3. RECIPIENT'S ACCESSION NO.
4. TITLE AND SUBTITLE
Love Canal Monitoring Program - Final Report:
Volume I.
5. REPORT DATE
July 1983
6. PERFORMING ORGANIZATION CODE
7. AUTHOR(S)
8. PERFORMING ORGANIZATION REPORT NO.
GCA Corporation
GCA/Technology Division
9. PERFORMING ORGANIZATION NAME AND ADDRESS
GCA Corporation
Technology Division
213 Burlington Road
Bedford, Massachusetts 01730
10. PROGRAM ELEMENT NO.
11. CONTRACT/GRANT NO.
68-02-3168
12. SPONSORING AGENCY NAME AND ADDRESS
13. TYPE OF REPORT AND PERIOD COVERED
Environmental Monitoring Systems Laboratory
Environmental Protection Agency
Research Triangle Park, N. C. 27711
14. SPONSORING AGENCY CODE
EPA/600/08
15. SUPPLEMENTARY NOTES
16. ABSTRACT
This report summarizes the prime contractor activities during the monitoring phase
of the Love Canal project. Since GCA Corporation was only responsible for data
collection, no analytical results appear in this report. The program involved a
multifaceted sampling and analytical effort designed to detect and quantify a variety
of trace metals, volatile organics, pesticides and other compounds in soil, sediment,
air, biota, and water samples. The principal purpose of these activities was to
provide data with which EPA could assess the extent of environmental contamination
in the Love Canal Area. After the area declared a National Emergency was extended on
May 21, 1980, from the homes directly surrounding the Love Canal dumpsite to a more
general area, it was determined that the overall exposure of residents had to
be established as quickly as possible. The program, therefore, was on an extremely
tight schedule with field sampling activities to be completed by October 31, 1980.
GCA organized its efforts into seven technical elements, each of which is
discussed.
17. KEY WORDS AND DOCUMENT ANALYSIS
a. DESCRIPTORS
b. IDENTIFIERS/OPEN ENDED TERMS
c. COSATI Field/Group
18. DISTRIBUTION STATEMENT
Release unlimited
19. SECURITY CLASS (This Report)
UNCLASSIFIED
21. NO. OF PAGES
217
20. SECURITY CLASS (This Page)
UNCLASSIFIED
22. PRICE
EPA Form 2220-1 (Rev. 4-77) PREVIOUS EDITION IS OBSOLETE.
-------
EPA-600/3-83-050
July 1983
LOVE CANAL MONITORING PROGRAM—FINAL REPORT:
Volume 1
by
GCA Corporation
GCA/Technology Division
Bedford, MA 01730
EPA Contract 68-02-3168
EPA Project Officer
Steven M. Bromberg
ENVIRONMENTAL MONITORING SYSTEMS LABORATORY
OFFICE OF RESEARCH AND DEVELOPMENT
U.S. ENVIRONMENTAL PROTECTION AGENCY
RESEARCH TRIANGLE PARK, NC 27711
NATIONAL TECHNICAL INFORMATION SERVICE
U.S. DEPARTMENT OF COMMERCE
-------
NOTICE
This document has been reviewed in accordance with
U.S. Environmental Protection Agency policy and
approved for publication. Mention of trade names
or commercial products does not constitute endorsement or recommendation for use.
-------
CONTENTS
Figures iv
Tables v
Exhibits vii
1. Introduction 1
Health and Safety 3
Sampling . 3
Analysis 4
Geotechnical 4
Sample Bank 5
QA/QC 6
2. Health and Safety 8
Medical Screening 9
Safety and Health Training 11
Training Laboratory Personnel 15
Sampling, Transport and Shipping Procedures 15
3. Quality Assurance/Quality Control . 16
Quality Assurance Objectives 16
QA Project Plan 17
Requirements for Subcontractors' QA Plans. ....... 22
GCA'S QA/QC Organization 24
Sample Collection and Handling QA/QC 25
Geotechnical Study QA/QC 28
Sample Analysis QA/QC 29
Data Management System QA/QC 41
Corrective Action System .... 41
Document Control/Chain-of-Custody Procedures 43
4. Sampling. 46
5. Geotechnical. 57
General 57
Scope of Work 57
Subcontractor Operations 65
Role of GCA 67
6. Analysis 70
Introduction 70
Subcontractor Selection 71
Program Management 84
Additional Related Activities 95
7. Sample Bank 98
8. Data Management ...... 102
Data Handling System 102
Coding Forms and Procedures 113
Raw Data Processing. .. ......... 134
-------
CONTENTS (continued)
Data Verification 161
Data Validation 194
QA/QC Programs 197
Appendices
A. Supporting Information on Analytical Activities A-l
B. Sampling and Analytical Internal QC Checks B-l
C. Software Documentation C-l
D. Edits and Checks Performed by Edit Programs D-l
E. Validation E-l
F. Data Reporting Forms F-l
G. Coding Manuals. G-l
H. Computer Reference File Listings H-l
I. Listings of GCA Love Canal Software I-1
iii
-------
FIGURES
Number Page
1.1 GCA project organization and responsibility ..... 2
5.1 Drillsite information form 68
7.1 Sample page from master log 99
8.1 Data handling system flow diagram 103
8.2 Actual data processing schedule 112
8.3 Love Canal data reporting form log 135
8.4 Data editing process 143
8.5 Final raw data processing flow 146
8.6 Love Canal tape log 148
8.7 Love Canal Study log of raw data reports sent to laboratories. 152
8.8 Verification of sample data 162
8.9 Verification of analysis data and generation of verified files 184
8.10 Verification correction process 193
8.11 First validation: January 13, 1981 196
8.12 Validation process for March, June, and November 1981 198
8.13 Data flow for external QC accuracy report 201
8.14 Data flow for intralab precision report for laboratory-
generated (Z) samples 203
8.15 Data flow for intralab precision report for duplicates
supplied by Sample Bank 205
8.16 Data flow for interlab precision report for triplicates
supplied by Sample Bank 206
-------
TABLES
Number
3.1 Appendix B—Section Headings 20
3.2 Subcontractor QA Plan QA/QC Measures 23
3.3 Sampling QA/QC Measures 26
3.4 Analytical QA/QC Measures 30
3.5 Cross-Reference to Analytical Internal QC Check Tables ... 32
3.6 Performance Evaluation Samples Used for Love Canal Study. . . 34
3.7 External QC Check Samples Used for Love Canal Study 36
3.8 Data Management QA/QC Measures 42
4.1 Technical Evaluation Criteria (GCA 1-619-026-222-001) .... 47
4.2 Summary of Love Canal sampling activities 54
4.3 Summary of Daily Sampling Activity 55
5.1 Technical Evaluation Criteria: Drilling Program 58
5.2 Technical Evaluation Criteria: Supervisory Geologist
Activities, Geophysical Surveys, and Ground Water Modeling. 59
5.3 Prospective Bidder's List Geohydrology Subcontracts Love
Canal 61
6.1 Problem Compounds ...................... 74
6.2 Prospective Bidder's List ..... 77
6.3 Technical Evaluation Criteria 81
6.4 Analytical Program Configuration 83
-------
TABLES (continued)
Number Page
8.1 Timetable for Raw Data Flow 111
8.2 Love Canal Data Reporting Forms 115
8.3 Reserve List of Sample ID Numbers for Samples Generated
Internally by Analytical Laboratories . 130
8.4 Growth of Raw Data Master File (In Number of Card-Image
Records) 149
8.5 KM Reports and Their Uses 155
8.6 QC Samples With Analysis Data 157
8.7 QC Samples With No Analysis Data 159
8.8 Description of Love Canal Verification Action Forms ..... 177
8.9 Timetable of Verification Data Processing 181
8.10 Parameters of VERED Program Review 185
8.11 Conversion of Method Code 6250W 186
8.12 Format of Verified Data Records 188
8.13 Samples Not Verified 190
8.14 Schedule of Verification Corrections. .... 194
8.15 QA/QC Programs Run on Validated Data 200
vi
-------
EXHIBITS
Number Page
8.1 Sample information on raw data listing sorted by lab 165
8.2 Page from Love Canal verification form log 167
8.3 Letter to verifying laboratories 168
8.4 Letter to nonverifying laboratories 169
8.5 Log sheet used by Verification Coordinator during raw data
list review 171
8.6 Change or addition to analysis in block 179
8.7 Verification data processing log kept by Verification
Coordinator 183
vii
-------
SECTION 1
INTRODUCTION
The purpose of the Love Canal Monitoring Program was to evaluate
pollutant levels present in the Love Canal Area of Niagara Falls, New York.
Environmental Monitoring Systems Laboratory (EMSL)/Las Vegas, Environmental
Monitoring Systems Laboratory (EMSL)/Research Triangle Park, Environmental
Monitoring and Support Laboratory (EMSL)/Cincinnati, Health Effects Research
Laboratory (HERL)/Research Triangle Park, and the Robert S. Kerr Environmental
Research Laboratory (ERL)/Ada, Oklahoma, collaborated on the design of the
study, with EMSL/RTP assigned as the coordinating EPA laboratory. Events
surrounding the Love Canal situation during the early summer of 1980 made the
timely completion of the entire monitoring program of highest importance
within the Agency. Accordingly, GCA/Technology Division was issued a task on
July 9, 1980 to assist EPA in successfully completing the Love Canal
Monitoring Program within a very restrictive time frame. GCA was charged with
the responsibility of managing the program and ensuring its timely completion
in a technically acceptable manner through the use of selected subcontractors.
Each of the EPA laboratories was to maintain a high level of involvement
throughout the program, providing technical advice, undertaking quality
control activities, and determining the validity of the data generated.
The program involved a multifaceted sampling and analytical effort
designed to detect and quantify a variety of trace metals,* volatile organics,
pesticides and other compounds in soil, sediment, air, biota, and water
samples. The principal purpose of these activities was to provide data with
which EPA could assess the extent of environmental contamination in the Love
Canal Area. After the area declared a National Emergency was extended on May 21,
1980, from the homes directly surrounding the Love Canal dump site to a more
general area, it was determined that the overall exposure of residents had to
be established as quickly as possible. The program, therefore,
was on an extremely tight schedule with field sampling activities to be
completed by October 31, 1980.
GCA organized its efforts into seven technical elements as shown in
Figure 1.1, Project Organization Chart. Although the various EPA laboratories
had designed the overall study and provided detailed protocols for many of the
*Refer to Appendix A (pp. A-2) for compound hit lists and explanation of terms
used for compound classes.
-------
[Figure 1.1. GCA project organization and responsibility.]
-------
program elements, it was GCA's responsibility to coordinate and implement
these procedures into a manageable system. The following paragraphs very
briefly describe the activities undertaken by GCA in each of the seven program
elements.
HEALTH AND SAFETY
A comprehensive safety and health plan was developed which specified
procedures to help protect all personnel engaged in the Love Canal Monitoring
Project. The complexity and variety of chemical agents thought to be present
at the Love Canal made it necessary to design a highly flexible yet
conservative program of worker protection. The advice of consultants was
sought in the design of this program, and safety and health guidelines
provided by the Federal Government and the State of New York were incorporated.
The basic elements of the safety and health program consisted of a
preassignment medical examination of each employee who was engaged in sampling
in the field or in the handling of samples in the field prior to their
shipment to subcontractor laboratories. The medical examinations and
subsequent consulting services were provided by physicians who are
Board-certified in industrial medicine.
Individuals involved in field sampling activities were trained in the use
of safety and protective gear appropriate to the activity. GCA provided two
industrial hygienists, and the sampling subcontractor, Geomet Technologies,
Inc., also provided two industrial hygienists to review and monitor procedures
and sampling activities at the Love Canal site. The Geomet industrial
hygienists were on site during any sampling activities. A series of Safety
and Health Directives were issued to address procedures to be followed during
the several sampling activities. Laboratory safety procedures in force in the
subcontractors' laboratories were reviewed to ensure that the levels of
protection provided were consistent with the hazards potentially present in
the samples they analyzed.
SAMPLING
The overall program design and the specific sampling protocols had been
developed by EPA. It was GCA's responsibility to acquire via competitive bids
a qualified subcontractor to collect the multimedia samples for subsequent
analysis for volatile organics, semivolatile organics, pesticides, inorganics
and radioactivity.
Air samples were collected in the living areas of homes, in basements and
outdoors using Tenax, polyurethane foam (PUF) and hi-vols. Water samples were
collected from drinking water supplies, sewers, basement sumps, rivers,
streams and ground water monitoring wells. Soil and sediment samples were
collected throughout the study area. A biota sampling campaign also resulted
in the collection of animals and plant materials.
GCA managed the sampling activities from a field office established in a
vacant home in the immediate vicinity of Love Canal. This office, staffed
with knowledgeable technical personnel and support/secretarial capabilities,
-------
served as the communications center for all field activities. All
individuals involved with the field operations were required to report to the
field office at the beginning and end of each day, not only as a management
control but also to ensure that the latest technical and safety directives
were given to all affected parties.
ANALYSIS
As with sample collection, the analytical protocols were provided by
EPA. GCA had to procure via competitive bids the appropriate number of
subcontractor laboratories to provide the analysis of the collected samples.
In addition, subcontractor laboratory services were procured for the
preparation of the Tenax and PUF cartridges and the preparation of Tenax
cartridge standards and spikes used in the air monitoring program. The
HERL/RTP prepared the QC standards and spikes for the PUF cartridges.
Tenax samples were analyzed for volatile organic compounds; PUFs were
analyzed for pesticides and related compounds, and hi-vol filters were
analyzed for metals. Specific compounds and elements of these groups are
listed in Appendix A to this report.
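Although this summary does not reproduce the data-reduction step, the airborne concentration reported from a sorbent cartridge or filter sample is, in general terms, the analyte mass recovered in the laboratory divided by the volume of air drawn through the collection medium in the field (a generic relation, not a protocol quoted from the QA Plan):

$$C = \frac{m}{V} = \frac{m}{\bar{Q}\,t}$$

where $m$ is the recovered analyte mass, $\bar{Q}$ the average sampling flow rate, and $t$ the sampling duration. This dependence on $\bar{Q}$ is why the pump flow-rate audits described in Section 3 bear directly on the reported air concentrations.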
The various types of water samples were analyzed for volatile organics,
semivolatile organics, pesticides, metals, and the anions fluoride and
nitrate. Ground water samples were also analyzed for total organic carbon
(TOC) and total organic halides (TOX). The pH and conductivity of ground
water samples were also measured.
Soil and sediment samples were also analyzed for volatile organics,
semivolatile organics, pesticides and metals. Anions, TOC and TOX were not
measured in these sample types. Because no widely-accepted methods existed
for the analysis of semivolatiles in these media, a methods evaluation was
conducted on 48 samples to determine which of three available procedures was
most suitable. The analytical subcontractors used the method determined by
this study for the remainder of the program.
Various biota samples were analyzed for organic and inorganic species on
the Love Canal Monitoring List. Biological materials, specifically mice,
worms, and crayfish were analyzed for semivolatile organics and pesticides.
Potatoes and oatmeal placed at the sampling sites were designated as
"foodstuffs" and analyzed for volatile organic compounds. Metals were
measured in vegetation and in hair.
GEOTECHNICAL
The geotechnical portion of the Love Canal Study was designed principally
by the EPA's Robert S. Kerr Laboratory in Ada, Oklahoma. It consisted of the
following three activities:
• Geophysical measurements such as ground penetrating radar,
electromagnetic conductivity and magnetometry to determine
subsurface characteristics.
-------
• Hydrogeological investigation which consisted of the installation of
174 monitoring wells.
• Ground water modeling which predicted ground water movement in each
aquifer (see the note below).
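The model equations are not reproduced in this summary. As a generic illustration only (the formulation actually used was specified by ERL/Ada and is not quoted here), the ground water seepage velocity that such models propagate through an aquifer is commonly obtained from Darcy's law:

$$v = -\frac{K}{n_e}\,\frac{dh}{dl}$$

where $K$ is the hydraulic conductivity, $dh/dl$ the hydraulic gradient measured between monitoring wells, and $n_e$ the effective porosity of the aquifer material.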
GCA's responsibility was to obtain the necessary subcontractor skills via
competitive bids and to ensure the timely completion of the work. ERL/Ada
staff were on site to provide technical advice during the field program.
SAMPLE BANK
The sample bank was set up and staffed with GCA personnel during the
field program. The purpose of the sample bank was to oversee all aspects of
sample handling, chain of custody procedures and document control. The Sample
Bank Manager served as GCA's Document Control Officer onsite and directed the
following activities (a brief illustration of the custody-verification step
follows the list):
• dispensing custody records
• receiving precleaned air collection media
• receiving external QC samples and calibration standards
• dispensing sample tags
• receiving collected samples
• verifying custody record against collected sample before accepting
custody
• logging accepted samples into Master Log
• maintaining internal custody of samples while in the Sample Bank
• maintaining document control of sample tags and custody records
• regrouping collected samples for shipment to analytical labs
• inserting external QC samples into the analytical shipments
• packing samples appropriately for shipment
• transferring custody of the sample shipment to the carrier
• notifying the analytical laboratory of shipment
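The custody-verification and Master Log steps in the list above can be sketched in code. The short Python sketch below is purely illustrative and is not part of the 1980 data system; the sample ID format, the field names, and the MasterLog class are all hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class CustodyRecord:
    """Field custody record accompanying a collected sample (hypothetical fields)."""
    sample_id: str      # tag number assigned in the field
    medium: str         # e.g., "soil", "ground water", "Tenax cartridge"
    collector: str
    transfers: list = field(default_factory=list)  # (from, to, timestamp) entries


class MasterLog:
    """Minimal stand-in for the Sample Bank Master Log."""

    def __init__(self):
        self.entries = {}

    def accept_sample(self, record: CustodyRecord, tag_on_container: str) -> bool:
        # Verify the custody record against the tag on the collected sample
        # before accepting custody; a mismatch is rejected for corrective action.
        if record.sample_id != tag_on_container:
            return False
        record.transfers.append(
            ("field team", "Sample Bank", datetime.now().isoformat(timespec="seconds"))
        )
        self.entries[record.sample_id] = record
        return True


# Example: logging one collected sample into the Master Log.
log = MasterLog()
rec = CustodyRecord(sample_id="LC-0001", medium="ground water", collector="field crew")
assert log.accept_sample(rec, tag_on_container="LC-0001")
```

The essential design point is that custody is refused, and a corrective action raised, whenever the custody record and the physical sample tag disagree.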
The selected Sample Bank facilities were appropriate for this
multifaceted operation; they included a 400 square foot shipping and receiving
area, 1,240 square feet of fully equipped laboratory space, and 230 square
-------
feet of office space at the Bell Aerospace Textron plant in Niagara Falls, New
York. A 35-foot refrigerated trailer was placed adjacent to the entrance to
provide 245 square feet of refrigerated storage area at the Bank; a
refrigerated truck was also leased to transport samples from their collection
area to the Bank and from the Bank to Buffalo Airport for shipment to
analytical laboratories.
The laboratory area included two 6-foot fume hoods, sinks, refrigerators,
a freezer, and adequate bench space. The air handling system was modified to
ensure 100 percent fresh air intake and no air return through the system to
avoid contamination of collected samples. The office area included standard
office furnishings and a copying machine.
The entire area, including the refrigerated trailer, had access
restricted to GCA employees with keys; it was accessible 24 hours a day, 7
days a week. Bell Aerospace security guards maintained security of the whole
complex at all times. The file cabinets and storage containers for documents
and samples were equipped with security bars and combination locks.
QA/QC
The objective of the GCA/Technology Division contract efforts on this
program was to collect representative multimedia samples which could then be
analyzed in the required time frame and by the best available methods. In a
sense, most of GCA's work on the Love Canal Study was quality
assurance—managing so as to achieve high quality results.
An important objective of the quality program was to alert the
subcontractors to the importance of high quality work on their part and to
demonstrate continuing concern for quality throughout the program. Measures
taken by GCA to accomplish this included the provision of: general QA/QC
requirements in the Requests for Proposal (RFP) for all technical work to be
subcontracted; EPA's specific internal QC requirements in each analytical
subcontract document; the requirement for subcontractor QA Plans; QC
Coordinators with expertise in each technical area; written sampling and
analysis procedures including QC requirements and planned corrective actions;
Performance Evaluation and QC check samples for analytical work and, perhaps
most important, continuing communication between GCA and its subcontractors.
The QA goals of the program were stated as follows:
• Collect Representative Samples
• Maintain Chain of Custody of Samples
• Analyze Samples using Appropriate Procedures
• Achieve Comparable Results from Different Analytical Laboratories
-------
• Estimate Precision and Accuracy Achieved
• Manage and Store Sampling and Analysis Data Generated
• Report Analytical Results in a Useable Format
• Compile a Complete Project Document Inventory
A comprehensive four volume quality assurance plan was written and then
implemented by GCA in pursuit of the above stated goals and objectives.
The remainder of this report is organized around the seven program
elements just mentioned. Each succeeding section amplifies the activities
undertaken by GCA within these elements with much of the documentation being
provided as appendices. It should be noted that GCA has provided
documentation on several aspects of the Love Canal Monitoring Program in
previous submittals.* Generally, information presented in these earlier
documents, such as detailed sampling and analytical procedures, will not be
included in this summary report. However, program elements that have not to
date been documented in formal reports (e.g., Data Management System) will be
discussed in considerable detail in this Final Report.
*Previous submittals include the following documents: Quality Assurance Plan,
Love Canal Study; QA Plan Appendix A, Sampling Procedures; QA Plan Appendix B,
Analytical Procedures; QA Plan Appendix Q, Subcontractor QA Plans; and Love
Canal Study, QA/QC Summary Report, January 1982.
-------
SECTION 2
HEALTH AND SAFETY
GCA was responsible for developing a health and safety plan both for its
own employees and for providing minimum standards that each subcontractor
had to meet. Protocols were established covering field operations and
laboratory operations. Extensive use was made of the following documents in
preparing the Plan.
• "Enforcement Considerations for Evaluations of Uncontrolled
Hazardous Waste Disposal Sites by Contractors," National Enforcement
Investigations Center (NEIC), U.S. EPA, April 1980.
• "Safety Manual for Hazardous Waste Site Investigations," U.S. EPA,
September 1979.
• "Safety Plan, Procedures to be Observed During the Collection of
Soil Samples at the Love Canal Chemical Waste Dump," New York State
Department of Health.
The plan incorporated the rules, guidelines and recommended work
practices contained in the above-referenced material. All project personnel
were expected to follow the safety and health procedures set forth, referring
to the appended material for background information. In the event of doubt or
ambiguity, supervisors were to contact the GCA Safety and Health Officer for
resolution before proceeding.
It is clearly impossible to anticipate all specific safety and health
hazards beforehand, so all field and laboratory personnel must exercise common
sense and good judgment in their approach to a given situation. Expert advice
on these matters was available to all personnel through the Safety and Health
Officer, and through GCA's medical consultants.
Highlights of the Safety and Health program for the Love Canal Study
included:
• Consulting services of a physician with a specialty in occupational
medicine, were available through GCA/Technology Division.
• Field personnel underwent a physical examination both before and
after the field sampling phase of the Love Canal Project.
-------
• All field personnel who entered the Love Canal Site and who
collected or handled environmental samples participated in a
training program in the use of respirators, protective gear and in
specific safety procedures to be followed.
• As the various phases of the Sampling Tasks progressed, Safety and
Health Directives were issued at the GCA Office located at the Love
Canal Study Area, Colvin Boulevard, Niagara Falls, New York. A copy
of these Directives and any additions were issued to each GCA staff
member and to each member of the subcontractor teams at the
beginning of each day's work when that member signed-in at the GCA
Field Office.
• An Industrial Hygienist was on call at all times during sampling
activities at the Love Canal Site. Any questions that arose
concerning potential exposures to toxic vapors or other chemical
agents were directed to him. Any deviations from Safety and Health
procedures contained in the onsite Directives had to be cleared
through the Industrial Hygienist beforehand.
• Local Emergency Services (Fire Department, Hospital) were advised of
the sampling activities at the Love Canal Site and they were
prepared to deal with health-related emergencies.
• If flammable materials were encountered during any sampling
operations, the local Fire Department was to be contacted
immediately.
• Each drill rig was equipped with a Fire Extinguisher (10 lb, for
Class A, B, C fires), a First Aid kit ...
-------
Assessment of Health Status of Personnel Prior to Work
The preassignment medical examination will:
• Evaluate physical and psychological suitability of the worker for
the proposed work.
• Identify health factors which may disqualify or restrict a worker
from specific tasks under the project.
• Assess capability of candidate worker to wear the required
protective gear.
• Determine baseline or reference data to help evaluate the
significance of findings that may appear during subsequent medical
examination.
The details of the examination are left to the discretion of the
examining physician but should consist of a medical and occupational history,
a thorough physical examination with particular attention to the
cardiopulmonary systems, general physical fitness, skin, blood forming,
hepatic, renal and nervous systems. Tests should include chest x-ray,
electrocardiogram, ventilatory pulmonary function, urinalysis, a complete
blood count, liver function and a blood chemistry profile. Additional tests
and procedures may be indicated in the judgment of the examining physician,
based on knowledge of the work to be performed.
Copies of all medical reports associated with the Love Canal Study
Project must be provided to the prime contractor. These medical records will
be held in strict confidence by GCA and will not be released to anyone without
the written permission of the individual.
Support of Health of Personnel During Project
During the project the following will be provided by the prime contractor:
• First Aid equipment will be maintained at the GCA Field Office and
Emergency Medical Services (Fire Department, Police Department and
local Hospital) will be notified of activities at the Love Canal
Site.
• An Emergency Transport Plan will be explained to all field personnel
during the Training Session.
• A Medical Consultant with a specialty in occupational medicine will
be available for consultation at the Love Canal Site should health
problems arise during the course of sample collection and handling.
• An Industrial Hygienist Consultant will establish specific
procedures to be followed for each sample collection and handling
activity. Services will also be available to monitor any suspect
area for toxic chemical agents during the study.
10
-------
Evaluation and Care of Personnel in the Event of a Work-Related
Accident or Illness
Emergency medical care will be provided through a local hospital under
consultation with the Medical Consultant assigned to this project. The local
Poison Control Center will also provide services as required.
Monitoring of Personnel for Evidence of Post-Study Adverse Health Effects and
Determination of Their Suitability for Future Work Assignments of this Type
The purpose of post-study medical monitoring will be to:
• Assist in early detection of work-related health effects (from this
study and others that may be undertaken in the future) in project
personnel.
• Assess the health status of project personnel as to fitness and
suitability for future assignments of this nature.
Copies of the post-study medical examinations must be provided to the
prime contractor's medical consultant for recordkeeping purposes. These
medical records will be held in strict confidence by GCA and will not be
released to anyone without the written permission of the involved party.
SAFETY AND HEALTH TRAINING
Field Personnel Training
Before beginning work at the Love Canal Study Area, each field team
member must attend a Safety and Health Training Program. The Training
Sessions will be given at or near the Love Canal Site. Each session will be
approximately 1 day in length and will be conducted by staff trained in
industrial hygiene and safety. Topics to be discussed will include:
1. Purpose of Training:
• Ensure that regard for the health and safety of the employees
of other agencies, the public, and the environment is maximized.
• To comply with all laws, rules, and regulations to safeguard
the health and safety of all employees, the public, and the
environment.
• Increase the ability of employees to react responsibly and to
handle emergency situations in a safe manner.
2. General Field Safety Techniques:
• Availability of safety and health Consulting Services (medical,
Industrial Hygiene) and when and how to use them.
11
-------
• Responsibilities
- Site surveillance/observation/plan development,
- Restricted zones,
- Safe zones.
• Vehicles (cars, trucks, drill rigs)
- Inspection,
- Operation,
- Mandatory rules, regulations, and orientation,
- Decontamination.
• Hazardous Materials in the Field
- Hazards,
- Storage,
- Transportation (DOT requirements for common sample
preservatives, plus general "common sense" rules).
• Use of Field Equipment and Supplies
- Work Tools,
- Testing Equipment,
- Sampling Equipment.
• Working Practices
- Working Alone (buddy system),
- Isolated Areas,
- Streams, Rivers, Lakes,
- Hazardous Waste Sites.
• Work Limitations
- Prohibited Work Practices,
- Fatigue,
- Hours of Work.
12
-------
3. Personal Protective Equipment and Clothing:
• Respiratory Protection
- Selection,
- Fit,
- Donning and Use.
• Personal Protection Apparel
- Clothing (gloves, aprons, coveralls, etc.),
- Eye Protection,
- Foot Protection,
- Head Protection.
• Limitations of Clothing and Equipment
• Decontamination of Clothing and Equipment
• Disposal of Contaminated Clothing and Equipment
4. Emergency Help and Self-Rescue
• Principles of First Aid
- Restoration of Breathing,
- Control of Bleeding,
- Recognition and Treatment of Physical Shock,
- Open and Closed Wounds and Burns,
- Fractures and Dislocations,
- Transportation.
• Availability of Emergency Services
- Poison Control Center,
- Hospital and Ambulance Services,
- Local Fire and Police Departments.
13
-------
• How to Obtain Emergency Treatment in the Field
- How and When to File a Report of Accident/Incident
• Employee Compensation Benefits
5. Sampling Techniques
• Hazards of Sampling
• Amount of Samples
• Containers for Samples
• Field Tools
- Radioactivity,
- Explosivity,
- Other.
• Sample Security
• Packaging (DOT/EPA)
• Shipment (DOT/EPA)
6. Safety and Health Directives
• At the training session, the system of Safety and Health
Directives for the Love Canal Study will be explained. These
Directives will be posted at the GCA Field Office and will be
provided to all project team personnel. It is the
responsibility of each person to read and comply with
instructions given in these Directives.
The following STANDING ORDERS will be in force for the full period of the
Love Canal Monitoring project:
STANDING ORDERS
• No member of the project team may enter the Love Canal area without
signing in at the GCA Field Office beforehand.
- Each person must sign in, indicate the area of his work
(sector), and punch in his reporting time on the time clock
provided.
• There shall be no smoking, eating, drinking or chewing gum while in
the Love Canal area.
-------
• No open fires are permitted in the Love Canal area.
• No person may enter the fenced-in portion of the Love Canal site
alone or without the knowledge of his supervisor.
TRAINING LABORATORY PERSONNEL
Each Subcontractor providing analytical services to the Love Canal Study
must provide a copy of his Laboratory Safety Manual, or a description of the
in-house laboratory safety program before work begins on this project.
Additionally, each Subcontractor must provide to GCA/Technology Division the
name of the Laboratory Safety and Health Officer and the name of an
alternate. These individuals will coordinate safety and health matters
relating to the laboratory aspects of this project. Lists of chemical agents
which may be present in each sample category will be provided to the
Subcontractor Laboratory Safety Officer as they become available.
SAMPLING, TRANSPORT AND SHIPPING PROCEDURES
Sampling
It is anticipated that samples of waste water, airborne particulates,
soil, sediment and biological materials will be collected during the course of
this program. No samples will be drawn from drums, barrels or other
concentrated sources, with the single possible exception of the
unintentional puncturing of drums during drilling.
Safety and Health Directives will be issued for the sampling of all the
above media and for the special precautions that must be taken during drilling
operations. These Directives will be issued to each Subcontractor and to each
individual participating in the sampling activities at the Love Canal Site.
Individuals will receive these Directives during the Training Session at the
Site.
Transport
All samples collected under this project are to be taken to the GCA Field
Office located at Colvin Boulevard, Niagara Falls, New York and/or to the GCA
Sample Bank for preparation for shipment to appropriate laboratories. No
samples, specimens, or other materials may be removed from the site other than
those which will be transmitted to the Sample Bank or to designated disposal
areas. Samples will be transported to the Sample Bank only in approved
vehicles. Personal or rental cars cannot be used for this purpose. All
samples must be properly packaged following the sampling protocols specified
elsewhere. In addition, all samples must be placed in a suitable container
(e.g., a plastic-lined cardboard container) before transport.
Shipping
Shipping containers and labeling procedures will follow the protocol
established in the QA/QC Manual. Every precaution should be taken to prevent
container leakage and to keep sample residues from contaminating the exterior
of the vessel.
15
-------
SECTION 3
QUALITY ASSURANCE/QUALITY CONTROL
QUALITY ASSURANCE OBJECTIVES
As mentioned in the Introduction, most of GCA's work on the Love Canal
Study was, in a sense, quality assurance. The specific goals of the quality
assurance program were as follows:
• Collect Representative Samples
• Maintain Chain of Custody of Samples
• Analyze Samples Using Appropriate Procedures
• Achieve Comparable Results from Different Analytical Laboratories
• Estimate Precision and Accuracy Achieved
• Manage and Store Sampling and Analysis Data Generated
• Report Analytical Results in a Useable Format
• Compile a Complete Project Document Inventory
The EPA laboratories listed below had designed QA/QC programs for their
areas of responsibility in the Love Canal Study.
• EMSL, Research Triangle Park
• HERL, Research Triangle Park
• EMSL, Las Vegas
• ERL, Ada
• EMSL, Cincinnati
The QA/QC measures planned by these laboratories and included in their Love
Canal monitoring protocols were integrated into GCA's Quality Assurance
Project Plan for the Love Canal Study. The following section briefly
describes the Plan.
16
-------
QA PROJECT PLAN
The Love Canal Study QA Plan was tailored to the EPA's Office of Research
and Development May 1980 draft of "Guidelines and Specifications for Preparing
Quality Assurance Project Plans." Content requirements stated in that
document were met and the format used in the appended Model QA Plan was
followed. The Love Canal Study QA Plan was reviewed and approved by EPA's
Project Officer and the QA officers at each of the participating EPA
laboratories.
The major sections of the QA Plan are listed below:
Love Canal Study QA Plan—Major Sections
1.0 Project Description and Organization
2.0 Major Program Elements
3.0 QA Objectives
4.0 Document Control Procedures
5.0 Chain of Custody Procedures
6.0 Sampling Internal QC Checks
7.0 Analytical Internal QC Checks
8.0 Precision and Accuracy
9.0 Data Management System
10.0 System and Performance Audits
11.0 Corrective Action Procedures
Attachments: Subcontractors' QA Plan Format Suggestions
Appendix A: Sampling Procedures
Appendix B: Analytical Procedures
Appendix Q: Subcontractors' QA Plans
Appendices A, B and Q were contained in separate volumes and all four
volumes were distributed in ring-binders to enable easy revision of individual
sections as necessary. The separate volume format made it possible to provide
detailed operating procedures including EPA's required internal QC procedures
to the sampling and analysis subcontractors without giving them information on
planned external QC measures such as performance audits. All four volumes
were provided to EPA and GCA personnel.
17
t.,
-------
Overall Plan
The first volume provided an overview of the quality assurance measures
for the entire program and was not distributed to subcontractors. The
required sampling and analysis internal QC checks were presented in tabular
fashion, classified according to media sampled and pollutant class measured.
The replicate sampling and analysis scheme and the external QC samples used,
as well as the appropriate equations to statistically estimate the precision
and accuracy achieved, were documented in considerable detail in the overall
plan.
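The equations themselves are documented in the QA Plan rather than reproduced in this report. As a point of reference only (standard forms, assumed here because the Plan text is not quoted), precision for duplicate pairs, precision for triplicate or larger replicate sets, and accuracy for external QC or spiked samples are conventionally expressed as

$$\mathrm{RPD} = \frac{|x_1 - x_2|}{(x_1 + x_2)/2}\times 100, \qquad \mathrm{RSD} = \frac{s}{\bar{x}}\times 100, \qquad \%R = \frac{x_{\mathrm{measured}}}{x_{\mathrm{true}}}\times 100,$$

where $x_1$ and $x_2$ are duplicate results, $s$ and $\bar{x}$ are the standard deviation and mean of a replicate set, and $x_{\mathrm{true}}$ is the known value of a performance evaluation or spiked sample.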
Chain of Custody and Document Control procedures specific to the Love
Canal Study were developed following NEIC guidelines. The QA Plan sections on
these topics were reviewed by NEIC staff members and their comments
incorporated. These sections were provided to each subcontractor at the start
of his subcontract activity so that uniform procedures would be followed; they
were also included in Appendices A and B.
Appendix A: Sampling Procedures
This set of detailed sample collection procedures was compiled by GCA and
provided to the sampling subcontractor and the supervisory geologist before
any field sampling started. The manual was organized so that a specific,
stand-alone procedure existed for each type of sample to be collected; the
field sampler could remove and use only the procedures he needed. The
appropriate type of sample container and its required cleaning, the volume or
weight of sample to be collected, step by step collection procedures, field
preservation and internal QC requirements, and chain of custody requirements
were included in each procedure.
The complexity of the sampling program is shown by the major sections
contained in Appendix A and listed below. Each section contained an overview
for that media and then the individual procedures for sample collection. The
procedures were dictated both by the media sampled and the analyses to be
performed. It should be noted that different types of water required
different sampling procedures due to equipment needs, safety considerations,
preservation requirements and analytical needs.
Appendix A—Major Sections
1.0 Air Sampling
2.0 Pollutant Transport Air Sampling
3.0 Residential Water Sampling from Sumps
4.0 Drinking Water Sampling
5.0 Sewer and Drain Water Sampling
6.0 Surface Water Sampling
18
-------
7.0 Ground Water and Well Core Sampling
8.0 Soil Sampling
9.0 Sediment Sampling
10.0 Biomonitoring
11.0 Field Sample Data Forms
12.0 Document Control/Chain of Custody Procedures
Procedures for quick approval and documentation of necessary changes in
the field were identified early in the program. These included identifying
the necessary approvals (within the subcontractor, GCA and EPA project teams),
documentation of the change request and approval in Change Logs, and
incorporation of the changed procedures in revisions to Appendix A.
Appendix A was a working document, revised to meet conditions found in
the field. It has been thoroughly field-tested so that the third and final
revision represents a valuable resource for investigation of hazardous waste
sites in general.
Appendix B: Analytical Procedures
The methodology to be used in analyzing samples collected during the Love
Canal Study was provided to the analytical subcontractors in two different
formats before it physically became Appendix B to the QA Plan. The analysis
protocols were provided as appendices to the RFP for Sample Analysis Services
in Support of the Love Canal Study so that accurate cost estimates could be
made by the firms bidding on the RFP. Also, the analytical protocols specific
to each subcontractor formed part of that subcontract document so that
procedures to be followed, including EPA's internal QC requirements, were
clearly spelled out before any samples were analyzed.
GCA prepared the analytical protocol documents, researching technical
questions and, if necessary, revising the subcontract protocols. These same
analytical protocols were grouped somewhat differently in Appendix B than in
the subcontract documents to provide the methodology and QC requirements for
the entire Love Canal Study.
The major section and subsection headings are given in Table 3.1 to
indicate the content and organization of the analytical procedures manual. As
with the sampling procedures manual, each analytical procedure was a complete,
specific protocol that could be removed and used by the analyst. Quality
Control Procedures formed a subsection of every analytical protocol and were
presented in both text and tabular formats.
19
-------
TABLE 3.1. APPENDIX B—SECTION HEADINGS
1.0 Air Analysis
1.1 Overview
1.2 Volatile Organic Compounds on Tenax
1.3 Pesticides and Related Compounds on Polyurethane
Foam Plugs
1.4 Metals
1.5 Dioxins
2.0 Water Analysis
2.1 Overview
2.2 Volatile Organics
2.3 Semivolatile Organics
2.4 Pesticides
2.5 Metals
2.6 Anions
2.7 Total Organic Carbon
2.8 Total Organic Halogens
2.9 Dioxins
2.10 Radioactivity
2.11 pH
2.12 Conductivity
3.0 Soil and Sediment Analysis
3.1 Overview
3.2 Volatiles
3.3 Semivolatiles
3.4 Pesticides
3.5 Metals
3.6 Dioxins
3.7 Radioactivity
(continued)
20
-------
TABLE 3.1 (continued)
4.0 Biota Analysis
4.1 Overview
4.2 Semivolatiles and Pesticides in Animal Tissue
4.3 Volatile Organics in Foodstuffs
4.4 Metals in Hair
4.5 Metals in Vegetation
4.6 Interferon in Mouse Cells
5.0 Data Reporting Forms
5.1 Overview
6.0 Chain-of-Custody Procedures
6.1 Document Control Procedures
6.2 Chain of Custody Procedures
7.0 Reference Material
7.1 Method 603 - Acrolein, Acrylonitrile
7.2 Method 605 - Benzidines
7.3 Method 608 - Organochlorine Pesticides and PCBs
7.4 Method 624 - Purgeables
7.5 Method 625 - Base/Neutrals, Acids and Pesticides
7.6 Dioxins
7.7 Qualitative Analysis by GC/MS
7.8 Metals by Atomic Absorption Spectroscopy
21
-------
Appendix Q: Subcontractors' QA Plans
To establish GCA's and EPA's concern for quality early in the program,
each subcontractor was required to submit to GCA a QA Plan specific to his
project activity. GCA provided suggested QA Plan outlines tailored to
sampling, analysis or geotechnical work to each subcontractor with his
subcontract document or shortly thereafter; the detailed sections on Document
Control and Chain of Custody procedures were also provided. Analytical
subcontract documents included analysis protocols containing EPA's internal QC
requirements.
The submitted QA Plans were reviewed with particular attention to these
QC requirements; revisions were requested when necessary and the plans were
conditionally approved by GCA subject to EPA's final approval. The plans were
then grouped according to project activity and incorporated in their entirety
in Appendix Q. This volume was distributed to EPA; the EPA Project Officer
and the QA officers at the participating EPA laboratories approved the
subcontractors' QA plans through their approval of the GCA QA Project Plan.
REQUIREMENTS FOR SUBCONTRACTORS' QA PLANS
Subcontractors were required to submit QA plans to demonstrate their
understanding of, and capability for performing, quality work on their
specific project tasks. It was not expected that different subcontractors
performing the same task would submit identical QA plans but, rather, that
they would meet the minimum QC requirements for that task. This would ensure
that all data generated would be of at least that level of quality. It was
felt that no problem would be caused by subcontractors who exceeded the
minimum requirements since their data would be at or above the required level.
The measures taken to ensure the adequacy and completeness of all QA
Plans, and the comparability of plans prepared for the same project activity
are listed in Table 3.2. The subcontractors were grouped according to their
project activity as follows:
• Sampling/Analysis Preparation
• Sample Collection
• Sample Analysis
• Geotechnical Study
Subcontractor QA plans were to be submitted to GCA within 5 days of
receipt of executed subcontracts; this rapid turnaround was required by EPA
and GCA in an effort to establish a firm QA/QC program prior to the start of
sample collection and analysis. However, not all of the specific information
requested in the QA Coordinator's suggested outlines was available during this
time period. This was true primarily of the control limits and corrective
actions to be used for the various analytical internal QC activities.
22
-------
TABLE 3.2. SUBCONTRACTOR QA PLAN QA/QC MEASURES
Suggested outlines specific to sampling, analysis, geotechnical project
activity provided
Internal QC requirements provided
Document control heading format provided for QA Plans
Overall Document Control system framework provided
- Subcontractor described internal system operation and identified
Document Control Officer
Overall Chain of Custody system framework provided
- Subcontractor described internal system operation and identified
Sample Custodian
Standardized Data Reporting Forms provided
- Subcontractor described internal data review/reporting practices
Corrective Action Requirements
- Subcontractor described internal correction action system and
identified individual(s) responsible for initiating action
Review of submitted plans by GCA technical staff and/or QA Manager
23
-------
As noted earlier, the Sampling Procedures Manual was compiled by GCA and
provided to the sampling and geotechnical subcontractors before any field
samples were collected. This provided a basic QA Plan for field sampling; ;
subcontractors were to provide calibration and maintenance procedures and
operational checks for their equipment, state precision and accuracy goals
when possible and identify individuals responsible for various project
elements.
In the case of analytical work, subcontract documents included analytical
and QC procedures obtained from EPA, but a number of these protocols were
missing specific control limits for surrogate compound and LCS recovery;
amended protocols were distributed. Appendix B to the QA Project Plan was the
final document on analytical and internal QC procedures. It contained all
internal QC activities, control limits as stated by EPA and suggested
corrective actions; the sections relevant to each subcontractor's task were
distributed before field sample analysis commenced. Appendix B provided a
basic, comparable QA Plan for analytical work with individual subcontractors
providing the same types of information requested of sampling subcontractors.
GCA'S QA/QC ORGANIZATION
GCA, through its QA Coordinator and QC Coordinators for major program
elements, provided liaison between EPA and the subcontractors working on the
Love Canal Study. GCA served as a conduit through which information on EPA's
QA/QC requirements and questions on the implementation of these requirements
could flow. Each subcontractor was responsible for implementing the QA/QC
requirements appropriate to his project tasks.
One of GCA's prime concerns was providing information in a uniform manner
to all subcontractors engaged in the same project activity. To accomplish
this, GCA assigned staff members with appropriate technical expertise to serve
as QC Coordinators for the following program elements:
• Sample Collection and Handling
• Geotechnical Study
• Sample Analysis
• Data Management System
The Sampling and Geotechnical Study QC Coordinators worked principally onsite
at or near Love Canal in Niagara Falls, NY; the Analytical and Data Management
QC Coordinators and the QA Coordinator worked principally at GCA's Bedford, MA
facility.
This section provides a brief overview of GCA's QA/QC activities in the
program areas listed above and, in addition, addresses Corrective Action,
Document Control and Chain-of-Custody Procedures.
24
-------
SAMPLE COLLECTION AND HANDLING QA/QC
The onsite GCA Field Office and Sample Bank were significant QA measures
designed to aid in the implementation of sampling QC. Their technical staff
served as sampling QC Coordinators, enabling the rapid resolution of problems
as they came up in the field. Table 3.3 lists the QA/QC activities of the
Field Office and Sample Bank. Section 4 of this report provides more detail
on these operations.
The most significant activities of the QA Coordinator related to the
sampling effort were the following:
• coordinating preparation of the sampling procedures manual
(Appendix A)
• establishing uniform procedures for field blanks, container cleanup
and preservation of samples
• identifying the approvals needed to change a sampling procedure
• setting up a Change Log at GCA and Geomet field offices
• supervising the implementation of custody requirements
• onsite system audits
Internal QC Program
The participating EPA laboratories provided QC requirements for field
blanks and replicate field samples; Geomet Technologies, Inc., the sampling
subcontractor, provided operation checks for their sampling and meteorological
equipment. These internal QC checks for sample collection and meteorological
measurements were integrated into the QA Project Plan by GCA and presented as
summary tables in Section 6 of that plan. The tables included control limits
and planned corrective actions and were prepared to enable review of the
sampling program internal QC measures by GCA or EPA. Section 6 is included in
Appendix B to this report.
Each sampling procedure contained in the Love Canal QA Plan Appendix A
included the appropriate internal QC operation check, control limits and
planned corrective actions for that procedure to ensure use of these measures
by field technicians. Appendix A was distributed to Geomet Technologies, Inc.
who collected all samples except those related to ground water, and to JRB
Associates, Inc., the supervisory geologist subcontractor, who collected
ground water and well core sediment samples. Internal QC checks for Geomet's
activities are summarized in Tables 6.1.1 through 6.5.1 of Section 6 in
Appendix B to this report; QC checks for JRB's sample collection tasks are
included in Tables 6.3.1 and 6.4.1.
25
-------
TABLE 3.3. SAMPLING QA/QC MEASURES
• Standardized Sample Collection Procedures
• Each Air Collection Medium Prepared by One Laboratory from One Lot
• Onsite GCA Field Office
- Expedited all field activities
- Interacted with EPA and Subcontractors
- Coordinated actual sampling operations
- Received collected samples in the field
• Equipment Calibration Procedures/Control Limits Provided
• Onsite GCA Sample Bank
- Directed chain of custody procedures
- Received collected samples
- Shipped samples to analytical laboratories
- Inserted external QC samples in shipments
- Maintained document control
• Documentation of Procedure/Equipment Changes
• Planned Corrective Actions
• Field Blanks and Replicate Field Samples Collected
• System Audits Onsite—Qualitative Overview
• Performance Audits—Air Sampling Flow Rates
26
-------
Onsite System Audits
These reviews of the entire sampling operation proved invaluable. The
first such system audit was conducted on 20 to 22 August 1980 and included the
first day of air sampling and the second day of soil sampling by Geomet
Technologies, Inc. Geomet's Field Office and GCA's Field Office and Sample
Bank operations were also audited.
This audit just at the start of the sampling program provided an
opportunity for field personnel to raise questions and for the QA Coordinator
to identify and investigate sampling procedures which might cause analytical
problems. The protocol for implementing changes in the field was established
during this site visit. Confusion about field blanks was found within the
Geomet staff; the purpose and handling of field blanks was discussed with both
Geomet and GCA personnel. Without exception, the project team members were
well-motivated and a valuable interchange of information occurred.
A second site visit by GCA's QA Coordinator on 10 to 11 September 1980
included followups on previous corrective actions and an audit of JRB
Associates, Inc. ground water sample collection. Special attention was given
to field blank handling, sample container cleanup and field preservation
requirements.
Memos on field blank handling and container cleanup were prepared and
distributed during the audit. A memo on field preservation procedures was
distributed the following week, after further investigation of appropriate
preservation techniques by the QA Coordinator.
A third site visit on 2 to 3 October 1980 concentrated on the
geotechnical study activities but included a followup audit of ground water
sample collection. Earlier inefficiencies in ground water sampling had been
corrected.
The reports of these audits are contained in Appendix A to GCA's QA/QC
Summary Report. The reports were written in several sections so that a
subcontractor received only the audit of his own activities and any overall
summary; EPA and GCA personnel received all the sections.
Performance Audits
A flow audit of the two types of air sampling pumps and the Hi-Vol
samplers used in the Love Canal Study was conducted by EPA's Quality Assurance
Division from EMSL/RTP. The audit included 32 of the 92 DuPont Model P125
pumps, 36 of the 64 DuPont Model P-4000 pumps and 9 Hi-Vol Samplers. It was
conducted in the field, just before and just after actual sample collection on
29 September to 1 October 1980. The audit report is presented in Appendix A
to GCA's QA/QC Summary Report; it shows good agreement between Geomet's and
EPA's calibration values. The Mean Percent Difference between Geomet's
calibration and EPA's audit calibration for all Model P125 pumps was 2.04
percent, for all Model P4000 pumps 4.46 percent and for all Hi-Vol Samplers
7.7 percent.
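The calculation behind these figures is not shown in the report. A Mean Percent Difference of this kind is ordinarily the average, over the $n$ pumps of a given model included in the audit, of the percent difference between the audit and contractor calibration flow rates; whether signed or absolute differences were averaged is not stated here, so the absolute form below is an assumption:

$$\mathrm{MPD} = \frac{100}{n}\sum_{i=1}^{n}\frac{\left|Q_{\mathrm{audit},i} - Q_{\mathrm{Geomet},i}\right|}{Q_{\mathrm{Geomet},i}}$$

where $Q$ denotes the calibration flow rate for pump $i$.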
27
-------
It is very difficult to audit water, soil and sediment sample collection
when no flow or volume measurements are involved. The field blanks processed
with these types of samples monitored the sample handling techniques; the
samples collected in duplicate and triplicate were used to estimate overall
measurement precision and audited both sampling and analytical techniques.
GEOTECHNICAL STUDY QA/QC
This part of the Love Canal Study included geophysical surveys,
assessment of the site geology, installation of monitoring wells, collection
of ground water samples, aquifer tests, and modeling of the groundwater flow.
Four subcontractors performed this work under the direction of GCA's technical
monitor.
GCA's Geotechnical QC Coordinator was onsite during all active
geotechnical work. His presence was particularly valuable since standard
quality measures used for sampling and analysis are not readily applicable to
many aspects of geotechnical work. The QC Coordinator used the onsite GCA
Field Office as his base of operations and interacted with EPA and GCA project
personnel and the subcontractors to expedite the geotechnical field work.
The most significant activities of the QA Coordinator related to the
geotechnical study were the inclusion of sample collection procedures specific
to ground water in Appendix A to the QA Plan, and the onsite audits of ground
water sample collection and general geotechnical work.
Internal QC Program
The internal QC requirements for the geophysical survey consisted mainly
of equipment calibration checks provided by Technos, Inc., the subcontractor
conducting the survey measurements. Table 6.6.1 in Appendix B to this report
presents those QC checks. Internal QC requirements for ground water and well
core sediment samples collected by JRB Associates, Inc. are summarized in
Tables 6.3.1 and 6.4.1 of Appendix B.
Onsite System Audits
Collection of ground water samples was audited on 10 September 1980 and
again on 3 October 1980. The 2 to 3 October site visit concentrated on the
geotechnical study and included field observation of drilling crews and
hydrologic testing. Geophysical survey measurements were not in progress at
that time but the instrument calibration procedures were observed. Appendix A
to GCA's QA/QC Summary Report contains the audit reports.
The October audit showed the four subcontractors were working
cooperatively and maintaining the proposed schedule despite initial start-up
problems. The drilling crews were careful and conscientious in their work;
the required well drilling documentation was available and up-to-date. A
supervisory geologist was onsite wherever drilling was in progress.
28
-------
The supervisory geologists were conscientious in maintaining field logs,
taking flow rate and conductivity measurements and interpreting the
split-spoon sediment samples taken at each monitoring well. The 10 September
audit of ground water samples had shown some minor problems and inefficiencies
which had been overcome before the 3 October 1980 audit.
Phase 1 of the geophysical survey had been completed before the 2 October
site visit and was briefly reviewed during that audit. Instrument charts and
manual plots of the data were available and documented in log books; the
onsite calibration techniques demonstrated were in accord with the submitted
QA Plan writeups.
Performance Audits
The same techniques used for other water samples—field blanks and
replicate field samples—were applied to the ground water sampling program.
SAMPLE ANALYSIS QA/QC
The samples collected at Love Canal were analyzed by 12 subcontractor
laboratories and 3 EPA laboratories; analysis of replicate field samples or
samples split for QA/QC purposes was conducted by 5 EPA laboratories (1
mentioned above and 4 additional laboratories). Table 3.4 lists the measures
employed to ensure comparability of results between the many active
laboratories.
These measures were documented in the Love Canal Study QA Plan and
implemented throughout the program by all members of the project team. The
subsequent validation of analytical results by EPA QA personnel showed that
the planned QA/QC measures were effective in achieving comparable results from
different laboratories. Internal QC analysis requirements provided analytical
data for use as evaluation criteria in judging the validity of results on
field samples.
The interaction of GCA with subcontractor and EPA technical staff was
invaluable. The GCA analytical QC coordinators provided liaison between the
subcontractor laboratories and EPA and ensured that all laboratories received
the same information. If one laboratory had a technical problem or question,
the appropriate EPA project staff member was consulted and, if necessary,
further research was done to solve the problem. The information gained was
then supplied to all laboratories to whom it would be pertinent.
GCA's principal activities related to the analytical effort
included:
• coordinating preparation of the analytical procedures manual
(Appendix B)
• reviewing laboratory QA Plans
• supervising external QC program
29
-------
TABLE 3.4. ANALYTICAL QA/QC MEASURES
Standardized handling and shipping procedures for all collected samples
Standardized analysis procedures
Internal QC Requirements specified
- Method blanks
- Calibration specifications
- Calibration Checks
- Laboratory Control Standards (LCS)
- Internal standards
- Surrogate compounds
- Laboratory duplicates
Planned Corrective Actions Outlined
Field Blanks and Replicate Field Samples Analyzed
System Audits Onsite—Qualitative Overview
Performance Audits
- Performance Evaluation Samples
- External QC Check Samples
30
-------
• interfacing with EPA and subcontractors
• onsite audits
GCA's detailed knowledge of the sample collection operation enabled the
prevention of analytical problems which might have been caused by sampling
procedures and the quick resolution of those which had occurred. Questions
which arose in field sampling or laboratory analysis were viewed from the
overall measurement perspective and addressed from that viewpoint.
Internal QC Program
Internal QC measures were specified in each analytical procedure and were
made a program requirement by their inclusion in subcontract documents. These
QC requirements were provided by the participating EPA laboratories and
integrated by GCA into the QA Project Plan; Section 7 of that plan summarized
the Analytical Internal QC Checks, including control limits and planned
corrective actions, for the whole program. The section was prepared to
facilitate review of the QC measures by GCA or EPA staff and is included in
Appendix B to this report. Table 3.5 lists the subcontractor laboratories,
their analytical tasks and the table in Section 7 which includes the
appropriate Internal QC Checks for each task. It can be seen that different
laboratories performing the same analyses were required to implement the same
internal QC measures.
Each analytical procedure contained in the Love Canal QA Plan Appendix B
included the same QC operation checks, control limits, and planned corrective
actions as those listed in the summary tables. Each subcontractor received
the sections of Appendix B which were relevant to his project task. However,
in some cases, the control limits had not been included in subcontract
documents because a data base sufficient to derive this information did not
exist. GCA worked with EPA in defining control limits when required;
Section 6 of this report discusses such efforts.
Control Chart Requirements
GCA included in its Requests for Proposal for analytical tasks, and in
the analytical subcontract documents, a section of general QA/QC requirements
for the subcontractor. These requirements included the maintenance of control
charts on instrument calibration and on known QC samples, such as Laboratory
Control Standards (LCS). The analytical protocols for specific analyses which
were included in subcontract documents and in the QA Plan Appendix B, in most
cases, stated that control charts should be maintained for LCS. The
laboratories were required to report their results on LCS (as well as known QC
samples, surrogate compounds and internal standards) on Coding Form A2:
Internal QC Report. The LCS control chart data is therefore part of the data
base for the Study.
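The control chart requirement can be pictured as a simple limit check on successive LCS
recoveries. The sketch below is a generic illustration only, not the laboratories' actual
software; the recovery values and the use of limits at three standard deviations are
assumptions.

    # Sketch of a Laboratory Control Standard (LCS) control chart check.
    # Recovery values (percent) are hypothetical; 3-sigma limits are assumed.
    from statistics import mean, stdev

    lcs_recoveries = [98.2, 101.5, 96.8, 99.9, 103.1, 97.4, 100.6]  # earlier runs
    center = mean(lcs_recoveries)
    sigma = stdev(lcs_recoveries)
    lower, upper = center - 3 * sigma, center + 3 * sigma

    new_result = 108.7  # latest LCS recovery to be evaluated
    if lower <= new_result <= upper:
        print("In control: %.1f within %.1f-%.1f" % (new_result, lower, upper))
    else:
        print("Out of control: %.1f outside %.1f-%.1f; apply the planned"
              " corrective action and document it." % (new_result, lower, upper))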
31
-------
TABLE 3.5. CROSS-REFERENCE TO ANALYTICAL INTERNAL QC CHECK TABLES
Subcontractor
Analysis task
Summary tablea
Battelle Columbus Laboratories
PEDCo Environmental, Inc.
Gulf South Research Institute
Southwest Research Institute
Compuchem/Mead Technology
Laboratories
JRB/Jacobs Engineering Group
TRW, Inc.
Acurex Corporation
Energy Resources Company,
Inc. (ERCO)
Midwest Research Institute
Wright State University
Advanced Environmental
Systems, Inc.
Volatile organics on Tenax 7.1.1
Volatile organics on Tenax 7.1.1
Pesticides on polyurethane foam 7.1.1
Organics in water 7.2.1
Organics in soil and sediments 7.3.1
Pesticides on polyurethane foam 7.1.1
Inorganics in water 7.2.1
Organics and inorganics in 7.3.1
soil and sediments
Organics and inorganics in 7.4.1
biota
Organics in water 7.2.1
Organics in soil and sediments 7.3.1
Inorganics and organics in 7.2.1
water
Inorganics and organics in 7.3.1
soil and sediments
Organics in water 7.2.1
Organics in water 7.2.1
Organics in soil and sediments 7.3.1
Inorganics in water 7.2.1
Inorganics in soil and 7.3.1
sediments
Organics in biota 7.4.1
Dioxin in all media 7.1.1, 7.2.1,
7.3.1
TOC/TOX, pH, Conductivity in 7.2.1
ground waters
aRefers to Table in Section 7 in Appendix B of this report.
32
-------
Laboratory Site Visits/System Audits
Each laboratory was visited by GCA or EPA or both during the first month
of analytical work and the working relationship established then proved
valuable throughout the study. These visits constituted system audits of each
laboratory's operations for the Love Canal Study; examples of the audit
reports are contained in Appendix A to GCA's QA/QC Summary Report. Internal QC
records were inspected and chain of custody procedures were reviewed. These
site visits were particularly useful in identifying and resolving sample
preparation and analysis problems early in the analytical effort.
External Quality Control Program/Performance Audits
Well-characterized samples whose true values were not known to the
subcontractor laboratories were submitted to each laboratory throughout the
program for use in evaluating overall performance and in estimating the
accuracy achieved by each laboratory. These QC samples were shipped to the
GCA Sample Bank at Niagara Falls to ensure that they were handled in the same
way as field samples. However, the objective of having these QC samples
indistinguishable from collected field samples was only achieved with the
Tenax cartridges and polyurethane foam plugs used for air sample collection.
The other QC samples were easily recognizable as different from collected
samples.
Two general types of QC samples were used: Performance Evaluation
Samples and External QC Check Samples. The differences between the two types
of samples are briefly described below and the individual samples are listed
in Tables 3.6 and 3.7.
Performance Evaluation (PE) Samples—
These samples were provided by EMSL-Cincinnati and EMSL-Las Vegas as
"double-blinds" (true values unknown to both GCA and the subcontractor
laboratories) for use in the water, soil, sediment and biota monitoring
programs. Analytical results on these samples were sent directly to the
providing EPA laboratories where they were evaluated as Acceptable or Not
Acceptable for each analyte reported.
EMSL-Cincinnati provided three different groups of PE samples to be
shipped to appropriate laboratories at monthly intervals during the analytical
work. The samples were concentrated solutions and instructions for dilution
before analysis were included. GCA's QA Coordinator and Analytical QC
Coordinators checked on prompt submission of results and, more importantly,
investigated any Not Acceptable results. EMSL-Cincinnati sent their
evaluation reports to GCA to enable this follow-up. All Not Acceptable
results were checked with the individual laboratory and corrective actions
initiated as necessary.
EMSL-Las Vegas supplied two groups of PE samples to be used to evaluate
soil and sediment analysis procedures. The samples were NBS River sediments
spiked with organic compounds as noted in Table 3.6. These samples were
analogous to the "double blinds" mentioned above and were treated in a similar
manner.
33
-------
TABLE 3.6. PERFORMANCE EVALUATION SAMPLES
(Columns: sample source; sample contents; matrix; use; laboratories receiving
samples. The table entries are not legible in this copy of the report.)
-------
TABLE 3.6 (continued)
(Table entries not legible in this copy.)
35
-------
TABLE 3.7. EXTERNAL QC CHECK SAMPLES
(Columns follow the format of Table 3.6: sample source, sample contents,
matrix, use, laboratories receiving samples, plus a Comments section. The
table entries are not legible in this copy of the report.)
-------
TABLE 3.7 (continued)
(Table entries not legible in this copy.)
37
-------
Table 3.6 lists the PE samples used in the study and identifies:
• the providing EPA laboratory (sample source)
• the analytes contained (sample contents)
• the physical state of the sample (matrix)
• the types of collected samples audited by this sample (use)
• the subcontractor laboratories receiving each PE sample
(laboratories receiving samples)
External QC Check Samples—
These samples were provided by several different EPA laboratories and one
subcontractor as can be seen in Table 3.7. This table follows the format of
Table 3.6 and includes a Comments section. The solutions were concentrates to
be diluted as directed before analysis. The true values were known by GCA but
not by the subcontractor laboratories. The providing EPA laboratories
referred to these samples by different names (QA samples, QC check samples, QC
samples) and the term External QC Check Samples is used to encompass all of
these samples and distinguish them from internal QC samples whose true values
were known to the subcontractor laboratories.
EPA's QA/QC planning for the Love Canal Study had not incorporated a
complete series of QC samples whose true values were known to GCA. In
response to GCA's request, EMSL-Cincinnati supplied large quantities of their
QC Check Samples for water quality and drinking water analyses to supplement
the QC samples which had been specifically designed for the study. This
provided quickly available external QC Check Samples for almost all the
classes of pollutants monitored at Love Canal. Some of the water quality and
drinking water QC samples initially provided had been designed for use with
different analytical methods (e.g., total phenols by colorimetry) with lower
detection limits than the methods used on the Love Canal Study. This problem
was discovered early in the program and appropriate PE samples (Method 625
acid fraction) were substituted. Another problem was that, in general, these
QC check samples were not available for shipment with the first batches of
collected samples.
The External QC Check Samples were provided at the approximate rate of
one QC sample for every twenty field samples of similar type shipped to an
analytical laboratory. Table 3.7 lists the sample contents, their use and the
receiving laboratories. Results were included in the data base to enable
rapid follow-up of any questionable results.
GCA staff reviewed all External QC Check Sample Results for
reasonableness and checked all questionable results with individual
laboratories. Reporting and calculation errors were found in this way, but
very few problems with analytical techniques were uncovered. This correlates
with the internal laboratory QC results which demonstrated good analytical
performance by the subcontractor laboratories.
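Because the true values of these samples were known to GCA, each reported result could be
screened against the known concentration before any follow-up with a laboratory. The
sketch below illustrates that kind of screening step only; the laboratory names, analytes,
concentrations, and the 30 percent tolerance are hypothetical and are not taken from the
study.

    # Sketch: screen External QC Check Sample results against known true values.
    # Laboratory names, analytes, concentrations (ug/L), and the tolerance are
    # hypothetical illustration values.
    results = [
        {"lab": "Lab A", "analyte": "phenol",     "reported": 42.0, "true": 40.0},
        {"lab": "Lab B", "analyte": "chloroform", "reported": 95.0, "true": 50.0},
    ]
    TOLERANCE = 0.30  # flag results more than 30 percent from the true value

    for r in results:
        relative_error = abs(r["reported"] - r["true"]) / r["true"]
        if relative_error > TOLERANCE:
            print("%s / %s: reported %.1f against true %.1f; questionable,"
                  " check with the laboratory." % (r["lab"], r["analyte"],
                                                   r["reported"], r["true"]))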
38
-------
The Comments section of Table 3.7 indicates the experience with each
sample during the study. The external QC Check samples specifically designed
for the study were much more useful than the general use water quality and
drinking water QC Check samples. The most frequent problems with the general
use samples were that they were inappropriate for the Love Canal analysis
procedures and that the samples were familiar to the analytical laboratories.
EPA Quality Control Audit on GC/MS Data
In accordance with a request from the Love Canal Study Project Officer,
GCA/Technology Division completed a limited study of mass spectral data from
each of the laboratories performing organic analysis for the Love Canal
Study. Since time and cost precluded extensive examination of all mass
spectral data supplied, two analyses were selected from each laboratory for
verification. Selection criteria for data to be verified were arbitrary but
were influenced by several considerations, including:
• Levels of pollutants reported
• Number of compounds reported
• Toxicity of compounds reported
Using these criteria, two samples from each of the laboratories were
chosen for examination. Included in this list were analyses representative of
all sample types collected in the study.
The data was examined primarily to verify compound identifications.
Quantitation of the substances found would have required data inputs not
immediately available such as:
• Amount of sample extracted
• Concentration volume of sample
• Internal standard spike level
• Instrument response factors
• Surrogate spiking levels
While this data could be made available through interpretation of
laboratory notebooks, time constraints prohibited this process.
Upon completion of this task, a summary report, Verification of Selected
Mass Spectral Data From the Love Canal Study, was submitted to the Office of
Research and Development, U.S. EPA, Washington, D.C. in May, 1981.
39
-------
Precision and Accuracy Estimates
The measures planned to enable the estimation of precision and accuracy
included the collection and analysis of replicate samples and the analysis of
well-characterized external QC check samples. Section 8.0 of the Love Canal
Study QA Plan describes these measures in detail; they were used to provide
estimates of:
• Intralaboratory measurement precision - based on the analysis of
replicate field samples by each subcontractor laboratory.
• Intralaboratory precision for analysis only - based on the analysis
of duplicate aliquots from one field sample by each subcontractor
laboratory.
• Interlaboratory measurement precision - based on the analysis of
replicate field samples by each subcontractor laboratory and an EPA
referee laboratory.
• Accuracy for analysis only - based on the analysis of external QC
check samples by each subcontractor laboratory.
All the planned measures were implemented with one exception. This
involved the Tenax and foam air samples collected in duplicate sampling trains
in the living areas of homes used as base sites in the residential air
monitoring program. The plan was to estimate both intralaboratory and
interlaboratory measurement precision from the analysis of these duplicates,
using the sample splitting scheme described in Section 8.2.1 of the QA Plan.
Pressures of time and difficulties of analysis prevented the complete
implementation of the sample splitting scheme and only intralab measurement
precision was estimated from these samples. Since two laboratories analyzed
the Tenax samples and two different laboratories analyzed the foam samples,
the estimation of interlaboratory precision was not as critical as when more
laboratories were involved.
The computer reports described in the bullets above were prepared and
submitted to EPA. Each printout included both a statistical report (mean
difference, standard deviation, etc.) and an individual pairs listing which
identified the samples statistically treated. The statistical report was
classified by media, by pollutant and by laboratory so that it was easy to
compare the performance of laboratories analyzing the same pollutant. The
pairs listing was classified by lab, media and sample bank number, and
pollutant.
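The statistical report described above amounts to summarizing the differences between
paired results within each laboratory, media, and pollutant subgroup. The sketch below is
a minimal illustration of that idea; the paired values are hypothetical, and the report's
exact formulas are not reproduced here.

    # Sketch: summarize replicate-pair differences by media, pollutant, and lab.
    # The paired results (ug/L) are hypothetical illustration values.
    from collections import defaultdict
    from statistics import mean, stdev

    pairs = [
        # (media, pollutant, laboratory, replicate 1, replicate 2)
        ("water", "toluene", "Lab A", 12.0, 11.4),
        ("water", "toluene", "Lab A", 8.2, 8.9),
        ("water", "toluene", "Lab A", 15.1, 14.0),
    ]

    groups = defaultdict(list)
    for media, pollutant, lab, r1, r2 in pairs:
        groups[(media, pollutant, lab)].append(r1 - r2)

    for key, diffs in sorted(groups.items()):
        spread = stdev(diffs) if len(diffs) > 1 else float("nan")
        print(key, "n=%d" % len(diffs),
              "mean difference=%.2f" % mean(diffs),
              "standard deviation=%.2f" % spread)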
The pairs listing gave the analytical result for each reported pollutant
in each collected sample or external QC check sample and showed the wide range
of reported concentrations for almost any pollutant in different collected
samples. These varying concentration levels for one pollutant in one medium
were placed in one subgroup for statistical treatment. Defining several
different concentration ranges for each pollutant and estimating precision or
accuracy separately for each range would be helpful. However, so many
40
-------
pollutants were reported as trace or not detected that the total number of
analytical results available for statistical treatment was small. Breaking
these down into different concentration ranges would result in very small
statistical subgroups and, therefore, was not done.
It was recognized that if concentration levels varied widely, precision
estimates would suffer from the problem described above; it was thought that
estimates of accuracy based on external QC samples would not. However, most
external QC samples were provided at several different concentration levels
and, therefore, shared the problem.
DATA MANAGEMENT SYSTEM QA/QC
The purpose of the data management system was to collect and store all
data generated as part of the Love Canal Study and to provide basic reports of
the monitoring results. Table 3.8 lists the QA/QC measures employed on this
program element.
Throughout the study, GCA's technical staff provided input to the Data
Management staff; e.g., on design of the Sampling and Analytical Coding
Manuals and Data Report forms, and were available for consultation on
technical questions. The Data Management QC Coordinators consulted with
these in-house departments as necessary, and interacted with subcontractors on
coding questions.
As noted in Table 3.8, the data reports were reviewed and checked at each
stage in their processing by the Data Management staff and built-in
computerized checks. Keypunching was machine-verified; questionable values
were flagged on computer reports before they were sent to subcontractor labs
for verification.
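An edit check of that kind can be pictured as a range test on each keypunched field, with
out-of-range values written to the report sent back for verification. The sketch below is
an illustration only; the field names, acceptable ranges, and records are assumptions, not
the study's actual Data Management System code.

    # Sketch of an edit check that flags questionable keypunched values.
    # Field names, acceptable ranges, and sample records are hypothetical.
    acceptable_ranges = {"ph": (0.0, 14.0), "conductivity_umho": (1.0, 50000.0)}

    records = [
        {"sample_id": "LC-0001", "ph": 7.2,  "conductivity_umho": 850.0},
        {"sample_id": "LC-0002", "ph": 71.0, "conductivity_umho": 900.0},
    ]

    for rec in records:
        for field, (low, high) in acceptable_ranges.items():
            value = rec[field]
            if not low <= value <= high:
                print("%s: %s=%s outside %s-%s; flag for subcontractor lab"
                      " verification." % (rec["sample_id"], field, value, low, high))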
The thoroughness of the data review is demonstrated by the remarkably
error-free data set found in the Data Management System Audit. The audit
report is included in Appendix A to GCA's QA/QC Summary Report.
CORRECTIVE ACTION SYSTEM
Section 11.0 of the Love Canal QA Plan describes the closed-loop
corrective action system followed by GCA during the study. The purpose of a
closed-loop system is to ensure that any quality problem is reported to a
person responsible for correcting it who is part of a system ensuring action
and followup on all reported problems. Both immediate and long-term
corrective actions are incorporated in the system.
Each subcontractor was required to include in his QA Project Plan:
• the named individual responsible for corrective action at each stage
of his operations.
• Internal QC procedures which met or exceeded EPA's internal QC
requirements and included control limits and corrective actions to
be taken if these limits were exceeded.
41
-------
TABLE 3.8. DATA MANAGEMENT QA/QC MEASURES
Uniform Coding Standards
Individual coding manuals for sampling data and for analysis data
Individual coding forms for sampling data reporting and for analysis
data reporting
Log-in of all data forms sent to or from GCA/Technology Division
- Forms traceable by serial number and/or sample ID number
Review of Data
Check of coding forms by GCA for reasonableness before keypunch
- Check of computer reports by GCA for reasonableness before sending
to subcontractor lab for verification
Check of computer reports by each subcontractor lab to verify entries
- Alert reports of concentrations exceeding alert levels
Computer Card Preparation
Acceptable ranges of values supplied to keypunch operators
- Key-verification of all cards
Computer Check and Edit of Data
Edit check programs specific to sampling data, analysis data and
verification action data
ID Cross-reference file to prevent double entries
- Coordinates file - automated entry of sampling site coordinates
Computer Program Safeguards
All programs tested with legal and illegal test data before actual
data was processed.
"Levels" assigned to data files during program execution to prevent
access by out of sequence programs
Documentation of Changes to Data
Notations on coding forms by GCA staff
Notations on computer reports by GCA staff
Notations on computer reports by subcontractor labs
Notations in project workbooks by GCA staff
Verification action reports by GCA DHS staff
. 42
-------
Subcontractors were free to follow their own corrective action system as long
as the stated requirements were met. Each subcontractor's project document
inventory contains the documentation of corrective actions taken by that
subcontractor. GCA did not monitor this documentation.
Summary tables of the EPA internal QC requirements are contained in
Sections 6 and 7 of the Love Canal Study QA Plan. Each media sampled or
analyzed is covered in a separate table and control limits and planned
corrective actions are included. To ensure actual use by field and laboratory
technicians, the Appendix A Sampling Procedures and Appendix B Analytical
Procedures included the appropriate internal QC requirements.
Immediate Corrective Action
The control limits and planned corrective actions were provided to enable
the uniform application of appropriate corrective actions as part of normal
operating procedures. The actions taken were to be noted in field or
laboratory notebooks but no other formal documentation was required. These
on-the-spot corrective actions were an everyday part of the QA/QC system; they
helped to avoid the collection of poor quality data.
Long-Term Corrective Action
GCA placed any problem not solved by immediate corrective action into the
long-term category requiring appropriate documentation and followup. The
initial record of the need for corrective action, the problem investigation,
and the actions taken was made in the project workbook maintained by each GCA
technical staff member.
GCA's routine policy is to initiate a Corrective Action (CA) Request Form
for each long-term corrective action. The time pressures of the Love Canal
Study caused these forms to be initiated after the fact in many cases, and not
at all in some cases. However, workbook documentation was dated and complete
and provided the action dates given on the CA Forms which were prepared by
GCA. These forms are contained in Appendix C to GCA's QA/QC Summary Report;
memos or letters which formed part of the corrective action are included to
enable tracing the action.
DOCUMENT CONTROL/CHAIN-OF-CUSTODY PROCEDURES
The procedures used on the Love Canal Study were defined by the GCA QA
Coordinator working with NEIC staff members, and following guidelines
developed from:
• NEIC Policies and Procedures Manual, EPA-330/9-78-001R, Section II.
• Enforcement Considerations for Evaluations of Uncontrolled Hazardous
Waste Disposal Sites by Contractors, NEIC, April 1980, Sections VIII
and IX.
43
-------
NEIC staff members reviewed the project-specific procedures and their comments
were incorporated. GCA's QC Coordinators and QA Coordinator consulted with
NEIC on any questions throughout the program.
A detailed writeup covering both document control and chain-of-custody
procedures and requirements was provided to each subcontractor with his
subcontract documents, or shortly thereafter. In addition, Appendix A to the
QA Plan contains these procedures with special emphasis given to the sampling
aspects; Appendix B contains a Document Control/Chain-of-Custody Section
directed to the needs of analytical laboratories. Sections 4 and 5 in the
overall QA Plan volume present the same information, addressing both the
sampling and analysis requirements.
Document Control Procedures/Document Inventory
The purpose of document control is to ensure that all project documents
will be accounted for and assembled into a document inventory when a project
is completed; this facilitates tracing the history of the entire project or
any part of it. The system implemented by GCA during the Love Canal Study was
managed by Document Control Officers (DCOs) who were responsible for issuing,
controlling and maintaining records of controlled documents.
GCA identified the principal items subject to document control, provided
a project numbering system including individual subcontractor codes and
provided serialized sample identification and custody forms. GCA's DCO at the
onsite Sample Bank maintained overall document control on all items related to
sample collection and shipment. GCA's DCO at Bedford logged in all project
documents, including data reporting forms, received at Bedford.
At the conclusion of project activity, each subcontractor was required to
submit his complete project document inventory, classified according to the
guidelines provided, to GCA for subsequent transferral to EPA. GCA has
assembled these subcontractor document inventories at its Bedford
headquarters; they will be submitted to EPA on request or at the conclusion of
GCA's project activity.
Use of Project Notebooks—
GCA required that its own and subcontractor technical staff members
maintain complete and traceable records of all project activity—in bound
laboratory notebooks, when appropriate. The onsite Sample Bank DCO issued
bound notebooks and Field Data Sheets to the Sampling and Geotechnical
subcontractors; Subcontractor and GCA/Bedford DCOs maintained control of
notebooks issued to their staff. These notebooks form an important part of
the project document inventory.
Chain-of-Custody Procedures
The purpose of chain-of-custody procedures is to document the identity of
the sample, and its handling, from its first existence as a sample until
information derived from it is introduced as evidence during an enforcement
proceeding. Customized sample identification tags and custody records were
44
-------
provided by GCA to subcontractors, and GCA's onsite Sample Bank directed
chain-of-custody procedures and transferred custody of collected samples and
QC samples to analytical laboratories. Each appropriate subcontractor was
required to identify a Sample Custodian and to maintain document control of
custody records.
-------
SECTION 4
SAMPLING
Once standardized sampling protocols had been established as discussed in
the previous section, the next major responsibility for the sampling element
of the GCA program was to select a qualified subcontractor.
The various sampling protocols were combined into one consistent Bid
Package. The number of samples to be collected using each protocol was
stipulated as well as the schedule so that comparable bids would be received.
The bid package also included Special Instructions, Technical Evaluation
Criteria, General Instructions, Representations, Certifications and
Acknowledgments, Certificate of Current Pricing Data and Delivery
Requirements. Of particular relevance here is the list of Technical
Evaluation Criteria that was contained in the RFP and is listed in Table 4.1.
The various EPA laboratories had supplied GCA with the names of firms
that were thought to have the capability of undertaking the sampling task.
Since time did not permit the formal announcement of the solicitation, the
list of 10 firms supplied by EPA was used as the prospective bidder's list.
Each of the firms was contacted by telephone on 14 July 1980 to be invited to
the 17 July Bidder's Conference in Buffalo, New York. Confirmations of these
invitations were issued by TELEX on 16 July 1980.
The firms invited to the bidder's conference included the following:
1. Battelle Memorial Institute
505 Kings Avenue
Columbus, OH 43201
2. Research Triangle Institute (RTI)
Post Office Box 12194
Research Triangle Park, NC 27709
3. Radian Corporation
Post Office Box 9948
Austin, TX 78766
4. PEDCo Environmental, Inc.
11499 Chester Road
Cincinnati, OH 45246
46
-------
TABLE 4.1. TECHNICAL EVALUATION CRITERIA
(GCA 1-619-026-222-001)
Technical proposals received in response to the subject RFP will be evaluated
using the following equally weighted criteria:
I. Offerer's immediate availability to initiate sampling, ability to
respond on short notice with sufficient field personnel, and ability to
complete the sample collection program by 31 October 1980.
II. Offerer's understanding of the scope and magnitude of the sampling
effort required as demonstrated in the proposed technical approach.
III. Offerer's experience in sampling environmental pollutants in air, water,
soils, sediment and biota.
IV. Offerer's proposed overall program management including qualifications
of assigned personnel; i.e., experience, capability, and percentage of
effort devoted to the program.
V. Offerer's experience in sampling organic pollutants by means of sorbent
media.
VI. Offerer's ability to supply the required sampling equipment and spare
parts.
VII. Offerer's Quality Assurance/Quality Control Plan.
47
-------
5. IIT Research Institute
10 West 35th Street
Chicago, IL 60616
6. Geomet
15 Firstfield Road
Gaithersburg, MD 20760
7. Southwest Research Institute
Post Office Box 28510
San Antonio, TX 78284
8. Midwest Research Institute
452 Volker Blvd.
Kansas City, MO 64110
9. Gulf South Research Institute
Post Office Box 1177
New Iberia, LA 70560
10. Environmental Science & Engineering
Post Office Box ESE
Gainesville, FL 32604
Of the 10 organizations invited to the bidder's conference, eight were
actually in attendance. The eight included the following:
1. Battelle Columbus Laboratories
2. IIT Research Institute
3. Gulf South Research Institute
4. Geomet
5. Southwest Research Institute
6. Radian Corporation
7. PEDCo Environmental, Inc.
8. Environmental Science & Engineering
At the bidder's conference the RFP package was distributed to all in
attendance. All aspects of the cost and technical requirements were discussed
in detail by GCA personnel with questions from the attendees encouraged as
each aspect of the program was discussed. Any remaining questions that had
not been dealt with during the previous discussions were answered so that each
of the prospective bidders had an equal understanding of the program
requirements. The due date for proposals was 21 July 1980 at 5:00 p.m. EDT.
48
-------
The entire bidder's conference was transcribed by a Notary Public and typed
copies of the proceedings were available to GCA at the end of the day for
documentation.
Five firms responded with proposals in response to the RFP. These
included:
1. Battelle Columbus Laboratories
2. IIT Research Institute
3. Geomet
4. PEDCo Environmental, Inc.
5. Environmental Science & Engineering
Based on the technical evaluation of these five proposals, two firms were
invited to GCA for best and final discussions. This process led to the
selection of Geomet Technologies, Inc. as the sampling subcontractor.
With this task completed, efforts were immediately directed to getting
the sampling effort underway.
GCA established a centrally-located field office on Colvin Boulevard,
just north of the Love Canal. This location facilitated communications,
sampling operations monitoring, chain-of-custody protocol, and subcontractor
coordination during the sampling campaign.
The primary responsibility of the field technical staff was Quality
Assurance and Quality Control of field sampling procedures. Field sampling
crews were observed to assure that proper protocols were followed for:
• sample container preparation
• sampling equipment calibration
• sample site documentation
• sample handling
• sample preservation
• sample labeling
• sample chain-of-custody form preparation
Sampling of sumps, soil, air, and water was monitored on a daily basis
initially, to ensure that procedures were followed and uncertainties could be
quickly resolved. Once the sampling crews became proficient, monitoring was
49
-------
accomplished by periodic spot checks to assure that protocols were being
followed throughout the program. A protocol log book was kept by the field
office personnel to document any changes in protocols and QA/QC problems.
The following sampling protocol components were monitored at each
sampling site visit:
• sample container preparation
- correct volume
- correct cleaning procedure
- correct cap which was similarly cleaned
- proper storage of containers
• sample collection and handling
- clean sampling equipment
- cleaning of equipment between samples
- use of only glass, teflon or stainless steel for sample contact
- care to avoid contamination of samples
- liquid portions of samples collected before sediments to avoid
mixing
- Tenax and PUF samples handled with nylon gloves
- protocol followed for all samples.
• sample preservation
- correct preservative for type of analysis
- correct normality of preservative
- expiration date not exceeded
- correct amount of preservative used
- sample kept well iced and doublebagged
- samples delivered to sample bank truck every 4 hours
50
-------
• chain-of-custody (see the sketch following this list)
- sample tags completely prepared
- sample container numbers matched the correct chain-of-custody
form
- samples doublebagged with tag inside
- chain-of-custody forms and tags properly completed and signed
- recheck all information upon entry into sample log book
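The log-in checks above can be expressed as a simple record validation, sketched below.
The field names and the example records are hypothetical; the sketch is an illustration of
the checks, not the field office's actual log procedure.

    # Sketch: verify a sample at log-in against its chain-of-custody form.
    # Field names and the example records are hypothetical.
    def login_problems(sample, custody_form):
        problems = []
        if not sample.get("tag_complete"):
            problems.append("sample tag not completely prepared")
        if sample.get("container_no") != custody_form.get("container_no"):
            problems.append("container number does not match chain-of-custody form")
        if not sample.get("doublebagged_with_tag"):
            problems.append("sample not doublebagged with tag inside")
        if not custody_form.get("signed"):
            problems.append("chain-of-custody form not signed")
        return problems

    sample = {"tag_complete": True, "container_no": "C-1042",
              "doublebagged_with_tag": True}
    form = {"container_no": "C-1042", "signed": True}
    print(login_problems(sample, form) or "log-in checks passed")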
Protocol changes were made during the testing period. The field office
personnel were responsible for notifying all affected field crew and
monitoring and documenting the implementation of the changes.
Another important function of the GCA field office personnel was security
and sample integrity. Twelve-hour continuous air samples were collected at
over 60 unoccupied homes. Security against tampering and vandalism was
essential to ensure the integrity of these samples. The following security
measures were implemented for the protection of the samples.
• all homes were locked and secured.
• additional padlocks and hasps were installed on all entrances, with
master keys issued only to the subcontractor and the GCA field
office.
• each home was equipped with a log book in which anyone who entered
the building was required to sign in indicating name, date, time,
and reason for being there, and any additional comments or
observations upon entering the home.
• evidence tape was placed on all windows and doorways serving as an
indicator for any unauthorized entry of the homes.
• upon entry all personnel were to check the evidence tape for breaks
to verify unauthorized entry. Observations were to be recorded in
the log book at the home.
• a private security guard was hired to make hourly inspections of the
homes during the night and report any infractions upon the homes
immediately to the field office personnel.
The homeowners were escorted by field office personnel if they requested
entry. The homes were kept up to the condition in which they were accepted
for the program, requiring repairs of damage by vandalism, break-ins and
accidents. A written appraisal of all privately-owned
homes was done by a licensed appraiser prior to the sampling program to
provide proper protection and liability of all parties involved.
51
-------
At those locations where outside air samples were being taken, totally
enclosed cages were built to accommodate the hi-vol, PUF and Tenax samplers.
These cages were inspected for tampering by GCA personnel during the day, and
by the security guard at night. All wells which were sampled were equipped
with locking caps to which only the samplers had a master key. This was to
protect against tampering or unauthorized use of the wells.
The subcontractors were required to collect and deliver all air samples
to the sample bank truck located at the GCA field office within 4 hours of
sampling completion. All other samples; i.e., soil, water, and biota, were
taken and delivered to the sample bank by the samplers directly.
The field office was further responsible for carrying out or delegating
the following tasks:
• hazardous waste disposal
• health and safety monitoring and enforcement
• payment of utilities
• public relations
A hazardous waste disposal contractor was engaged to handle the wastes
and paperwork associated with them. The waste was turned over to the contractor
doublebagged, coded, labeled with the proper chain-of-custody forms and tags,
and sealed in Department of Transportation (DOT) approved barrels to conform
with shipping regulations. The barrels were kept under lock and key until
custody was transferred to the hazardous waste contractor.
Field office personnel monitored the adherence of all field crew members
to health and safety protocol as listed below:
• sampling personnel must wear rubber suits, boots and fit-tested
halfmasks, or completely contained breathing apparatus as needed in
hazardous areas.
• all sewers and sumps must be checked with a photoionization
hydrocarbon monitor to assure the safety of entering such areas.
• all samples must be handled with the proper care and respect for
their constituents.
• traffic control must be provided when sampling in roadways.
Payments of utility bills for the homes were made through the GCA field
office. Reimbursements were also made for accommodations and meals to those
homeowners who participated in the occupied/unoccupied field studies.
52
-------
The field office also provided a public relations center when directed by
the EPA on-scene coordinator. Information and site visits were provided as
well as the briefing of officials on the program background and status. The
field office also served as a communications point for homeowners and samplers
to coordinate sampling times which would be convenient for the residents of
occupied homes.
Air, sump, surface water, sewer, soil and biota sampling sites were
selected by EPA on location. Homeowners in the Love Canal area were asked to
volunteer their homes for sampling. Those homeowners in residence near the
Love Canal who volunteered their homes were relocated for the duration of the
sampling campaign. The Love Canal area was divided into 11 regions or
sampling areas. Sixty-four (64) homes within these sampling areas were
selected for the study. Sites were chosen on the basis that the owner would
allow sampling to be done in his home, electrical services were available,
running water was available, and the house had a sump. One site in each area
which met all of these criteria was designated as base site for that sampling
area. High-vol air samplers were installed outside these homes. The
remaining homes in each area (fixed sites) were sampled at the discharge
points which were available. Sites were also selected for background air
levels and air transport. Special occupied/unoccupied air samples were also
taken during the sampling program. Selected homes were sampled occupied from
October 16 through October 20, and unoccupied (following 2 days to ventilate
the homes) from October 23 to October 27. All sites were coded by the GCA
field office, plotted on a USGS map and coordinates were established for each
sampling point. Table 4.2 is a summary of the types of sampling sites
indicating the numbers of samples collected for each medium, including
duplicates, spikes and blanks. Table 4.3 is a daily summary for all field
samples indicating the number of samples collected each day by medium
excluding duplicates, spikes and blanks.
The GCA field office was responsible for the master log of all samples.
As samples were received from field personnel they were logged-in in
accordance with the attached chain-of-custody forms. All available
information, including the sampling point map coordinates were recorded prior
to relinquishing the samples to the sample bank truck which was taken to the
sample bank every evening.
Exceptions to this chain-of-custody procedure were followed for: air
samples, which after being logged-in at the field office were transported in
iced coolers directly to the sample bank; the night sump samples associated
with the air sampling, which were taken directly to the sample bank; and biota
samples which were also delivered directly to the sample bank.
53
-------
TABLE 4.2. SUMMARY OF LOVE CANAL SAMPLING ACTIVITIES
Medium Sites Samples
Air
Regular Program 66
Occupied/Unoccupied 6
Transport Study 5
Request 2
79 3,092
Water
Domestic Tap 47
Sumps 50
Surface Water 18
Sewers 30
145 3,260
Sediment
Surface Water 18
Sewers 30
48 338
Soil
Regular Program 168
Request 5
174 1,503
Biota
Crayfish 2
Worms 9
Mice Traplines 12
Dogs 40
Silver Maple 31
Potato/Oatmeal 19
113 465
Program Total 559 8,658
54
-------
TABLE 4.3 SUMMARY OF DAILY SAMPLING ACTIVITYa
(Daily counts of samples collected by medium from 30 July through
18 December 1980; the day-by-day entries are not legible in this copy.
Column totals: Air 2,010; Water 2,508; Soil 1,549; Sediment 298;
Biota 369; all samples 6,734.)
aBlanks and QA/QC samples are not included.
56
-------
SECTION 5
GEOTECHNICAL
GENERAL
As part of the overall Love Canal investigation, four major geotechnical
operations were undertaken:
• Supervisory Geologist Activities
• Monitoring Well Installation
• Geophysical Surveys
• Ground Water Modeling
These individual operations were required for the complete hydrogeologic
interpretation of the Love Canal area and the possible effects on ground water
quality by the migration of contaminants from the burial site.
Five independent subcontractors were eventually selected by GCA to
perform the geotechnical task. GCA in its role as the prime contractor was
responsible for the coordination of all the operations with the overall
program.
SCOPE OF WORK
GCA Protocols and Evaluation Criteria
In conjunction with the U.S. EPA R.S. Kerr Research Laboratory in Ada,
Oklahoma, a scope of work was defined by GCA. The geotechnical task was
subsequently divided into four separate operations for bidding purposes. Two
separate requests for proposals (RFP) were prepared for bidders. The first
RFP included the supervisory, geophysical, and ground water modeling
operations while the second was exclusively well drilling. Both RFPs
contained detailed protocols specified by GCA in accordance with the
requirements of the scope of work. Proposals were evaluated on specific
criteria (Tables 5.1 and 5.2).
57
-------
TABLE 5.1. TECHNICAL EVALUATION CRITERIA: DRILLING PROGRAM
Responses to the subject RFP were evaluated using the following equally
weighted criteria:
I. Offerer's immediate availability to initiate drilling with sufficient
drill rigs and ability to complete the drilling program by 15 October
1980. The number of available drill rigs and crews will be one measure
of these abilities.
II. Offerer's Quality Assurance/Quality Control Plan.
III. Offerer's experience in drilling the required types of wells in similar
formations and installing the required well points with appropriate
grouting to prevent interaquifer contamination.
IV. Offerer's specific experience in the greater Niagara Falls area.
V. Offerer's proposed overall program management approach including
qualifications of assigned personnel; i.e., experience, capability, and
percentage of effort devoted to this program.
VI. Offerer's Health and Safety Plan.
58
-------
TABLE 5.2. TECHNICAL EVALUATION CRITERIA: SUPERVISORY GEOLOGIST ACTIVITIES,
GEOPHYSICAL SURVEYS, AND GROUND WATER MODELING
Responses to the subject RFP were evaluated using the following equally rated
criteria:
I. Offerer's immediate availability to initiate work on the project.
Availability of required equipment.
II. Offerer's Quality Assurance/Quality Control Plan.
III. Offerer's experience in performing the required types of work.
IV. Offerer's proposed overall program management including qualifications of
assigned personnel; i.e., experience, capability, and percentage of
effort devoted to this program.
V. Offerer's Health and Safety Plan.
VI. Offerer's specific experience in the greater Niagara Falls area.
59
-------
The protocols which were established by GCA included ground water and
sediment sampling, rock coring, well drilling, a Quality Assurance/Quality
Control plan, and a safety plan. Each bidder was required to submit
individual QA/QC and safety plans in conformance with GCA. The ground water
and sediment sampling protocols are contained in GCA's Quality Assurance Plan,
Love Canal Study, Appendix A: Sampling Procedures. These protocols included
procedures for purging wells, sample preservation, documentation, and
chain-of-custody practices. The well drilling procedures were outlined in the
second RFP and are summarized here.
GCA reviewed available drilling techniques for monitoring well
installation with several experts and selected those techniques which provided
the maximum effectiveness in the sediment and rock types found in Niagara
Falls. Prevention of cross-contamination was also paramount in the selection
of drilling techniques.
Hollow-stem augering was the method chosen for drilling through the
unconsolidated sediments. Drilling to depths below which auger refusal
occurred was accomplished by utilizing two types of water-rotary drill bits:
NX diamond coring bits, and tri-cone roller rock bits.
The monitoring wells were constructed using 2 in. and 4 in. steel casing
with threaded flush-joint couplings. Well-screens were the 5 ft long
wire-wound stainless steel type welded to the casing.
Two basic types of monitoring wells were designed: shallow (A-type)
wells completed in the upper portion of the glacial till, and deeper (B-type)
wells into the dolomitic bedrock. These two types of wells were generally
paired together approximately 10 ft apart. Several of the bedrock wells were
later drilled to a greater depth in order to penetrate the entire Lockport
Dolomite and extend to the Rochester Shale. Also, seven 2 in. PVC casings
were temporarily installed in the overlying clay unit for the purpose of
performing slug tests at various depths.
Bid Preparation and Briefing
The bidding structure was designed to allow contractors to bid on one or
more of the geotechnical operations; the exception being that the well
drilling operation could not be performed in conjunction with any other
geotechnical operation by the same contractor.
Written communications were mailed to a list of qualified contractors
recommended by EPA and GCA (Table 5.3). Invitations were also extended to
contractors requesting to be included on the mailing list.
A bidders conference was held on 28 July 1980 at the Airways Motor Inn
located in Buffalo, New York. The conference was followed by a site visit to
the Love Canal area. Attendance at the meeting was mandatory to receive an
RFP and offer a bid. Proposals were due at GCA offices in Bedford,
Massachusetts by 5 p.m. on 1 August 1980.
60
-------
TABLE 5.3. PROSPECTIVE BIDDERS LIST: GEOHYDROLOGY
SUBCONTRACTS, LOVE CANAL STUDY
1. Battelle NW A
P.O. Box 999
Richland, WA 99352
Attn: Stuart Brown
509-375-3607
2. Empire Soils C
S. 3858 Sheldon Road
Orchard Park, NY
Attn: Dean Anderson
716-662-5525
3. Engineering Enterprises, Inc. C
1225 West Main, Suite 215
Norman, OK 73069
Attn: John Fryberger
405-329-8300
4. Geotechnical Engineers, Inc. AB
1017 Main St.
Winchester, MA
Attn: Bartlett Paulding
617-729-1625
5. Geotrans A
P.O. Box 2550
Reston, VA 22090
Attn: James Mercer
702-435-4400
6. Goldberg-Zoino, Assoc. A
30 Tower Road
Newton, MA
Attn: John Ayres
617-969-0050
7. Haley and Aldrich A
238 Main Street
Cambridge, MA 02142
Attn: Richard Stulgis
617-492-6460
(continued)
61
-------
TABLE 5.3 (continued)
8. International Resource Consultants A
P.O. Box \306
Camp Verde, AZ 86322
Attn: Richard Tinlin
602-567-3139
9. JRB Associates, Inc. A
8400 Westpark Drive
McLean, VA 22102
Attn: Robert Colonna
10. Law Engineering ABC
2749 Delk Road
Marietta, GA 30067
Attn: Don Miller
404-952-9005
11. Radian Corporation A
P.O. Box 9948
Austin, TX 78766
Attn: Kirk Holland
512-424-4797
12. Rochester Drilling Co. C
45th Street
Rochester, NY
Attn: Carl Asprinio
716-458-0821
13. Steffen Wolff C
Fisher Road
E. Syracuse, NY
Attn: Steffen Wolff
315-437-1429
14. Stephen A. Alsup and Assoc., Inc. B
2344 Commonwealth Avenue, Suite 1-2
Newton, MA 02166
Attn: Stephen A. Alsup
617-965-0397
15. Technos, Inc. AB
P.O. Box 330891
Miami, FL
Attn: Richard C. Benson
305-858-4665
(continued)
62
-------
TABLE 5.3 (continued)
16. Weston Geophysical AB
P.O. Box 550
Westboro, MA
Attn: Vincent Murphy
John Doherty
617-366-9191
Key:
A = Geohydrology studies including modeling
B = Geophysical survey
C = Drilling
63
-------
Selection of Subcontractors
A total of 13 proposals were received for the 4 geotechnical operations.
Proposals for each operation were ranked procedurally and technically by GCA
personnel. Unacceptable or technically deficient proposals were eliminated
and acceptable proposals were then evaluated jointly by EPA and GCA
personnel. A consensus was reached on subcontractor selection and
subcontracts were negotiated.
Two proposals were received in response to the drilling services RFP:
Empire Soils, Inc. and Rochester Drilling. Only the Empire Soils proposal was
responsive to all of the requirements of the RFP. Discussions with Empire
Soils indicated that the proposer was technically qualified to perform the
work and could field the required number of rigs and crews to accomplish the
work according to EPA's schedule. A subcontract was negotiated with Empire
Soils.
In response to the supervisory geologist services RFP, two proposals were
received before the submission deadline: Engineering Enterprises, Inc., and
JRB, Inc. After initial review at GCA, the JRB and Engineering Enterprises
proposals were reviewed and discussed by Jerry Thornhill (EPA, Ada), Paul Beam
(EPA, Washington), and David Cogley (GCA) at an 8 August 1980 meeting in
Research Triangle Park, NC. Both firms were judged to be well qualified. JRB
was selected on the basis that they were more familiar with the Niagara Falls
geology and the principal investigator for JRB was proposed to be assigned to
the project full-time. A subcontract was negotiated with JRB.
Four proposals were received in response to the geophysical survey RFP:
Stephen Alsup, International Resource Consultants (IRC), Technos, and Weston
Geophysical. Proposals were scored numerically on: quality assurance, health
and safety, management, methods proposed, availability, and general
experience. This scoring showed that IRC offered services only for complex
resistivity surveys and magnetometry, whereas the other three firms offered
services for ground penetrating radar, simple resistivity, electromagnetic
induction conductivity, metal detection, and magnetometry. Of these latter
three firms, Technos was judged to be the most experienced in applying these
methods to hazardous waste site investigations. At the meeting on 8 August,
EPA and GCA personnel first determined that complex resistivity alone or in
combination with magnetometry could not offer guidance to other program
elements. In addition, there was no experience with the use of complex
resistivity at hazardous waste sites. For these reasons, the IRC proposal was
eliminated from further consideration. After further consideration of the
remaining three proposals, Technos was selected on the basis of their
experience, technical approach, interpretive capability and equipment. A
subcontract was negotiated with Technos.
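The numerical scoring described above can be pictured with a short, purely
hypothetical sketch in Python; the criterion names follow the text, but the
0 to 10 scale and the scores shown are invented for illustration and are not
the values actually used in the evaluation.

    # Hypothetical illustration of equally weighted numerical scoring of the
    # geophysical survey proposals. All scores and the 0-10 scale are invented.
    CRITERIA = ["quality assurance", "health and safety", "management",
                "methods proposed", "availability", "general experience"]

    def total_score(scores):
        """Sum equally weighted criterion scores (assumed 0-10 each)."""
        return sum(scores[c] for c in CRITERIA)

    example = {"quality assurance": 8, "health and safety": 7, "management": 8,
               "methods proposed": 9, "availability": 10, "general experience": 9}
    print(total_score(example))  # 51 of a possible 60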
64
-------
In response to the modeling RFP, two proposals were received before the
submission deadline: Battelle and Geotrans. The Battelle and Geotrans
proposals were both judged to be technically acceptable. Battelle proposed to
assign to the project one or more persons with extensive experience at Love
Canal. At the 8 August review meeting, Geotrans was selected on the basis of
price. A subcontract was negotiated with Geotrans.
SUBCONTRACTOR OPERATIONS
Supervisory Geologist
The role of the Supervisory Geologist was to provide trained personnel
familiar with well drilling techniques to monitor the activities of the
drillers. Each supervisor was responsible for core and well logging
descriptions. They were also responsible for recording all hydrological data,
conducting aquifer tests, and distributing the data to the other geotechnical
contractors.
JRB Associates of McLean, Virginia was selected as the Supervisory
Geologist contractor. JRB supplied a multidisciplinary field team and senior
management personnel for the project. JRB also provided a special team to
purge the wells and sample the ground water.
Geophysical Surveys
Technos Inc. of Miami, Florida was the geophysical contractor for this
project.
These surveys were performed in two phases: a preliminary methods
testing and base-line establishment phase and a confirmatory phase. The
preliminary phase utilized the following exploration methods:
• Ground Penetrating Radar
• Resistivity (Surface)
• Electro-Magnetic Induction (Shallow and Deep)
• Seismic Reflection and Refraction
• Magnetometry/Metal Detection
The surveys were conducted in areas of suspected contaminant migration,
historically documented problem areas, the canal proper, and several
relatively undisturbed areas.
65
-------
After the preliminary data were collected and reduced, an interim report
was prepared. The results of each method were evaluated by GCA and EPA to
determine the most effective strategy for Phase II. Seismic methods were
found to be inappropriate for the site conditions and therefore not continued
in Phase II.
The surveys were then continued over selected areas for a more refined
definition of the subsurface profile. Survey results were also used in
determiniiig potential areas for locating drill sites as well as screening
these areas for unknown objects.
Ground Water Modeling
Geotrans of Herndon, Virginia was selected to model both the shallow and
deep aquifers identified in the Niagara Falls area. They reviewed existing
hydrologic data, especially the work by Johnston (1964). The regional and
site specific geology was evaluated and a two-dimensional model was selected.
The historical data were used to calibrate the model and additional data were
made available by the other geotechnical contractors.
The data requirements for predicting flow rates and directions were
discussed and the aquifer test strategy planned. Geotrans assisted JRB in the
design and implementation of the aquifer tests. The tests included slug tests
in seven shallow bore holes, packer tests at 10 ft intervals to identify
permeable zones and short and long term pump tests to measure draw down and
determine if any interconnection between aquifers existed. The hydrologic
data were then reduced, plotted, and calculations performed to predict
migration.
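As an illustration only (not taken from the report or from Geotrans' actual
calculations), the kind of data reduction described above might look like the
following sketch; the formulas are the standard Cooper-Jacob and Darcy
relations, but all unit conversions, parameter names, and numerical values
are assumptions made for the example.

    # Hypothetical sketch of pump-test reduction and migration estimation.
    # Values are illustrative, not Love Canal data.
    import math

    def cooper_jacob_transmissivity(q_gpm, drawdown_per_log_cycle_ft):
        """Transmissivity (ft^2/day) from the Cooper-Jacob straight-line
        approximation to constant-rate pump test drawdown data."""
        q_cfd = q_gpm * 192.5  # gallons per minute -> cubic feet per day
        return 2.303 * q_cfd / (4.0 * math.pi * drawdown_per_log_cycle_ft)

    def seepage_velocity(k_ft_day, gradient, effective_porosity):
        """Average linear ground water velocity (ft/day), used to estimate
        how fast dissolved contaminants could migrate."""
        return k_ft_day * gradient / effective_porosity

    t = cooper_jacob_transmissivity(q_gpm=5.0, drawdown_per_log_cycle_ft=1.2)
    k = t / 40.0  # divide by an assumed 40 ft saturated thickness
    v = seepage_velocity(k, gradient=0.003, effective_porosity=0.05)
    print(f"T = {t:.0f} ft^2/day, K = {k:.1f} ft/day, v = {v:.2f} ft/day")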
Well Drilling
The well drilling contract was awarded to Empire Soils of Orchard Park,
New York. The driller was required to provide the necessary equipment and
materials to complete the anticipated 245 wells in 8 weeks. It was estimated
that a maximum of 10 drill rigs would be required to be in operation to meet
the scheduling requirements. The scope of drilling was later limited to
178 wells.
Those wells constructed outside the restricted area were completed to
just below the ground surface and housed in concrete vaults flush with the
surface. The wells completed within the fenced area were completed with at
least 3 ft steel riser pipes. All of the wells were provided with locking
caps.
Land Surveying
A subcontract was issued by Empire Soils to Rene Sanvageam of Niagara
Falls, New York to survey the well locations. For horizontal control the
wells were tied into Niagara Falls city datum to an accuracy of ±0.1 in. The
locations were then converted to state planar coordinates and plotted on a
1 in. to 200 ft scale map. Vertical control was established to an accuracy of
66
-------
0.001 in. The elevations were determined relative to mean sea level and were
surveyed to the top of casing, ground level, and several control points along
Cayuga and Bergholtz creeks and the Little Niagara River.
ROLE OF GCA
Due to the complexity of conducting field investigations with numerous
subcontractors, it was necessary for GCA to provide an onsite coordinator for
geotechnical activities. GCA was primarily responsible for interfacing EPA's
study requirements with the subcontractors. GCA's field office became a
central clearinghouse for pertinent information. Data which were acquired through
the field activities or outside research were promptly distributed to ensure a
high degree of performance. Examples of this type of data include: aerial
photographs, local and regional maps, and state planar coordinates for all the
sampling sites.
Drillsite Procurement
EPA provided GCA with a list of potential drillsites based partially on
the "base" sites previously established by EPA for the overall sampling
program. Additional sites were selected on the basis of availability,
geologic data, the location of swales, and numerous request sites. GCA
reviewed this list with EPA, JRB and Geotrans for additional site
recommendations. After obtaining concurrence from EPA on drillsite locations,
GCA issued drilling authorizations to JRB. Written authorization forms to
enter private property and install monitoring wells were obtained by GCA.
Properties belonging to the City of Niagara Falls were authorized for drilling
by the Mayor's Office through the EPA. State properties required special
permits which GCA obtained through the Department of Transportation,
Department of Environmental Conservation, and the Urban Development
Corporation.
A cooperative agreement was made between GCA, EPA, and the State of New
York whereby well logs were made available to the state in exchange for
permission to drill. At the completion of the project, a number of wells were
released to the State and the remainder were grouted and abandoned.
After the proper authorization was obtained for a given drillsite, a
special form (Figure 5-1) was filled out by GCA to tabulate pertinent
drillsite information which was distributed to JRB and EPA.
Prior to drilling at any of the sites, GCA contacted the following
utilities for clearance:
• Niagara-Mohawk
• New York Telephone
• National Fuel
67
-------
GCA/TECHNOLOGY DIVISION
DRILLSITE SELECTION
Location:
Owner:
Address:
Telephone:
Permission: Date:
Clearance: NY Telephone—
Niagara-Mohawk—
National Fuel—
Niagara Falls—Water Dept.
Sewer Dept.
Other—
Selection Criteria:
Drillsite Number:
Strata:
Site Code:
Coordinates: E
N
Elevation:
Well Type:
No. of Wells:
Comments:
Drill Dates—Begin: Complete:
Approval: Date:
Figure 5-1. Drillsite information form.
68
-------
• Niagara Falls Water Department
• Niagara Falls Sewer Department
Each drillsite was also screened by JRB using a metal detector to locate
any buried objects.
TOC/TOX Analytical Data
Preliminary ground water samples were collected by JRB and delivered to
the GCA Sample Bank. These samples were analyzed by Advanced Environmental
Systems in Niagara Falls, New York within 24 hours of sampling. The TOC/TOX
values were used to identify future ground water samples that might require
the special handling prescribed by the Department of Transportation for shipping.
TOX values were selected as the criteria for handling the well cuttings
(sediment and rock fragments) as hazardous materials.
The EPA (Ada) determined that a level of 500 ppm or greater would
constitute a hazardous substance. All well cuttings were drummed and labelled
for storage until the TOX values were determined. Those well cuttings from
which TOX values exceeded 500 ppm were disposed of as hazardous waste by CECOS
International of Niagara Falls, New York.
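A minimal sketch of the disposal decision rule stated above follows; only the
500 ppm threshold comes from the text, while the drum identifiers and TOX
values are hypothetical.

    # Hypothetical sketch of the cuttings-handling rule: drums whose TOX
    # result is 500 ppm or greater are routed to hazardous waste disposal.
    TOX_HAZARDOUS_PPM = 500

    def disposition(tox_ppm):
        if tox_ppm >= TOX_HAZARDOUS_PPM:
            return "hazardous waste - offsite disposal"
        return "nonhazardous"

    for drum, tox in [("drum-01", 120.0), ("drum-02", 730.0)]:  # invented values
        print(drum, tox, disposition(tox))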
Drilling fluids were contained in transfer trucks leased by Empire
Soils. The fluids were filtered through a series of sand and diatomaceous
earth filters before discharge into the Love Canal treatment plant collection
system.
Special Sampling and Hydrology
At the request of EPA (Ada), two separate ground water sampling programs
were conducted by GCA. Selected wells were resampled after extensive purging
to confirm the existence of abnormally high pH readings.
The EPA also requested additional hydrologic data from a number of key
wells. Continuous recorders were supplied by EPA and were operated for a
3-day period in November.
Data Evaluation and Reporting
After the completion of the field exploration phase of the geotechnical
task, GCA outlined, reviewed, made recommendations, and approved the reports
of the subcontractors. GCA also reviewed and evaluated the analytical data on
ground water quality with EPA at the R. S. Kerr Research Laboratory in Ada,
Oklahoma.
69
-------
SECTION 6
ANALYSIS
The magnitude and diversity of the analytical program for the Love Canal
Study required that a number of subcontractor laboratories be employed for its
conduct. GCA was responsible for procuring such subcontractors, implementing
the EPA analytical protocols, managing appropriate sample disbursement, and
monitoring subcontractor analytical progress. These activities are described
in some detail below.
INTRODUCTION
The analytical portion of the Love Canal Study measured volatile
organics,* pesticides, and metals in air; volatile organics, semivolatile
organics, pesticides, and metals in water, soil, sediment, and biota; fluoride
and nitrate in water; and additional parameters in ground water only.
Analytical protocols and required internal QC measures were provided by EPA.
These analytical methods are briefly stated below. Full analytical and QC
procedures can be found in Appendix B to the Project QA Plan.
The analysis of the Tenax cartridges for volatile organics in air was
accomplished by thermal desorption, cryogenic trapping and capillary column
gas chromatography/mass spectrometry (GC/MS). The polyurethane foam plugs
were extracted with 5 percent ether in hexane and, following column
chromatographic cleanup, analyzed for pesticides in air by electron
capture-gas chromatography (EC-GC). Analytical confirmation was by GC/MS.
The hi-vol filters were analyzed for metals in air by inductively coupled
plasma (ICP) and neutron activation analysis (NAA) by EMSL/RTP.
Water samples were analyzed by Methods 624, 625 and 608 for volatile
organics, semivolatile organics and pesticides, respectively. Analysis for
metals was by ICP or Atomic Absorption Spectroscopy (AAS) except that AAS was
required for mercury and selenium. In addition, ground water samples were
analyzed for Total Organic Carbon (TOC) and Total Organic Halides (TOX) and
their pH and conductivity were measured.
*Refer to Appendix A (pp. A-2) for compound hit lists and explanation of terms
used for compound classes.
70
-------
The soil and sediment samples were analyzed for volatile organics with a
modification of Method 624 for purgeables. In this analysis, a soil or
sediment sample was placed in the purge chamber, a prescribed amount of
organic-free water was added, and the sample purged for 12 minutes. The
sample was then analyzed by the standard purge and trap GC/MS system used in
Method 624.
The analysis of soils and sediments for semivolatiles was not so
straightforward in that a methods evaluation needed to be conducted on 48 samples to
determine which of three available procedures would produce the best results.
The data from these first 48 samples were evaluated by EPA and discussed with
GCA in order to determine the most suitable protocol to be used for the
remainder of the program. The selected method consisted of extraction using
an homogenization apparatus. Samples were extracted with methylene chloride at
pH 11 and then at pH 2 to obtain the base/neutral and acid extractables,
respectively. After concentration of the extracts, an aliquot of each was
taken to form a combined extract. The combined fraction was then analyzed by
capillary column GC/MS.
Pesticide analysis of soils and sediments required sample preparation
methods specific to each medium. For soils, a soxhlet extraction with
acetone-hexane was followed by extract concentration and successive cleanup by
aluminum oxide and Florisil columns. The sediment extraction procedure used
column elution with 1:1 acetone:hexane, washing with water, and extraction of
the pesticides from the water by methylene chloride in hexane. In both cases,
the analysis of the extracted pesticides was accomplished using Method 608
(EC-GC). The priority pollutant metals in soils and sediments were determined
by AAS after an acid digestion.
Various biota samples were analyzed for organic and inorganic species of
interest to the Love Canal Study. Biological materials, specifically mice,
worms, and crayfish, were analyzed for semivolatile organics and pesticides
using both GC/Hall and GC/MS after appropriate digestion and extraction
procedures. Potatoes and oatmeal placed at the sampling sites were designated
as "foodstuffs" and analyzed for volatile organic compounds. The analytical
method consisted of enclosing the sample with an acid solution in a septum
vial, digesting the sample to generate a headspace sample within the vial, and
analyzing the headspace for the hitlist compounds by GC/Hall. Metals were
measured in vegetation by ICP and AAS after acid digestion and in hair by AAS,
also after acid digestion.
SUBCONTRACTOR SELECTION
Understanding Program Requirements and Bid Package Preparation
The initial work performed by GCA involved the familiarization of GCA
staff with the methodology requirements and the stage of development of the
program. This activity entailed a series of meetings with the various EPA
groups during the period of 9 July to 19 July 1980. It became clear very
early on that the methods were, with some exceptions, very near implementation
71
-------
and that GCA's role was to execute the program using the analytical methods
specified by EPA and to obtain data consistent with the capabilities of the
methods. Our objective then was to document the analytical methods and EPA-
required quality control measures in sufficient detail that bid packages and,
subsequently, analytical procedures manuals could be prepared.
In order to prepare the bid packages, GCA staff consulted with the
responsible EPA personnel or their designees for each medium to be addressed.
Brief summaries of the results of these consultations and specific problem
areas are given below for each medium.
Air—
EMSL/RTP had responsibility for the volatile organics and trace metals
and HERL/RTP had responsibility for semivolatile organics in air. Because
EMSL/RTP elected to perform the trace metal analyses in-house, no
subcontractors were solicited for these analyses. For the organic species,
each EPA lab provided a list of targeted compounds (see Appendix A, Tables A-l
and A-2) and protocols for the analyses outlining both the methodology to be
used and the quality assurance requirements. GCA developed, in conjunction
with the EPA* labs, the instrument performance specifications and the type and
amount of quality control necessary to meet the QA requirements. These
performance specifications and QC requirements were incorporated into the
appropriate analytical protocols; the number of QA samples to be analyzed was
added to the field sample totals to obtain the number of samples requiring
analysis. The volatile organic analyses were designated as Task I and the
semivolatile requirements as Task II of the bid package.
Crucial to this issue of organics in air analysis was the provision of
the specified collection media. Because the sampling effort was to commence
immediately, in order to meet the time constraints of the program, the
solicitation of competitive bids for this portion of the program was
precluded. This fact was fully recognized by both EPA labs involved.
Therefore, GCA solicited, in each collection medium, bids from organizations
recommended by the EPA labs** and proceeded to let sole source subcontracts
to these organizations for the preparation of the collection media. PEDCo
Environmental was subcontracted to perform preliminary cleanup on the Tenax
resin and to prepare sampling cartridges. SWRI was subcontracted to perform
cleanup on the PUFs.
*For the volatile compounds collected on Tenax, the staff of RTI was also
consulted (vide post).
**The recommendations made by EPA were, in the opinions of the responsible GCA
staff, clearly as qualified as any laboratories that could have been
solicited for the media preparation tasks.
72
-------
The procurement of the collection media differed for volatiles and
semivolatiles. Because HERL/RTP had a sufficient quantity of PUF plugs for
the collection of semivolatile organics, no external procurement need existed;
this medium was simply supplied to the selected subcontractor. For the
volatile organics, GCA elected to procure a sufficient quantity of Tenax and
to provide the required amount to the selected subcontractor.
Water—
A hit list of compounds was developed by the responsible EPA
laboratories. This list included the "priority pollutant" compounds and
several nonpriority pollutant species that were suspected to be present in the
area. (See Table A-3 in Appendix A.) The Federal Register (FR) methods
(Methods 624, 625, 608) were specified as the analytical protocols for organic
compounds and the ICP option for the metals.
Prior to commencement of the program, it was recognized by EPA that all
compounds on the hit list were not amenable to analysis by the specified FR
procedures. To address these problems, EPA provided a statement of the
problems and recommended solutions as footnotes to the compound hit list. The
problems that needed resolution are listed in Table 6.1. For items 1 and 6
(acrolein and acrylonitrile and benzidine), the alternate analytical methods
would be used for quantitation if Method 624 or Method 625, respectively,
detected their presence. Items 2 and 3 (N-nitrosodimethylamine and
hexachlorocyclopentadiene) would be resolved by the use of fused silica
capillary column methodology. For items 4 and 5 (1,2-diphenylhydrazine and
N-nitrosodiphenylamine), the indicated reporting procedure would be used.
Because all samples were being analyzed by Method 608 for pesticides and PCBs,
the 608 extracts would be available for GC/MS analysis. This additional
specification was added to the protocols; i.e., if Method 608 compounds were
observed, at concentration levels observable by GC/MS, then GC/MS confirmation
of their presence was required.
In order to detect compounds of potential interest that were not included
on the list, the requirement to identify, for each sample, up to 20 additional
compounds present at relative concentrations above a specified level was added
to the protocols. Finally, for reasons of cost-effectiveness, it was desired
to perform the GC/MS analysis of Method 625 compounds by combining the acid
and base/neutral extracts into a single sample; also, for compounds that
partition into both the acid and base/neutral extracts, higher quality data
would result when extracts were combined. Because existing data indicated
that the fused silica capillary column methodology was capable of the
resolution required, the combination procedure was added to the protocols as
an option. This option would be exercised only after additional data were
collected and deemed satisfactory. The additional data would be generated by
each subcontractor analyzing the first 15 samples using both combined and
individual extracts.
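The "up to 20 additional compounds" requirement described above amounts to a
simple selection rule over the non-target peaks in each sample. The sketch
below is hypothetical; the relative-concentration threshold, the peak data,
and the function names are assumptions for illustration, not part of the EPA
protocol text.

    # Hypothetical sketch: report up to 20 non-hit-list compounds whose
    # relative concentration (response relative to an internal standard)
    # exceeds an assumed threshold.
    def additional_compounds(peak_areas, hit_list, istd_area,
                             rel_threshold=0.10, max_report=20):
        extras = {name: area / istd_area
                  for name, area in peak_areas.items() if name not in hit_list}
        ranked = sorted(extras.items(), key=lambda kv: kv[1], reverse=True)
        return [name for name, rel in ranked if rel >= rel_threshold][:max_report]

    peaks = {"toluene": 5000.0, "unknown alkane": 1200.0, "siloxane": 90.0}
    print(additional_compounds(peaks, hit_list={"toluene"}, istd_area=10000.0))
    # -> ['unknown alkane']  (the siloxane falls below the assumed threshold)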
73
-------
TABLE 6.1. PROBLEM COMPOUNDS
1. Acrolein and Acrylonitrile
These compounds may be observed to about 1 milligram per liter with
Method 624. If detected the quantitative analysis must be accomplished
by purging a second aliquot of the sample at 85°C as described in Method
603.
2. N-nitrosodimethylamine
This compound is not resolved from the solvent with the standard columns
required for Method 625. It may be resolved with a 30 meter fused silica
SE 54 capillary column.
3. Hexachlorocyclopentadiene
This compound decomposes in the injection port at the temperature
required for Method 625. A second injection at a low injection port
temperature is required to observe this compound when using the standard
Method 625 columns. However, this compound is sufficiently stable for
analysis with the 30 meter fused silica SE 54 capillary column.
4. 1,2-diphenylhydrazine
This compound decomposes in the GC injection port to azobenzene, and
cannot be distinguished from it by 70 eV mass spectrometry. If observed
it must be reported as 1,2-diphenylhydrazine or azobenzene.
5. N-nitrosodiphenylamine
This compound decomposes in the GC injection port to diphenylamine, and
cannot be distinguished from it by 70 eV mass spectrometry. If observed
it must be reported as N-nitrosodiphenylamine or diphenylamine.
6. Benzidine
This compound is decomposed during the solvent concentration and drying
steps of Method 625. It may be observed, but the concentration
measurement cannot be considered reliable. It must be confirmed using an
alternative method such as Method 605 (Federal Register, December 3,
1979).
7. Alpha-, Beta-, Gamma-, and Delta-BHC, Endosulfan I&II, and Endrin
These compounds are sensitive to the pH=12 extraction conditions of
Method 625 and are not observed under normal conditions. These compounds
must be measured using Method 608 with confirmation by GC/MS using the
Method 608 extracts.
74
-------
It was GCA's responsibility to incorporate the modifications and
additional specifications required to meet the program needs and provide
written procedures. As for the air media, the quality control measures
necessary to monitor control limits and to characterize the quality of the
resulting data were determined in conjunction with EPA, and these requirements
were added to the bid packages.
In addition, GCA assumed the responsibility of procuring certain
materials to be supplied to the subcontractors; the materials supplied to the
subcontractors will be discussed in more detail in the following sections.
Soils and Sediments—
EMSL/LV was the EPA laboratory responsible for the soil and sediment
analytical protocols. The list of compounds targeted for analysis was the
same as the water list. The methods selected for these analyses were
basically modifications of the 624, 625 and 608 methods for organics.
Standard preparation procedures with an atomic absorption analytical finish
were selected for the metal analyses.
It was recognized by EPA, at the outset of the program, that specifying
procedures adequate for the preparation of soil and sediment samples for
organic analysis was problematic because no standard, or even widely accepted,
protocols existed. For this reason, EPA requested that a method study be
conducted as part of the program. In order to conduct this study, EPA
provided three procedures to be evaluated. These procedures would be
evaluated by analyzing the first 50 samples (40 soils and 10 sediments) by all
three procedures. The data generated from this study would be provided to EPA
in order to assist in selecting the method most appropriate to analyze the
remainder of the samples. GCA documented these requirements and incorporated
the study into the bid package as a separate task (designated as Task IV).
The methods study was made a separate task principally to facilitate the
bidding process and to allow evaluation of proposals in terms of the methods
study experience of the individual offerors.
The analytical work required for the remainder of the organic analysis of
soil and sediment samples was designated as Task V in the bid packages.
Offerors interested in bidding Task V were required to provide individual bids
for each of the three procedures. As for the other tasks, the internal
quality control requirements were specified and the external quality control
and quality assurance samples were added to the sample totals for bidding
purposes.
As mentioned above, GCA assumed responsibility for providing to the
selected subcontractors certain materials required to complete the program.
These included:
• Fused Silica Capillary Columns
• Analytical Standards
75
-------
• Internal Standards
• Surrogate Compounds
• Quality Control Samples
The specifications for and source of these materials were provided by the
responsible EPA laboratories. GCA procured a sufficient number of narrow bore
fused silica capillary columns (from the same production run) to provide each
subcontractor with at least one column. GCA procured the internal standards
and surrogate compounds and provided them to EMSL/LV for disbursement.
Quality control samples (sealed ampules of concentrates) were provided to GCA
by EMSL/CI for disbursement to the subcontractors. At the request of GCA,
EMSL/LV (via the EPA standards repository) provided the analytical standards
to the subcontractors.
As for the water samples, the option of analyzing the sample extracts by
combining certain extracts was included in the bid packages. This led to
several configurations to be bid upon because extract combinations, if any,
would depend on the preparatory method selected. The extract combination
decision was not made part of the methods study but was deferred until an
extraction procedure was selected. If extract combination remained an option
after determining the extraction procedure, then combined and uncombined
extracts would be analyzed on a sufficient number of samples to make the
decision.
Biota—
EMSL/LV had responsibility for the biota program. At the time the bid
packages were being prepared, the design of the biota program was not
specified. However, analytical procedures for semivolatile organics in
specific biota were provided that were sufficient to obtain competitive bids.
Therefore, the procedures were incorporated into the bid package along with
the QC and QA requirements. The analytical effort for biota was designated
Task VI.
Bidders Conference
Because of the stringent time frame of the program, a formal CBD
announcement of the subcontracting requirements could not be made. Therefore,
GCA contacted a number of organizations potentially qualified to perform the
analytical work and invited them to a bidders conference. We began with lists
of potentially qualified companies provided by the EPA laboratories and
supplemented these lists with other companies which we felt qualified. In
addition, all organizations who gained knowledge, by whatever means, of the
upcoming bidders conference were invited to participate. Table 6.2 lists
these prospective bidders. GCA felt that a bidders conference was mandatory
because of the time constraints which essentially precluded any extensions in
bid response time for scope of work clarifications.
76
-------
TABLE 6.2. PROSPECTIVE BIDDERS LIST
Company
Attended
bidders conference
Submitted
proposal/bid
Acurex Corp.
485 Clyde Avenue
Mountain View, CA
94042
Battelle Memorial Institute
505 Kings Avenue
Columbus, OH 43201
California Analytical Labs
401 N. 16th Street
Sacramento, CA 95814
Environmental Science & Engineering
P.O. Box ESE
Gainesville, FL 32604
Geomet
15 Firstfield Road
Gaithersburg, MD 20760
Gulf South Research Institute
P.O. Box 1177
New Iberia, LA 70560
IIT Research Institute
10 West 35th Street
Chicago, IL 60616
Jacobs Engineering (PJB)
251 S. Lake Avenue
Pasadena, CA 91101
Midwest Research Institute
452 Volker Blvd.
Kansas City, MO 64110
Radian Corporation
P.O. Box 9948
Austin, TX 78766
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
Yes
No
Yes
Yes
Yes
Yes
Yes
(continued)
77
-------
TABLE 6.2 (continued)
Company
Attended
bidders conference
Submitted
proposal/bid
PEDCO Environmental
11499 Chester Road
Cincinnati, OH 45246
Systems Science Software
P.O. Box 1620
LaJolla, CA 92038
Southwest Research Institute
P.O. Box 28510
5220 Culebra Road
San Antonio, TX 78284
TRW
3 New England Executive Park
Burlington, MA 01803
University Hygienic Lab
University of Iowa
Oakdale Campus
Iowa City, IA 52242
Monsanto Research
Station B, Box 8
Dayton, OH 45407
OH Materials
P.O. Box 1022
Findlay, OH 45840
Science Applications, Inc.
464 Prospect Street
P.O. Box 2351
LaJolla, CA 92038
ERCO
185 Alewife Brook Parkway
Cambridge, MA 02183
Yes
Yes
Yes
Yes
No
Yes
Yes
Yes
Yes
Yes
No
Yes
Yes
No
No
Yes
Yes
Yes
(continued)
78
-------
TABLE 6.2 (continued)
Attended Submitted
Company bidders conference proposal/bid
Measurement Sciences Corp. Yes Yes
300 Garden City Plaza
Garden City, NY 11530
Compuchem No Yes
A Division of Mead Corporation
5 Triangle Drive
Research Triangle Park, NC 27709
Kemron Environmental Services No Yes
32740 Northwestern Highway
Farmington Hills, MI 48018
Advanced Environmental Systems, Inc. No Yes
Monitoring and Support Laboratory
P.O. Box 165
Niagara Falls, NY 14304
79
-------
A bidders conference was held on July 22, 1980. The conference was
attended by representatives from 19 organizations as indicated in Table 6.2.
Each organization was provided a copy of the RFP which included, for each
task, a scope of work, a statement of the deliverables, and the procedures to
be used. The scope of work was addressed in some detail verbally with
reference to specific procedures and other points in order to introduce the
program requirements to the potential offerors in enough detail to provoke
specific questions.
The minutes of the conference were recorded by a qualified court reporter.
Proposal Evaluation
Proposals were received from 19 organizations as indicated in Table 6.2.
One proposal was not accepted because it was received late (the following
day). All proposals were received by GCA Contracts Administration where the
technical and cost proposals were separated. After receiving all proposals,
the technical portions were provided to the technical staff for review.
The technical proposal review was conducted in three phases. Each phase
of review focused on one or more of the proposal review criteria set forth in
the RFP package. These criteria are listed in Table 6.3. First, the
proposals were reviewed in terms of their responsiveness, in general, to all
criteria. This overall evaluation emphasized quality assurance management and
experience in the analysis of environmental samples. All bidders that (1)
failed to provide a description of their QA/QC program, (2) did not have
experience in the analysis of environmental samples, or (3) did not delineate
their management approach were eliminated. This phase resulted in the
elimination of three offerors.
The second phase of the review dealt with determining the specific
qualifications of the offeror versus the individual tasks they bid. In order
to be considered further as a candidate for a specific task, the offeror must
have had experience with the specified method, have the necessary facilities
and equipment (except those to be provided by GCA), and have the qualified
personnel required. This phase of the review resulted in a listing of
potential subcontractors grouped according to "task qualified for."
In the third phase, the capacities and the preferences (statements of
preference were solicited) of the qualified offerors were considered. Some
offerors were qualified for more than one task but stated a preference for a
particular task and/or stated the impact on their ability to perform a task if
awarded certain other tasks. Finally, while some offerors were qualified to
perform certain tasks, the quantity of analyses bid was very small compared to
the total program needs. These offerors were not eliminated from
consideration; however, they would not be considered as prime candidates
unless necessary to meet the program requirements. The data from this phase
were used to construct matrices of possible award configurations.
80
-------
TABLE 6.3. TECHNICAL EVALUATION CRITERIA
Responses to the subject RFP will be evaluated using the following equally
weighted criteria:
I. Offeror's immediate availability to initiate analysis, ability to respond
on short notice with sufficient analytical personnel, and ability to
complete the sample analysis program by 30 November 1980. The
demonstrated analysis throughput of each laboratory will be one measure
of these abilities.
II. Offeror's Quality Assurance/Quality Control Plan.
III. Offeror's experience in analyzing environmental pollutants in air, water,
soils, sediment and biota, as appropriate to your bid.
IV. Offeror's availability of appropriate instrumentation and other
facilities required to complete the program.
V. Offeror's proposed overall program management including qualifications of
assigned personnel; i.e., experience, capability, and percentage of
effort devoted to the program.
81
-------
Next, the cost proposal data were added to the configuration matrices
developed from the technical review. At this point an analytical program
configuration that was believed to be in the best interest of GCA and EPA was
constructed. It should be noted that in the recommended configuration, the
offerors would not receive as many samples as their stated capacity but
rather, in most cases, about 50 percent of the stated capacity.
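The award-configuration arithmetic mentioned above can be pictured with a
small, hypothetical sketch; the laboratory names, capacities, sample counts,
and the exact 50 percent loading target are placeholders chosen for
illustration, not the program's actual figures.

    # Hypothetical sketch: spread a task's samples over qualified offerors,
    # loading each laboratory at roughly half of its stated capacity.
    def allocate(total_samples, stated_capacity, load_fraction=0.5):
        targets = {lab: cap * load_fraction for lab, cap in stated_capacity.items()}
        scale = min(1.0, total_samples / sum(targets.values()))
        return {lab: round(t * scale) for lab, t in targets.items()}

    print(allocate(600, {"Lab A": 500, "Lab B": 400, "Lab C": 300}))
    # -> {'Lab A': 250, 'Lab B': 200, 'Lab C': 150}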
Recommendations to EPA and Contract Awards
GCA staff presented the recommended configuration to EPA on August 6,
1980. The presentation was made in a series of meetings at RTP and by
conference call. Representatives from EMSL/RTP and HERL/RTP (air media),
ERL/Ada, Oklahoma (ground water), and EMSL/CI (water QA) attended the meetings.
In addition to the data used to construct the recommended configuration, all
information resulting from the evaluation was made available to the
responsible EPA staff. One offeror in the air analysis area was questioned by
EPA. EPA staff indicated a problem with this offeror's lab that surfaced
during previous work at Love Canal. The problem was clearly defined and
involved the offeror's failure to adequately prepare the sampling medium.
Because this offeror would not be preparing the sampling medium and because it
was qualified to perform the analytical task, it was agreed that its selection
would not be detrimental to the program.
The presentation of GCA recommendations was then made to EMSL/LV, via a
conference call, regarding their program responsibilities. Again one offeror
was questioned as to its ability to deliver on schedule. GCA explained that
this offeror had been explicit and emphatic in its proposal regarding the
issue of capacity. Nevertheless, EPA directed GCA to visit this offeror's
facility and if, as a result of the visit, GCA remained convinced of its
ability to deliver on schedule, then the offeror would be acceptable. The
visit was conducted and it was concluded that the offeror was definitely
capable of performing on schedule.
The meetings resulted in approval of GCA recommendations. The
subcontracts were then negotiated and prepared for EPA Contracts' approval and
bilateral execution. At this point the analytical program was configured as
shown in Table 6.4. The subcontract with MRI was not finalized until later in
the program; i.e., after final selection of the biota media and the analytical
methods to be used. Also, SWRI's subcontract was subsequently modified to
include a portion of the biota analyses (trace metals in biota and volatile
organics in foodstuffs) and Task VII.
After some problems emerged with one of the subcontractors (vide post), a
subcontract was negotiated with TRW, Inc. of Redondo Beach, CA for the
analysis of water samples. In addition, several sole source subcontracts were
let based on EPA recommendations and additional program requirements.
Research Triangle Institute (RTI) was subcontracted to prepare spiked standard
and QC Tenax cartridges for use in the air monitoring program. As an
82
-------
TABLE 6.4. ANALYTICAL PROGRAM CONFIGURATION
Subcontractor I II
Acurex Corp.
Battelle X
CompuChem
ERCO
GSRI X
MRI
PEDCOa X
PJB
SWRIb X
III IV V VI VII
XXX
X X
X
X X
X
XXX
X X
a PEDCO was also selected to prepare the Tenax
sampling media.
b SWRI was also selected to prepare the PUF sampling
media.
83
-------
additional check on the Tenax QA/QC, IIT Research Institute (IITRI) was
subcontracted to analyze a number of the standard and spiked Tenax cartridges
using independent calibration. Advanced Environmental Systems (AES) was
subcontracted to perform TOC, TOX, pH, and conductivity measurements on ground
water samples prior to their shipment from the Sample Bank so that highly
dangerous samples could be appropriately handled. AES was chosen, in part,
because of their location in Niagara Falls, NY. The final analytical
subcontractor, Wright State University (WSU), was chosen to perform dioxin
analyses in all media. WSU was recommended by HERL/RTP, which directed the
dioxin study, because they were known to be experienced in the analysis of
dioxin-containing samples, had a high resolution mass spectrometer for use in
this program, and demonstrated a safety program adequate for the handling of
dioxins. This brought the total number of subcontractors to 14 organizations
involved in the analytical portion of the program.
PROGRAM MANAGEMENT
The role of GCA in establishing and conducting the analytical program of
the Love Canal Study is described below for each medium. Despite its seeming
to be a rather routine matter of passing on information from one organization
to another, the time constraints placed on project start-up and the need to
coordinate activities of 5 EPA laboratories, 12 analytical subcontractors and
2 analytical QA subcontractors proved hectic in the early weeks of the
program. Many of the internal QC measures and control limits were finalized
as subcontracts were being prepared and required considerable discussion with
EPA personnel, the potential subcontractors and members of GCA's analytical
staff. After all analytical and quality control procedures had been
established and the program was underway, similar discussions occurred if
analytical subcontractors or EPA noted chronic problems with a measurement or
sample type. Substantial effort was dedicated to keeping the subcontractors
apprised of analytical problems encountered by other laboratories which were
relevant to their own activities on this project. Likewise, EPA personnel
were informed of analytical problems requiring their action and, as necessary,
were updated on current progress of the subcontractors.
GCA technical staff members maintained project notebooks documenting
their activities on the Love Canal Study. Excerpts from these notebooks
giving examples of actions taken by GCA are given below. It is expected that
these examples will serve to highlight some of the more important aspects of
GCA's involvement in the study and will demonstrate the variety of activities
conducted in fulfilling the needs of so diverse a study.
Air—
The monitoring program for ambient air samples collected in the Love
Canal area was designed to provide information on three types of chemical
substances. Each general class of pollutants* required its own specific
sample collection method resulting in three distinct sample types, each with a
prescribed method of analysis and differing QC requirements. In addition, QC
requirements were specified by the EPA laboratories involved in the air
monitoring program. As stated before, volatile organic compounds were
-------
collected on Tenax-GC resin and their monitoring was the responsibility of
EMSL/RTP. Pesticides and related compounds were collected on polyurethane
foam plugs in a segment of the program supervised by HERL/RTP, and metals were
monitored on hi-vol filter samples under the direction of another group at
EMSL/RTP. These three groups designated different degrees and areas of
responsibility to GCA for the performance of the air monitoring program.
Tenax cartridges—Selected volatile organic species were measured in
ambient air as part of the Love Canal Study. Samples were collected by
drawing specified volumes of air through glass cartridges packed with
Tenax-GC. Analysis of these cartridges, or traps, was then accomplished by
thermally desorbing the Tenax into a GC/MS system for quantitation and
identification of the specified compounds.
GCA's role in the implementation and day-to-day management of this
program consisted of collection and subsequent distribution of the analytical
data and monitoring the internal QC data. As previously discussed, GCA, in
consultation with technical personnel from EMSL/RTP, selected subcontractors
for sample analysis, Tenax preparation, and QA activities. These
subcontractors and their roles were as follows:
• Research Triangle Institute (RTI) prepared standard calibration
cartridges as well as blind quality assurance spiked traps. RTI was
also responsible for checking background-level contamination of trap
batches.
• PEDCo Environmental was selected as an analytical subcontractor.
PEDCo was also responsible for preparing the traps by first cleaning
the Tenax and then packing the tubes and checking each batch of
tubes for background contamination.
• Battelle Columbus Laboratories was selected as an analytical
subcontractor.
• IIT Research Institute (IITRI) was chosen to perform analyses on
calibration and check traps for accuracy and precision.
At a meeting in Research Triangle Park on 26 September 1980, a discussion
of problems associated with the analyses was held. Attending the meeting were
representatives from EPA, GCA, Battelle Columbus Laboratories, PEDCo
Environmental, and Research Triangle Institute. Problems in shipping, broken
traps, benzene and toluene contamination of the Tenax, and problems with
meeting QC criteria, especially by Battelle, were discussed. Resolution of
some of these problems was accomplished as described below.
*Refer to Appendix A (pp. A-2) for compound hit lists and explanation of terms
used for compound classes.
85
-------
To ensure compliance with column performance specifications, PEDCo was to
purchase a capillary column from Battelle. Also, GCA was to inform its
sampling subcontractor to pack traps more securely in their culture tubes
after sampling. This resulted in a drop in the breakage rate, although some
inevitable damage still occurred.
Problems with low levels of benzene and toluene contamination of the
Tenax (approximately 20 ng/tube) continued throughout the program. QC
monitoring of one batch of tubes (PEI-O) resulted in the recleaning of the
batch because of high contamination levels. It was determined that low levels
of contamination could not be prevented. In addition, because contaminant
levels were observed to rise with the age of the tube, strict monitoring of
the 30-day limit between tube preparation and analysis was observed.
It was also determined at the meeting that GCA would find another
analytical subcontractor to perform analysis on standard and check traps to
determine their precision and accuracy. In early October 1980, GCA contacted
IIT Research Institute requesting quotes on analytical services to include the
analysis of nine traps (six standards and three quality assurance traps).
IITRI was not to be notified of concentrations of compounds in the traps,
rather, they were to set up calibrations using their own systems and to
quantitate versus these calibrations. In this manner, a check would be made
on the standards preparation. GCA amended the contract of RTI to provide a
redistribution of QA and standard traps to facilitate this test. This effort
resulted in the external verification of 3 batches of spiked tubes (from a
total of 12 batches). Results of these analyses were to be submitted to GCA
within 7 days of the analysis and forwarded to EMSL/RTP. IITRI was selected
as subcontractor for this work and received their first set of samples on 28
October 1980.
GCA continued to monitor the quality control sample results for both
subcontractors following onsite audits in October 1980. At the request of
EMSL/RTP, results of all calibration check samples were submitted verbally as
soon as available to GCA. After compilation and statistical analysis, these
data were then relayed to EMSL/RTP where potential or existing problems were
discussed. This compilation of data continued until the end of the program.
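A sketch of the kind of compilation and statistical summary described above is
given below; the recovery values and the control-limit convention (mean plus
or minus three standard deviations) are assumptions made for illustration, not
EMSL/RTP's actual acceptance limits.

    # Hypothetical sketch: summarize calibration check recoveries before
    # relaying them to EMSL/RTP. Values are invented.
    import statistics

    def summarize_checks(recoveries_pct):
        mean = statistics.mean(recoveries_pct)
        sd = statistics.stdev(recoveries_pct)
        return {"n": len(recoveries_pct), "mean": round(mean, 1),
                "sd": round(sd, 1),
                "lower_limit": round(mean - 3 * sd, 1),
                "upper_limit": round(mean + 3 * sd, 1)}

    print(summarize_checks([92.0, 88.0, 101.0, 95.0, 84.0, 97.0]))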
GCA distributed field samples to the subcontractor labs so as to prevent
overloading and to enable the labs to meet the stringent time compliance
requisites for the analysis. As previously mentioned, monitoring of analysis
dates was necessary due to contaminant levels. GCA established a schedule
involving tube preparation, background determination, standard tube spiking,
sampling, and analysis. Using chain of custody and shipping documentation,
each batch of tubes was monitored to ensure that all traps were sent to the
analytical laboratories no more than 10 days after preparation and were
analyzed within 20 days of receipt.
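The schedule monitoring described above reduces to a few date comparisons per
trap. The sketch below is hypothetical (the project tracked these limits
through paper chain-of-custody and shipping records, not software), and the
dates shown are invented; only the 10-day, 20-day, and 30-day limits come from
the text.

    # Hypothetical sketch of the holding-time checks applied to each Tenax trap.
    from datetime import date

    def holding_time_flags(prepared, shipped_to_lab, received, analyzed):
        flags = []
        if (shipped_to_lab - prepared).days > 10:
            flags.append("sent to laboratory more than 10 days after preparation")
        if (analyzed - received).days > 20:
            flags.append("analyzed more than 20 days after receipt")
        if (analyzed - prepared).days > 30:
            flags.append("exceeds 30-day preparation-to-analysis limit")
        return flags or ["within limits"]

    print(holding_time_flags(date(1980, 9, 1), date(1980, 9, 8),
                             date(1980, 9, 10), date(1980, 9, 26)))
    # -> ['within limits']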
86
-------
GCA also served as an informational intermediary for EPA, gathering such
data as spiking techniques for both standards and internal standards,
specifics on tube preparation, levels of contaminants, and quantitative
techniques. Problems encountered by the subcontracting laboratories were
relayed through GCA to EPA, and, when required, followed up by GCA personnel
and resolved in consultation with EMSL/RTP. For example, PEDCo had analyzed
several tubes with high enough concentrations of a compound to cause
instrument shutdown. After consultation with EMSL/RTP, it was determined that
the problem compound was water, which had been trapped on the tube during
sampling. Visual examination of the problem tubes prior to analysis had shown
some moisture in the culture tubes enclosing the traps. A check of
meteorological data from the sample date confirmed high humidity. To rectify
the problem, it was decided that any traps showing condensation would first be
dried in the culture tubes with sodium sulfate. No further problems of this
sort were encountered.
Finally, GCA was requested by EMSL/RTP to obtain data on absolute area
counts for internal standards spiked onto each sample tube by the
subcontracting laboratories. Battelle had submitted this as part of its data
package, and a slight modification to the computerized data reading algorithm
allowed this listing. PEDCo, however, had not included these data and a
further subcontract was awarded to complete this listing.
Polyurethane foam plugs (PUFs)—The analytical procedure for pesticides
in polyurethane foam plugs was initially provided to the subcontracting
laboratories in the Appendices to the analytical RFP. This method package had
been obtained from HERL/RTP, the EPA laboratory responsible for this program
area. It included extraction, concentration, clean-up and EC-GC procedures
but did not address the specifics of quality control. Extensive discussion
was required to establish a QC program which would be satisfactory to the
goals of HERL and the overall project and which could also be reasonably met
by the analytical subcontractors. One of GCA's analytical QC coordinators was
responsible for initiating this discussion with HERL and, as necessary and as
requested by HERL, obtaining input from the PUF subcontractors regarding
reasonable frequency and control limits. While the ultimate decision for this
area of the study was made by HERL, the subcontracting laboratories also were
involved in designing the internal QC program. GCA served as a mediator and
was responsible for all parties involved having correct and complete
information. This series of information exchanges culminated in
correspondence from HERL/RTP to GCA on August 27, 1980. This memorandum
served to clarify and confirm internal QC procedures for the polyurethane foam
plugs analyses. Summaries of entries in Notebook No. 1-619-026-124 of the
Analytical QC Coordinator responsible for establishing and implementing this
area of the program, illustrate GCA's role as follows:
87
-------
Date Notebook Entry
8/10/80 This first entry regarding PUFs analysis was primarily an
exchange of information within GCA. Ms. B. Myatt was preparing
analytical protocols for inclusion in subcontract materials and
needed the internal QC procedures thus far established by
EPA-HERL in consultation with Dr. S. Zelenski of GCA. It was
determined that one (1) method blank and one (1) spike recovery
would be processed with each batch of extracts and that
acceptable recovery ranges would be established by data
collected in the Love Canal Study (a sketch of this per-batch
check follows these notebook entries).
8/12/80 It was recognized that several items of QC specified in other
protocols were not included in the PUFs analysis. These
included standardization and calibration of the instrument(s),
a list of compounds to be used in lab control standards (LCS)
and the use of surrogates or replicates. A series of
conversations were recorded between Ms. Myatt and Dr. Zelenski,
Merrill Jackson of HERL and Randall Watts of HERL. Details of
the requisite QC operations were given and the QC protocol for
inclusion in the PUFs subcontracts was delineated.
8/22/80 Some questions had arisen within GCA as to whether some of the
internal QC requirements were too stringent (i.e., calibration
check every 2 hours when only running one sample per hour by
GC). Discussion with Dr. K. McGregor of GCA and Dr. R. Lewis
of HERL led to Ms. Myatt's being asked to contact the two
analytical subcontractors involved to get their opinions on
appropriate calibration and standardization procedures to meet
quality needs and also to be consistent with the sample
throughput required. Also, analysis of phenols by HPLC had
been approved in the RFP for this task but GCA was having
difficulty obtaining the method which was still in the
prepublication stage. If the subcontractors were not using the
HPLC, it would be omitted from the analytical methods manual.
8/22/80 and Telecons with Dr. E. McGovern of SWRI and Dr. R. Novak of GSRI
8/25/80 were recorded. These communications summarized each
laboratory's usual standardization procedures for GC and
recorded their agreement to meet required instrument response
and detection limits. Points were brought up (internal
standard) which necessitated further discussion with HERL/RTP.
8/26/80 Telecons with R. Watts and M. Jackson of HERL and McGovern and
Novak were recorded. Information was transmitted regarding
achievable standard checks and possible compounds for internal
standards. The subcontractors were contacted directly by HERL
to finalize details. HERL was to call B. Myatt regarding the
resolution.
88
-------
8/27/80 R. Watts of HERL called B. Myatt to give final standardization
procedures and QC limits. The HPLC method was released and
assigned an EPA report number, and correction factors for
compound collection efficiency were transmitted. The
memorandum cited above was written and sent to GCA as
confirmation of the finalized procedures. It included the PUF
sampling and HPLC analytical procedures. M. Jackson of HERL
was contacted regarding shipment of standard compounds to the
two subcontractors. Appropriate chain of custody forms were
sent to him, and the addresses to which materials were to be
sent were transmitted.
This documented the set-up of the internal QC program for analysis of air
samples collected on polyurethane foam plugs. Following this start-up and
implementation, GCA's role in the PUFs analysis became one of managing sample
distribution and analysis and acting as go-between for questions initiated by
EPA or the subcontracting laboratories. Very few questions arose in these
analyses.
Hi-Vol filters—The analysis of hi-vol filters for selected metals was an
area of the Love Canal Study under complete control of the EPA laboratory
(EMSL/RTP). This program was not subcontracted through GCA but was conducted
totally by EMSL/RTP. GCA requested that, for the sake of completeness, the
analytical procedures and appropriate QC measures be provided for inclusion in
Appendix B to the Project QA Plan. These were received from EPA and formatted
to be compatible with Appendix B. Through a recently discovered oversight,
the Neutron Activation Analysis procedure used for Arsenic was not included.
However, a written analytical procedure was followed and is included in
Appendix A to this Final Report. Data sheets and a Coding Manual were
provided to the EPA analytical laboratory so that results could be included in
the Love Canal Study data base.
Water—
The water monitoring program for the Love Canal Study was performed under
the direction of several EPA laboratories and included the analysis of sump,
ground, drinking, sewer, and surface waters for selected parameters and
pollutant classes. The analytical and QC procedures were specified by these
EPA laboratories and were incorporated into analytical subcontracts and
Appendix B by GCA. Radioactivity measurements were made by EMSL/LV and did
not involve GCA. However, the radioactivity procedure was supplied to GCA for
inclusion in Appendix B for the sake of completeness of the manual. The
measurement of volatile organics, semivolatile organics, pesticides, metals,
and the anions, fluoride and nitrate, was conducted according to a protocol
developed primarily by EMSL/Cincinnati, although EMSL/LV also played a role in
determining analysis and QC. Total organic carbon (TOC), total organic
halogens (TOX), pH and conductivity were performed only on ground water
samples as directed by the R. S. Kerr Laboratory, ERL/Ada. Selected samples
were analyzed for TCDD as specified by HERL/RTP.
The role of GCA in the water monitoring program was primarily to combine
the analytical and QC procedures provided by EPA into a comprehensive package
for use by the subcontracting laboratories. As necessary, GCA fielded
questions from EPA or the subcontractors and obtained answers from the
appropriate parties involved. Several examples of these problems, resolutions
and contract management are given below:
Date Notebook Entry
8/13/80 Many of the subcontracts pertaining to water analysis were sent
out to the appropriate analytical laboratories. The procedures
and internal QC protocol were those specified by EMSL in
several documents (memoranda, draft reports) given to GCA.
8/18/80 K. McGregor of GCA met with EMSL scientists regarding methods
clarifications. Surrogate compounds, internal standards and
LCS compounds were specified, and comments on the methods
documents that were in the subcontracts were taken for
incorporation into final protocols.
8/21/80 K. McGregor met with M. Kozik and B. Myatt of GCA to instruct
them to make these changes. Of significance was the change in
analysis of selenium and arsenic. The original methods called
for selenium by ICP and arsenic by flameless AAS. The final
methods reversed these two elemental analyses. In addition,
the analysis of soil/sediment by ICP was no longer permitted;
only AAS could be used for soil/sediment samples.
9/2/80 The updated methods packages were distributed to analytical
subcontractors, as appropriate.
Date Notebook Entry
B. Myatt was requested by R. M. Ellersick to call ERL/Ada for
the analytical and QC procedures for TOC and TOX.
8/5/80 B. Myatt had telecon with Craig Shew of ERL/Ada. TOC was to be
performed by the standard EPA Method from EPA-600/4-79-020.
TOX was defined as titratable chloride, similar to TOC. Shew
believed that the subcontractor, AES, was using a Dohrmann
apparatus and suggested that AES be contacted directly for
their procedure.
8/5/80 B. Myatt contacted AES to request a copy of their TOX
procedure. The document was received 8/10/80.
8/26/80 R. M. Bradway and B. Myatt discussed which samples required TOC
and TOX analyses and Optional Forms 60 were requested from AES
prior to preparation of the subcontracts.
8/29/80 The analytical subcontracts for TOC and TOX were transmitted to
AES.
10/15/80 R. M. Ellersick of GCA questioned the standardization procedure
AES was using for the TOX analyzer after discussion with
G. Smith of ERL/Ada. She requested that further information on
recommended procedures be obtained from Dohrmann.
10/17/80 B. Myatt spoke with John Whitechurch of Dohrmann who judged
that the AES procedures seemed to be acceptable. This
information, along with Dohrmann's prescribed standardization
and analytical procedures, was transmitted to R. M. Ellersick.
10/21/80 R. M. Ellersick prepared a memo reporting the follow-up to the
original QA audit of the TOX Subcontractor. Four points of TOX
standardization were listed. Three had been followed by the
subcontractor since the beginning of the program. The fourth,
which consisted of correcting data for percent recovery by the
analyzer, would be followed from this date on. GCA would
correct data already submitted.
2/25/81 Corrective Action Request Form No. LC-1-619-026-024 by R. M.
Ellersick details events surrounding AES correction of data
submitted prior to 10/21/80. Corrections were in error and
considerable communication was required with the subcontractor,
Dohrmann and GCA before final data were obtained and verified.
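The entries of 10/21/80 and 2/25/81 above refer to correcting reported TOX values for the percent recovery exhibited by the analyzer. The entries do not spell out the arithmetic; the short sketch below shows the conventional form of such a correction, in which the reported value is divided by the fractional recovery. The function name and example values are illustrative only and are not taken from the study records.

    def correct_for_recovery(reported_value, percent_recovery):
        # Divide the reported value by the fractional recovery
        # (e.g., 85 percent recovery -> divide by 0.85).
        return reported_value / (percent_recovery / 100.0)

    # Example: a result of 120 ug/L from an analyzer exhibiting 85 percent
    # recovery would be corrected to approximately 141 ug/L.
    print(correct_for_recovery(120.0, 85.0))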
Date Notebook Entry
9/9/80 B. Myatt received telecon from K. Hausknecht of ERGO regarding
water samples received for inorganic analyses. The internal QC
program required that the first 10 samples of each type of
water be spiked with known amounts of the elements of interest
to determine if interferences were present. The first batch of
samples received by ERGO did not have any identification of
water type (i.e., sump, sewer, drinking), thereby delaying the
start of the analytical program.
R. M. "Ellersick was notified and informed the Sample Bank Co
note water type on sample tags from this point on and to call
the laboratories which had received samples to give them sample
type information on those samples already received.
GCA's role in the water monitoring program was again primarily to
implement the analytical and QC protocols specified by several EPA
laboratories. This often required numerous discussions to clarify details of
analytical methods and control limits for internal QC. Data reports were sent
directly from the analytical subcontractors to the appropriate EPA
laboratories concurrently with subcontractor submission of data to GCA. GCA
acted as go-between for EPA and subcontractor questions and comments but did
not review data for quality.
Soil/Sediment—
Soil and sediment samples were analyzed in a program directed by
EMSL/LV. Compounds and elements measured by analytical subcontractors
included volatile, semivolatile, and pesticide organics, metals and TCDD.
Prior to the performance of the complete Love Canal Study, a methods
evaluation program was conducted for semivolatile organics. GCA was
responsible for setting up the analytical and internal QC protocols for all
methods in the evaluation based on EPA requirements. Subcontracts were then
written so that two laboratories (Acurex and SWRI) were involved in the
evaluation and an additional three (PJB, GSRI and Compuchem) were
subcontracted to perform the analyses by the method chosen. GCA managed both
phases of this program. All decisions regarding the method of choice and the
quality of analytical data were ultimately made by EMSL/LV in concert with
GCA. As described for other media, the GCA role was one of coordination
between EPA and the analytical subcontractors. An example of GCA's
involvement in this program area is given below:
Date Notebook Entry
10/7/80 GCA received a call from D. Sauter of EMSL/LV requesting that
data from one of the analytical subcontractors be evaluated for
quality since results for several samples were suspiciously low
in organic compounds. One of GCA's senior organic chemists
(G. T. Hunt) was requested to pursue this matter.
10/9/80 G. Hunt submitted memo to R. Bradway and K. McGregor
summarizing Sauter's comments on the data and his own review of
the data package. He also documented a telecon to the
subcontractor laboratory and his recommendations for further
investigation.
10/10/80 R. M. Ellersick memo to R. Bradway and K. McGregor summarized
actions to be taken regarding the data from this laboratory.
These included obtaining surrogate recovery data from other
analytical subcontractors and possibly scheduling a site visit
to the laboratory in question.
10/16/80 A meeting was held at EMSL/LV and was attended by K. McGregor
and G. Hunt of GCA and several EMSL/LV staff members involved
in the Love Canal Study. Its purpose was to determine what
actions should be taken regarding the problem laboratory.
Foremost was to perform an onsite audit of the laboratory to
establish whether the data in question were, in fact,
unacceptable. In addition, the timely redistribution of this
laboratory's sample load (should it be necessary) and treatment
of data thus far submitted were discussed.
10/17/80 K. McGregor, G. Hunt and D. Sauter of EMSL/LV traveled to the
suspect laboratory to discuss internal QC practices and
problems they may have been having.
10/21/80 G. Hunt memo to K. McGregor and R. Bradway summarizes the site
visit, its conclusions, and actions to be taken thereon. The
volatiles and pesticides data were judged acceptable. The
semivolatile results were unacceptable and the subcontractor
was requested to send one half of each semivolatile extract to
GCA for further disposition. After demonstrating satisfactory
performance on QC samples, the subcontractor was to rerun each
of the 75 sample extracts thus far analyzed. No further
samples would be sent to this laboratory until QC problems were
rectified.
2/5/81 Corrective Action Request Form No. LC-1-619-026-020 by R. M.
Ellersick documents this complete series of events. Extracts
sent to GCA were to be retained and, if necessary, analyzed by
GCA. The subcontractor was successful in providing acceptable
QC data and reran the original 75 extracts, thereby eliminating
the need for GCA to analyze these samples. No additional
samples were sent to the subject laboratory because the program
was completed by the time acceptable QC was proved.
In addition to the analysis of the volatile, semivolatile, and pesticide
organics, and metals monitored by EMSL/LV, TCDD and radioactivity were
measured in selected soil/sediment samples. As with the water medium, the
TCDD measurements were performed by an analytical subcontractor in a program
monitored by HERL/RTP. Similarly, radioactivity analyses were performed by
EMSL/LV and the procedures submitted to GCA for inclusion in the procedures
manual, Appendix B. GCA managed the TCDD subcontract but did not participate
in data evaluation.
Biota—
The analysis of biological materials in the Love Canal Study was
primarily the result of a program conducted under the direction of EMSL/LV.
As in other media, GCA set up the analytical and internal QC protocols for
subcontracts under the direction of the EPA laboratory. GCA then served as
manager of the subcontracts, fielding EPA and laboratory questions on analysis
and data reporting. During the course of the program, it was noted that
isophorone was the major organic component observed in the biota samples.
Because acetone was used in the procedure, GCA conducted an experimental
program to determine the extent of artifactual formation of acetone
derivatives during biota sample preparation. This study is described more
fully below.
The original RFP from GCA to potential analytical subcontractors included
the analysis of semivolatile and pesticide organics in animal tissue. Two
methods were being considered and the decision on which to use was made by EPA
prior to the preparation of subcontract documents. Internal QC procedures
applied to these organic analyses consisted of general items such as GC/MS
tuning and calibration, method blanks and replicates and additional activities
specified by EMSL/LV (i.e., reagent purity verification and extraction
efficiency testing). Additional QC data were generated by a study conducted
prior to the start of field sample analysis. MRI, the subcontractor chosen
for the animal tissue analyses, was required to procure mice, crayfish, and
earthworms from a local (Missouri) source and to use these animal tissues in a
study of blank levels of the hit list compounds and recovery of selected
compounds spiked into the tissues. The recovery program consisted of eight
hit list compounds spiked into nine samples (three each of mice, worms,
crayfish) at two concentration levels with analyses performed by the
subcontract extraction method and GC/Hall. Data from the recovery study were
used to set analytical control limit specifications.
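The report does not state the statistic used to derive the control limit specifications from the recovery study; a common convention is to set the limits at the mean percent recovery plus or minus three standard deviations. The sketch below illustrates that approach with invented recovery values and should be read as an example only.

    import statistics

    # Invented percent recoveries; the actual study spiked eight hit list
    # compounds into nine tissue samples at two concentration levels.
    recoveries = [78.0, 85.0, 92.0, 81.0, 88.0, 76.0, 90.0, 84.0, 79.0]

    mean = statistics.mean(recoveries)
    sd = statistics.stdev(recoveries)
    print("control limits: %.1f to %.1f percent recovery"
          % (mean - 3.0 * sd, mean + 3.0 * sd))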
The measurement of volatile organics in foodstuffs and metals in hair and
vegetation samples was not included in the GCA RFP to analytical
subcontractors. In the course of the Love Canal Study, these methods were
finalized by EPA, the number of samples determined, and the details of
internal QC set through numerous conversations with EMSL/LV. Prior to these
subcontracts being let, GCA was instrumental in identifying omissions or
inconsistencies in the internal QC requirements. One such situation is
described in the example below.
Date Notebook Entry
10/6/80 B. Myatt, D. Chase and G. Hunt obtained information from GCA
Contracts Office regarding the numbers of samples for
inorganics in hair and vegetation and volatile organics in
foodstuffs. Analytical procedures were verified through
EMSL/LV and bids were solicited from available subcontractors.
10/10/80 B. Myatt received quote from one subcontractor and also spoke
with Dr. S. Black, EMSL/LV, about the number of hair and
vegetation samples currently planned for inorganic analyses and
what internal QC would be required. The numbers of method
blanks, spiked samples and LCS to be run with each batch of
samples for inorganics were specified. In addition, it was
determined that external QA would be conducted by splitting the
extracts of several samples and sending one half of each to an
EPA laboratory for concurrent analysis.
10/15/80 The analytical subcontract for inorganics in hair and
vegetation was let based on the number of samples specified by
Dr. S. Black.
10/22/80 The internal QC procedures for volatile organics in foodstuffs
were discussed by B. Myatt and S. Black. Resolution was
reached on surrogates, laboratory control standards and control
limits. Internal standards were not applicable to the
prescribed analytical procedure. A subcontract was issued for
these analyses using the analytical and QC procedures
determined this date.
ADDITIONAL RELATED ACTIVITIES
Artifact Formation of Acetone Derivatives
In the spring of 1981, GCA conducted a study initiated by direction from
the EPA to determine the extent of artifactual production of acetone
derivatives during Soxhlet extraction of environmental samples. The results
of this study included both theoretical and experimental data concerning the
formation of acetone derivatives as well as data obtained from the examination
of GC/MS results for the Love Canal biota samples for the presence of possibly
artifactual acetone derivatives.
Initially, data from two biota samples analyzed by capillary GC/MS and
found to contain significant levels of isophorone were examined for the
presence of acetone derivatives and for verification of the presence of
isophorone. The samples selected were B30039 (crayfish) and B30078 (mouse).
Subsequently, several experiments were performed simulating the
conditions found in a Soxhlet extractor and measuring the production of
isophorone. Several conclusions were made from the study, including:
• Isophorone is formed in the Soxhlet extraction of both acidic and
basic purified sand in significant amounts.
• The mass spectral data from the biota samples indicated the presence
of acetone derivatives such as mesityl oxide, diacetone alcohol,
phorone and isophorone.
Most importantly, the very real possibility of isophorone production in the
Soxhlet extraction of the biota was raised. A summary report entitled
Investigation of the Artifactual Formation of Acetone Derivatives in Biota
Samples from the Love Canal Study was submitted by GCA to EMSL-Las Vegas in
May 1981.
Evaluation of GC/MS Data
In accordance with a request from the Love Canal Study Project Officer,
GCA/Technology Division completed a limited study of mass spectral data from
each of the laboratories performing organic analysis for the Love Canal
Study. Since time and cost precluded extensive examination of all mass
spectral data supplied, two analyses were selected from each laboratory for
verification. Selection criteria for data to be verified were arbitrary but
were influenced by several considerations, including:
• Levels of pollutants reported
• Number of compounds reported
• Toxicity of compounds reported
Using these criteria, two samples from each of the laboratories were
chosen for examination. Included in this list were analyses representative of
all sample types collected in the study.
The data were examined primarily to verify compound identifications.
Quantitation of the substances found would have required data inputs not
immediately available. The data required include such factors as:
• Amount of sample extracted
• Concentration volume of sample
• Internal standard spike level
• Instrument response factors
• Surrogate spiking levels
While these data can be made available through interpretation of
laboratory notebooks, time constraints prohibited this process.
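For reference, the way these factors would enter a quantitation can be sketched as below, using a typical internal-standard calculation; the report does not give the specific equations used by the laboratories, and the function name and values shown are invented for illustration.

    def quantitate(analyte_area, is_area, is_amount_ng, response_factor,
                   sample_amount_g):
        # Typical internal-standard quantitation:
        #   analyte amount = (analyte area / IS area) x IS amount / response factor
        # normalized to the amount of sample extracted (ng per gram).
        analyte_ng = (analyte_area / is_area) * is_amount_ng / response_factor
        return analyte_ng / sample_amount_g

    print(quantitate(analyte_area=5.0e5, is_area=2.0e5, is_amount_ng=100.0,
                     response_factor=1.2, sample_amount_g=10.0))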
Upon completion of this task, a summary report, Verification of Selected
Mass Spectral Data From the Love Canal Study, was submitted to the Office of
Research and Development, U.S. EPA, Washington, D.C. in May 1981.
Subsequent to this report, in June 1981, GCA was requested by EPA to copy
a number of GC/MS data files on magnetic tape which had been submitted to GCA
by the various analytical subcontractors. Approximately 150 sample files were
requested with about 75 percent of these being soil and sediment samples.
Affiliated blanks, standards and calibration runs brought to 300 the total
number of files to be transmitted to EPA. These files were to be copied and
sent to EPA in the following manner:
- All data on water samples were sent to EMSL-Cincinnati.
- All data on soil and sediment samples were sent to EMSL-Las Vegas.
GCA's role was complicated by the late arrival of tapes from some
subcontractors, but eventually all GC/MS data due under the program were
received on magnetic tape. Other problems included the inability of GCA to
copy files on volatile organic analyses from Gulf South Research Institute due
to format incompatibilities, lost data due to magnetic tape anomalies, and
various computer difficulties. GCA supplied all taped information possible,
as well as data on site locations, strata and analytical procedures.
Transmittal of GC-ECD Data (Chromatograms) to EPA
In early August 1981, GCA was informed that EPA was to screen all samples
analyzed by Method 608-S (soils and sediments) for pesticides and PCBs in an
attempt to determine the possibility of tetrachlorodibenzodioxin (TCDD)
contamination. GCA was asked to submit all chromatograms of samples analyzed
by this method with associated standard chromatograms. A total of 278 samples
were analyzed via Method 608-S and all available chromatograms, with data
report forms, were sent to EMSL-Las Vegas. GCA also checked into some
problems and questions which EPA had discovered, such as the use of internal
standards for identification and the lack of a DDT standard in PJB
Laboratories standard solution.
In early December 1981, GCA received from EMSL/LV a list of 49 samples
for which no chromatographic tracings had been submitted to EPA under the
audit. These are accounted for in the following manner:
Chromatograms sent to EMSL-LV but listed as missing 8
Radiation sample 1
Analyzed by EMSL-LV (no chromatogram received by GCA) 1
Semivolatile sample (no GC/ECD analysis) 1
Nonvalid sample number 2
Chromatograms not received by GCA from the
subcontracting laboratories 36
Further checking on the 36 missing chromatograms indicated the number of
missing samples from each laboratory was:
• Acurex 25
• Compuchem 1
• GSRI 1
• SWRI 7
• PJB 2
Each laboratory with missing chromatograms was contacted and all claimed
that all documentation had already been sent to GCA. EMSL/LV was notified of
this and no further action was taken by GCA.
SECTION 7
SAMPLE BANK
The Love Canal sample bank was established at a vacant Niagara Falls
laboratory building. The Bell Aerospace/Textron-Wheatfield plant served as a
convenient base for sample inventory and shipping. It is located
approximately 1 mile northeast of the Canal area and convenient to the
sampling areas, field office and Federal Express shipping facilities. A
refrigerated truck and trailer were also used for sample bank operations. The
truck was used to hold and transport samples from the field office to the
sample bank. The trailer served as storage space for the samples at 4°C until
properly processed for shipping. Both vehicles were kept locked at all times,
accessible only to GCA personnel. An additional large chest freezer was also
kept for storing biota samples and ice for packing and shipping.
The building was under 24-hour guard and had limited access providing
adequate security. It was equipped with a ventilated hood system, storage
area, office space, and laboratory facilities, making this building a good
location for the sample bank. In addition, the building was shared by an
independent laboratory, Advanced Environmental Systems (AES), which was
contracted to handle certain needs during the sampling campaign, providing
chemicals, pipets, sample container cleaning, acid and organic waste disposal
and analysis of pH and conductivity. An industrial waste service (CECOS) was
also contracted to pick up wastes at the end of the field sampling operations.
The sample bank was established in order to receive, inventory, label,
package and distribute the thousands of samples collected over the 3-month
sampling program. Samples were delivered to the sample bank from the field
office daily. All samples were logged into a Master Log Book as shown in
Figure 7.1. The samples were assigned sample bank numbers and cross-checked
for the proper information on sample tags, analytical tags and chain-of-custody
forms.
Samples were then prepared for shipment to the appropriate laboratories
using the Department of Transportation (DOT) procedures for packing, labeling
and shipping hazardous materials specified in Title 49, Code of Federal
Regulations, Parts 171-177. Samples were shipped out in three categories:
"environmental samples" (nonhazardous), "corrosive," and "poison B."
Figure 7.1. Master Log Book.
Most soil, surface water, drinking water, groundwater and air samples fell
into the environmental sample category. These samples were packaged and
shipped to prevent breakage or leakage according to the National Enforcement
Investigations Center (NEIC) recommendations. The samples were doublebagged
and packed in metal surfaced Coleman coolers with "blue ice" packets to keep
the samples at 4°C. Regular ice was used if "blue ice" was not available.
The cooler drainage holes were taped shut in case a sample leaked into the
melting ice.
Some groundwater samples were found to have a pH greater than 12. Under
DOT classifications these samples are deemed "corrosive" and must be handled,
packaged and shipped with special precautions. These groundwater samples were
pH adjusted and shipped as environmental samples for all analyses except the
semivolatiles and volatiles. These were packaged and shipped under DOT
regulations for "corrosive" materials as follows. The doublebagged samples
were placed in 1-gallon metal containers which in turn were packed in Coleman
coolers with ice. The samples could be handled under the "limited quantities"
DOT regulations because they were unanalyzed hazardous waste site samples.
The total sample content of each cooler could not exceed 1 quart because of
this restriction. The following labels were attached in compliance with the
DOT regulations:
• "CORROSIVE"
• "CORROSIVE LIQUID - NOS"
• "LIMITED QUANTITY"
• "INSIDE CONTENTS COMPLY WITH DOT REGULATIONS"
In addition, a shipper's certification for hazardous materials was
required to be completed and attached to the shipping container.
All sump water, sewer water, sediment samples and selected soil, surface
water and ground water samples which were most likely to contain TCDD, were
packaged and shipped as "POISON B" samples. These samples also fit the
"limited quantities" category. These samples were restricted to 1 quart or
less per gallon container. Each sample was doublebagged, as the others, with
the analytical sample tag between the bags. These samples were then packed in
1 gallon metal paint cans with sufficient packing material to prevent
breakage. The metal cans were closed and taped shut. Each can was labeled:
• "POISON"
• "DANGER-PELIGRO"
• "FRAGILE-THIS SIDE UP"
These cans were placed in metal surfaced Coleman coolers with sufficient
ice to keep the samples at 4°C during shipping and ethafoam to prevent
shifting. The maximum allowable volume of "poison B" samples per cooler was 1
gallon. The coolers were labeled as follows:
• "POISON LIQUID-NOS"
• "LIMITED QUANTITIES"
• "INSIDE CONTENTS COMPLY WITH DOT REGULATIONS"
These samples were restricted to cargo-only aircraft and could only be sent
by Federal Express on Monday, Wednesday and Friday. These samples also
required a shipper's certification form.
Biota samples were shipped and packed in dry ice to keep them frozen
during shipping. If 5 pounds of dry ice or less was used, there were no
restrictions on shipping other than a label indicating the amount of dry ice
used. If more than 5 pounds was used, a shipper's certification form had to
be completed. All samples were shipped following strict chain-of-custody
protocol.
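The shipping decisions described above can be restated compactly in code form. The sketch below is only a restatement of the text for illustration; the medium names, field names, and the simplified treatment of pH-adjusted samples are assumptions, not procedures taken from the study.

    def shipping_category(medium, ph, analysis, may_contain_tcdd):
        # Sump water, sewer water, sediment, and selected samples most likely
        # to contain TCDD were shipped as "poison B".
        if may_contain_tcdd or medium in ("sump water", "sewer water", "sediment"):
            return "poison B"
        # Ground water with pH above 12 destined for volatile or semivolatile
        # analysis could not be pH adjusted and was shipped as "corrosive";
        # for other analyses it was pH adjusted and shipped as an
        # environmental sample.
        if medium == "ground water" and ph > 12 and analysis in ("volatiles", "semivolatiles"):
            return "corrosive"
        return "environmental sample"

    print(shipping_category("ground water", 12.6, "volatiles", False))   # corrosive
    print(shipping_category("soil", 7.1, "metals", False))               # environmental sample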
The sample bank also served as a supplier for the sampling crew. It
provided sample tags, custody forms, log books, evidence tape, locks, coolers,
and clean sampling containers. Sample bank personnel also aided by
precleaning and checking Tenax and polyurethane foam plugs and preweighing
hi-vol filters for the air sampling campaigns.
Quality Assurance samples were also processed through the sample bank to
the appropriate laboratories. The true values for the quality assurance
samples were stored under lock and key at the sample bank or the EPA
laboratory.
All sample data from the program were transferred from the sample bank to
the GCA computer. Sample bank numbers, sample tag numbers, and site ID
numbers were verified for every sample entered into the computer. Inventory
was done to cross-check all samples, labels, custody forms and log books. Ten
percent (10%) of the data was cross-verified for quality control purposes.
SECTION 8
DATA MANAGEMENT
DATA HANDLING SYSTEM
The Love Canal software system was designed with the capacity to handle
and store over 330,000 data records in three separate files. These were:
• Raw data
• Verified data
• Validated data
This section provides an overall description of the data system,
including the inputs, the software, and the outputs. Details on each
component in the system are provided in subsequent sections of this report.
The Data Flow System
Figure 8.1 presents a flow diagram of the data handling system which was
developed for the study. The key features of this system include:
• Input of all data to the system via computer cards;
• Editors for all sample data, analysis data, and analysis
verification action data cards entering the system;
• Maintenance of the raw data in multi-record card-image format;
• Storage of all disk files in the PANVALET file management system
for security and ease of maintenance;
• Conversion of a completed sample to a single record at the
verification level;
• Capability to produce data reports summarized by any field on the
sample card;
Figure 8.1. Flow diagram of the data handling system.
• Generation of raw data tapes, verified data tapes, and a tape of
validated data for use by EPA;
• Analyses of internal QC, external QC and duplicate/triplicate QC
data.
Each significant processing step noted in Figure 8.1 is assigned a
number. A brief description of each of these steps is given below.
1. The Sample Data Reporting Form was completed by GCA Sample Bank
personnel and sent to GCA, Bedford, for keypunching and
verification. It included information such as sample ID, location,
date, time, source, instrument readings, and tag numbers.
2. The Coordinate Cross-Reference file (COORDS) was an on-line table of
the state planar coordinates associated with each site ID in each
stratum. These coordinates were inserted into the sample data
record at the time of sample data editing. (This processing step
was added to the system mid-way through the study.)
3. The Sample Data Edit (SAMEDIT) program checked all sample cards
entering the system to ensure that mandatory fields were correctly
supplied. (See Appendix D-l for a list of checks made.) Any errors
or omissions were flagged for review by GCA Data Management. The
Julian date on which data were entered into the system was added to
each record at this step.
4. The temporary sample file (SAMCARDS) was used to accumulate the
edited sample data records prior to the periodic (more or less
weekly) run which merged these data into the master raw data file.
5. The Analysis Data Reporting Forms were coded by the analytical
laboratories (or in some instances by GCA Data Management), and were
sent to GCA, Bedford, for keypunching and verification. They
included concentrations of pollutants for field samples and QC
samples, as well as analysis method, analysis date, and sample size.
6. The Pollutant Cross-Reference file (POLL XREF) contained information
for each pollutant identified by EPA for study. It was used to
check the validity of pollutant codes on analysis data forms, to
label computer output, and to flag data entries with concentrations
exceeding "alert levels" stored in the POLL XREF file.
7. The Analysis Data Edit (ALEDIT) program checked all analysis data
cards to ensure that mandatory fields were correctly supplied. (See
Appendices D-2 and D-3 for a list of edit checks.) The POLL XREF
file was used to check that the pollutant codes on analysis data
cards were valid. Errors or omissions were flagged for review by
GCA. The Julian date on which data were entered into the system was
added to each record at this step.
8. The temporary analysis data file (ALCARDS) was used to accumulate
the edited analysis data records prior to the run which merged these
data into the master raw data file.
9. The temporary analysis reject file (ALLCARDS.BAD) was used to set
aside analysis data for which sample data were not yet present in
the raw master file. Each time ALEDIT was run this file was updated.
10. The Raw Data sort/merge program (RAW1) sorted the edited sample data
and analysis data and merged them into the raw data master file.
The RAW1 program also generated a tape of raw data for EPA, and it
added all new sample data to the Temporary Verification file
(TEMPVER).
11. Each time the RAW1 program was run, a new tape of raw data was
generated for EPA. One file contained all sample data added to the
system since the previous RAW1 run, and a second file contained all
new analytical data. The card-image format was maintained in these
tapes.
12. The Raw Data file (RAWMASTER) contained all edited sample and
analytical data, sorted by Sample ID, card type, and sequence
number. This cumulative file preserved the card format in which the
data were originally coded. The file was maintained by using the
PANVALET data handling system.
13. The Temporary Verification (TEMPVER) file contained verified data in
card-image format. Sample data were added to this file by the RAW1
program and were verified by the Sample Data Verification Program
(description 14). Verified analysis data were added to TEMPVER by
the VERAP program (description 26). The data in this file were
deleted as the verified data file was updated with these same data.
14. The Sample Data Verification Program used the PANVALET update
process to correct the sample data in the Temporary Verification
File (TEMPVER). Corrections were supplied by the GCA Sample Bank.
15. The RAWSUM program generated three reports (descriptions 16, 17, and
18 below) to allow review of the raw data. This review stage
enabled laboratories to check their results as they were added to
the system and to correct any errors in coding, interpretation, or
transcription.
16. The ID Number Cross-Reference report listed all cross-reference
numbers supplied on the sample cards, including the Sample ID, tag,
and chain-of-custody data. The report included lists sorted by each
of these numbers. The ID cross-reference information was also kept
in a disk file (ID XREF) for checking the ID numbers of all new data
entries. The Sample ID served as the basis for cataloging all the
cross-reference numbers.
17. The Alert Report was a special listing in which all observations
above a pre-determined level were printed so that special attention
could be directed to the sampling site by GCA and EPA. At the raw
data stage, the alert report was also used by Data Management to
help in identifying outliers resulting from transcription or
keypunch errors.
18. The Raw Data Reports listed sample data and analysis data for all
samples for which analysis data had been entered into the system
since the last listing. One part of this report listed all new
data, sorted by Sample ID; this report was distributed to EPA and
GCA. The second part of the report listed all analysis data and
selected sample data, sorted by analytical laboratory; these reports
were sent to the appropriate analytical laboratories for
verification.
19. The Love Canal QC1 (LCQC1) program analyzed the recovery rates for
internal and external QC samples, while the Love Canal QC2 (LCQC2)
program compared results for laboratory-generated duplicates and
their mates.
20. Reports generated by the LCQC1 and LCQC2 programs gave recovery
rates by medium, analysis method, and laboratory, and listed pairs
of results for laboratory-generated duplicates.
21. The eight KM programs were designed to sort the sample data on any
field and to list selected data for review by GCA and EPA.
22. Different KM reports were used for different purposes. For example,
listings sorted by site ID were used to monitor sample collection,
and lists sorted by laboratory were used to monitor laboratory
workload and response. Lists by sample tag number were used by the
GCA Sample Bank to verify the sample data.
23. The Verification Action Reporting Forms were completed by GCA Data
Management in response to the verification of the raw data listings
by the analytical laboratories. Any changes to the raw analysis
data were specified on these forms.
24. The Analysis Data Verification Edit (VERED) program checked all
codes and functions included on the verification forms to ensure
accuracy. At this stage, the legal functions were "Add," "Delete,"
and "Change." (See Appendix D-4 for a complete list of edit checks.)
25. The Temporary Verification Action file accumulated the analysis data
changes for all the samples being verified in the current run. The
contents of this file were deleted each time the VERAP (description
26) program was run.
26. The Verification Application (VERAP) program selected from the raw
data file the analysis data for all samples being verified in the
current run, applied the indicated changes to them, and added these
data to the TEMPVER file (description 13). (The raw data file was
not altered in this process.)
27. The GCAVER2 procedure and its major program, VERFILE, selected from
the TEMPVER file all samples with analysis data, and added them to
the verified data file. In doing this, the program converted all
the card-image records (both sample data and analysis data) for each
sample into a single-record format. GCAVER2 also generated a tape
of verified data for EPA, a cumulative list of all verified samples,
and three other reports on the verified data (descriptions 17, 29,
and 31).
28. Each time either the GCAVER2 procedure or the Verification
Correction Program (description 32) was run, a tape was created for
EPA which contained the verified data for all samples in the current
run. All data for each sample were stored in a two-record format
designed to meet EPA's specifications.
29. The delinquency report, generated by the VERFILE program, listed all
samples for which analysis data were overdue. Identification of
these samples was based on the date on which the sample data were
entered into the system. Any sample remaining in the TEMPVER file
for over 30 days, without accompanying analysis data being supplied,
was included in the report.
30. The verified data file (VERIFY3) was created and updated by the
GCAVER2 procedure. It was a cumulative file of all verified data,
with the data for each sample being stored in a single
variable-length record.
31. The Verified Data Listings were similar in format to the Raw Data
Listings. One part of the verified data report was sorted by Sample
ID, and the second part was sorted by sampling medium. Both parts
of the report were sent to EPA for use in data validation. Included
in these reports were all samples included in the current run of the
GCAVER2 or Verification Correction programs.
32. The Verification Correction program used PANVALET update procedures
to make corrections to the verified data file. This step was added
to the data handling system when it became evident that changes and
corrections were being submitted to, or identified by, GCA Data
Management after samples had already been verified. Either sample
data or analysis data could be corrected by this procedure.
33. Invalidation instructions from EPA consisted of lists of samples or
types of samples which were to be wholly or partially invalidated.
Partial invalidation was typically based on laboratory, medium,
analysis date, analysis method, and pollutant.
34. The Validation programs were developed to apply EPA's invalidation
instructions. Partial invalidations were accomplished through
PANVALET updates and through programmed identifications of
pollutants. Samples which were wholly invalid were identified by
Sample ID number.
35. The Validated data file was in the same format as the verified data
file. It contained all data in the verified data base which had not
been invalidated by EPA.
36. The Validated data tape for EPA was in the same format as EPA's tape
of verified data. The tape contained all data in the verified data
base which had not been invalidated by EPA.
37. The QC Cross-Reference (QC XREF) Reporting Forms, which contained
the ID numbers of sets of duplicate and triplicate samples, were
submitted by the Sample Bank as a means of identifying duplicate and
triplicate samples generated by the Bank.
38. The QC Duplicate/Triplicate Analysis programs compared the analysis
results obtained for each set of samples listed on the QC
Cross-Reference Form.
39. The report on the QC Duplicate/Triplicate Analysis listed the
results of the comparison of data on each set of samples on a
pollutant-by-pollutant basis.
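Several of the checks and comparisons described in the numbered steps above lend themselves to short illustrations. The sketches below are not reproductions of the actual GCA programs; they are simplified Python restatements of the logic described in the steps noted, using invented field names, pollutant codes, and values.

The first sketch restates the edit checks of steps 3 and 7 and the alert-level flagging of steps 6 and 17: mandatory fields are checked, pollutant codes are validated against a cross-reference table, and concentrations above an alert level are flagged.

    # Assumed mandatory fields and POLL XREF contents; the actual edit lists
    # appear in Appendices D-1 through D-3.
    MANDATORY_FIELDS = ("sample_id", "medium", "site_id", "date_collected")
    POLL_XREF = {"P001": ("benzene", 50.0), "P002": ("toluene", 100.0)}

    def edit_record(record):
        messages = ["missing mandatory field: " + name
                    for name in MANDATORY_FIELDS if not record.get(name)]
        code = record.get("pollutant_code")
        if code is not None:
            if code not in POLL_XREF:
                messages.append("invalid pollutant code: " + code)
            else:
                name, alert_level = POLL_XREF[code]
                if record.get("concentration", 0.0) > alert_level:
                    messages.append("alert: %s exceeds %.1f" % (name, alert_level))
        return messages

    print(edit_record({"sample_id": "W10023", "medium": "water", "site_id": "",
                       "date_collected": "801015", "pollutant_code": "P001",
                       "concentration": 72.0}))

The second sketch restates the QC summaries of steps 19-20 and 38-39: percent recoveries (observed/expected x 100) are grouped by medium, analysis method, and laboratory, and duplicate results are compared, here by relative percent difference (the report does not state which comparison statistic the programs actually used).

    from collections import defaultdict

    qc_records = [
        # (medium, method, lab, observed, expected) -- invented values
        ("water", "624", "LAB-A", 9.2, 10.0),
        ("water", "624", "LAB-A", 10.5, 10.0),
        ("soil", "625", "LAB-B", 7.8, 10.0),
    ]
    groups = defaultdict(list)
    for medium, method, lab, observed, expected in qc_records:
        groups[(medium, method, lab)].append(100.0 * observed / expected)
    for key, recoveries in groups.items():
        print(key, "mean recovery = %.1f%%" % (sum(recoveries) / len(recoveries)))

    def relative_percent_difference(a, b):
        return abs(a - b) / ((a + b) / 2.0) * 100.0

    print("duplicate RPD = %.1f%%" % relative_percent_difference(12.0, 14.0))

The third sketch restates the 30-day delinquency rule of step 29: any sample whose data entered the system more than 30 days earlier and which still has no analysis data is listed.

    import datetime

    tempver = [
        # (sample ID, date entered, analysis data present?) -- invented
        ("W10023", datetime.date(1980, 10, 5), False),
        ("S20011", datetime.date(1980, 11, 20), True),
    ]

    def delinquency_report(records, as_of):
        cutoff = as_of - datetime.timedelta(days=30)
        return [sid for sid, entered, has_analysis in records
                if not has_analysis and entered < cutoff]

    print(delinquency_report(tempver, datetime.date(1980, 12, 15)))   # ['W10023']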
Data Flow Timetable .
Table 8.1 summarizes the timetable for the flow of data through the raw
data processing system for a typical sample. Elapsed time includes the time
necessary for transmitting data from one activity site to another, as well as
for producing or processing data at each site.
While many samples followed this timetable fairly closely, the flow of
other samples deviated from it substantially. The major variations occurred
in the time elapsed between sampling and the completion of the Sample Form.
This affected the overall timetable only when the delay was 4 weeks or more
(such delays were common for air samples, in particular). Further delays
occurred when analysis results were not submitted by the laboratories within
the specified time period.
GCA Data Management initially expected to be sending out verified data
listings approximately 17 days after the raw data listings for the
corresponding samples had been transmitted, or 59 days after sampling. This
allowed 7 days for the laboratories to verify and transmit the raw data
listings, 7 days for GCA to code and keypunch the Verification Action Forms
and enter them into the system, and 3 days for GCA to create and transmit the
Verified Data Listings and Alert Reports. In fact, the first Verified
Listings were not sent out until December 20, which was 11 weeks after the
first set of raw data reports was sent out. Several factors contributed to
this delay, including the fact that coordinates and other sample data were not
verified until after the sampling program was completed and the Sample Bank
had returned to GCA. The major factor, however, was the staggered submission
of results by analytical laboratories. Because the results for many samples
were submitted over a protracted period of time, it was hazardous to verify
results too swiftly: more results might have been on the way.
TABLE 8.1. TIMETABLE FOR RAW DATA FLOW

Days after
sampling     Activity

    2        Sample Bank completed Sample Form.

    3        Sample Forms received at GCA.

   10        Sample Forms keypunched and entered into data
             management system; ID Cross-Reference generated.

   32        Analysis data forms received at GCA.

   39        Analysis data keypunched and entered into system.

   42        Raw data Listings and Alert Reports generated and
             transmitted.
Because large volumes of samples were processed in the initial
verification runs, the elapsed time between the transmittal of raw data
reports and verified data reports was promptly cut to 4 weeks. By the end of
January, this gap was narrowed to only 1 week.
Figure 8.2 shows the actual processing schedule which was achieved. The
raw data portion of the graph is drawn for samples which were complete at the
raw data stage; that is, they had both sample data and analysis data present.
At any given time there were also incomplete raw data results in the system,
waiting to be matched with the corresponding analysis data or sample data.
Figure 8.2 reflects the rate at which raw analysis data were received
from the analytical laboratories. Results came in very slowly until
mid-November; great quantities of results were received until mid-December and
then the flow dwindled again. This bulge in receipt of raw analysis data
shows up in Figure 8.2 as a bulge in complete raw samples processed between
late November and late December. This bulge is repeated for the verified data
about 1 month later.
Figure 8.2. Actual data processing schedule.
Validation took place on four separate dates, January 28, March 17,
June 2, and November 3, 1981. In each case, all samples verified through that
date were passed through the validation programs.
The Data Management Staff
Design and implementation of the data management system was accomplished
by a staff of seven people at Bedford. While the staff worked as a team, and
each member assisted with whatever tasks were currently pressing, each staff
member also was responsible for one of the following roles:
• Data Manager
• Programmer
• Raw Data Processor
• Verification Coordinator
• Verification Data Processor
• Forms and Reports Clerk
• Keypuncher.
In addition, Data Management coordinated the coding of sample data at the
Sample Bank, which was conducted by one full-time and two part-time coders,
and which was supervised onsite by a Sample Bank member. Data Management also
employed two part-time coders at Bedford to assist with sample data coding.
During January 1981, when the major verification effort took place, the Data
Management staff was further augmented by seven other people, most of whom had
worked on the Love Canal Study during the earlier design and sampling phases.
CODING FORMS AND PROCEDURE
Coding Forms
Four basic types of data coding forms were required to report the
sampling data. These included forms for:
• Sample Data—Sample Collection and Chain-of-Custody information,
reported by the GCA Sample Bank;
• QC Cross-Reference Data—Sample ID numbers of sets of duplicate and
triplicate samples, as designated by the Sample Bank;
• Analytical Data—Analytical results on field samples, external QC
samples, and laboratory internal QC samples, reported by the
analytical laboratories (several "subtypes" of analytical forms were
designed, based on the type of analysis being reported); and
• Verification Action Data—Data changes submitted by the laboratories
in response to computer listings of raw analysis data.
Because all data submitted were to be keypunched onto computer cards of
80 columns each for entry into the GCA computer, each form was divided into
rows of 80 columns. Each row was assigned a "Card Type" code, which was
preprinted on the form. Sample data required six separate card types, while
analytical data and verification action data each required five separate card
types. The master sample ID number, as well as the card type, was entered on
each row of coding, so that each computer card could be fully identified in
terms of source of data and type of data.
A set of identification codes appeared at the top left corner of each
reporting form. The parts of these codes were as follows:
LC 1-619-026 - ___ - ____ - ______ - __
   Project      Subcontractor   Project   Form      Page
   number       document no.    form      serial    number
                (appears only   number    number
                on analytical
                data forms)
The project number and project form number were preprinted on the reporting
forms. Individuals submitting the forms to GCA inserted the remaining
numbers. The subcontractor document number was assigned to the laboratories
by the GCA Quality Control Officer. Each coder also inserted a serial number
on each code sheet, with each type of code sheet used by the coder being
serialized separately over the course of the project. Because it was often
necessary to fill out more than one code sheet of a particular type for a
given sample and analysis method, a box was also provided for a page number.
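Because the project number itself contains dashes, a parser for these identification codes would most naturally peel the trailing fields off the right-hand end, as in the sketch below. The example code value, the field contents, and the function name are assumptions made for illustration and are not taken from the project records.

    def parse_form_id(code):
        # Split on dashes and take the last four fields from the right; what
        # remains on the left is the project number (e.g., "LC 1-619-026").
        parts = [p.strip() for p in code.split("-")]
        page, serial, form_no, subcontractor_doc = (
            parts[-1], parts[-2], parts[-3], parts[-4])
        project = "-".join(parts[:-4])
        return {"project": project, "subcontractor document": subcontractor_doc,
                "project form": form_no, "form serial": serial, "page": page}

    # Hypothetical identification code; the numeric values are invented.
    print(parse_form_id("LC 1-619-026 - 123 - J02 - 000045 - 01"))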
Table 8.2 lists all the data reporting forms developed for the project.
Included in the table are the subtypes of analytical data coding forms.
Each of these forms is exhibited in Appendix F. These exhibits represent
the final format of each form. While most forms were developed in August
1980, some minor revisions were made to them during the course of the
project. A few fields were added to or deleted from some analysis data
reporting forms. The date of the final version of the form is given in the
lower left corner. For forms submitted to GCA prior to the issuance of the
final forms, entries were made or deleted by GCA Data Management to bring the
coding on those forms into conformity with the final format. The form types
are discussed in more detail in the sections which follow.
Sample Data Reporting Forms—
Only one type of coding form was used to report collection data and
chain-of-custody data on samples handled by the Sample Bank. Performance
Evaluation samples and External QC samples, as well as field samples, were
reported on the "Sample Data Reporting Form" (see Appendix F).
TABLE 8.2. LOVE CANAL DATA REPORTING FORMS

Form E01 (card types S1-S6): Sample data; any medium.
Form E02 (card type Q): QC cross-reference data.
Form J01 (card type A1): Analyte concentration; any compounds not pre-coded
    on forms J02-J10; any medium.
Forms J02-J10 (card type A1): Analyte concentration, with pre-coded compound
    lists for pesticides, volatile organics, semivolatiles, pesticides and
    PCBs, inorganic metals, inorganic anions, special analyses (TOX, TOC),
    and TCDD in the air (polyurethane foam, Tenax, hi-vol), water, soil,
    sediment, and biota media.
Form J11 (card type A2): Internal QC; any compounds not pre-coded on forms
    J12-J26; any medium.
Forms J12-J26 (card type A2): Internal QC (LCS, spike, surrogate, and
    standard results) for the corresponding analyses and media.
Form J31 (card type A3): Qualitative analysis; semivolatiles by GC/MS; any
    medium.
Form J41 (card type A4): Comments; any medium.
Forms J51-J55 (card types V1, V2, V3, V4, V0): Verification action data for
    analyte concentration, internal QC, qualitative analysis, comment, and
    external QC records, respectively.
Form J60 (card type A0): External QC; any medium.
Card S2: Flow Information (Air Samples Only)
• Start Flow
• End Flow
• Average Flow
• Duration
• Total Volume
• Pump ID Number
Card S3: Custody Data
• Preparer's ID Lot (Air only)
• Chain-of-Custody Numbers:
- Preparer to Bank (Air only)
- Bank to Sampler (Air only)
- Sampler to Bank
- Bank to Lab
- Preparer to QC (Air QC only)
- QC to Bank (QC only)
• Sample Tag
• Date Sent to Laboratory
• Analysis
• Well Number (Ground water only)
Card S4; Meteorological Data (Air Samples Only)
• Time
• Relative Humidity
• Temperature
• Wind Speed
• Wind Direction
(Above four fields repeated for each observation)
• Met. Station ID
• Sequence Number of Card
Card S5: QA/QC Comment (QA/QC Only)
• Comments entered to identify blanks, duplicates, triplicates, PE
samples, and externally generated QC samples
• Sequence Number of Card
Card S6: Comments
• Any relevant comments about sample
• Planar Coordinates for surveyed sites
• Sequence Number of Card
Details on the information provided by the Sample Bank for each of the
above data items are contained in Part I of the "Love Canal Study Coding Manual
for Sample Data Reporting Forms," which is reproduced in Appendix G. The
manual also specifies which fields within each card type were mandatory under
the various sampling conditions, including QA/QC sample preparation.
QC Cross-Reference Reporting Forms—
The QC Cross-Reference Reporting Form (see Appendix F) was used by the
Sample Bank to identify sets of duplicate and triplicate field samples. These
samples, collected as part of the QA/QC program, were identified on this
reporting form by Sample ID number only. All other information about the
samples was reported on the Sample Data Reporting Form. Only one Card Type
(Q) was necessary for the QC Cross-Reference Reporting Form, since all
information regarding a set of duplicate or triplicate samples was entered on
one row on the coding form. The information coded included:
• Two sample IDs for a pair of duplicates or three sample IDs for a
set of triplicates;
• Medium;
• Comments (optional).
Part II of the "Love Canal Study Coding Manual for Sample Data" (see
Appendix G) contains information on the preparation of the QC Cross-Reference
Reporting Form.
Analysis Data Reporting Forms—
GCA Data Management developed the Analysis Data Reporting Forms in close
cooperation with those GCA staff members who were responsible for specifying
the analytical procedures for the Love Canal Study. The analysis data were
reported on five separate types of forms: "Analyte Concentration" (Card Type
Al); "Internal QC" (Card Type A2); "Qualitative Analysis" (Card Type A3);
"Comments" (Card Type A4); and "External QC" (Card Type AO).
To facilitate coding, separate Analyte Concentration Forms (Al) and
Internal QC Forms (A2) were developed for specific analysis methods, where
possible. In these cases, the compounds to be analyzed were listed to the
left of the coding rows, and their respective pollutant codes were preprinted
on the forms. Nine specific Analyte Concentration Forms (J02-J10), and fifteen
specific Internal QC Forms (J12-J26) were developed. In some cases, the list
of preprinted compounds which might be identified during an analysis of a sample
was quite long, and several coding sheets were needed for the complete lists.
This was true for forms J03 through J06.
For each sample, columns 1-6 provided a unique identification. This
information, together with the information in columns 7-32 (sample medium,
analysis method, lab ID, analysis date, analysis time, and sample size), was
entered once on each coding form for the sample and analysis method. Thus,
each of the five types of coding forms (card types AO, Al, A2, A3, and A4)
contained the same information in columns 1-32 for a particular sample and
analysis method. When the data were keypunched, the information in columns
1-32 was punched on all cards for that sample and method.
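The fixed-column card layout lends itself to simple positional slicing. The sketch below carves an 80-column card image into the column ranges described above; the precise boundaries of the sub-fields within columns 7-32 are not given here, so only the stated ranges are used, and the card image shown is invented rather than copied from an actual form.

    def parse_card(card_image):
        # Columns are 1-indexed in the coding manual; Python slices are 0-indexed.
        card = card_image.ljust(80)
        return {
            "sample_id": card[0:6].strip(),   # columns 1-6: unique sample identification
            "common": card[6:32].strip(),     # columns 7-32: medium, analysis method,
                                              #   lab ID, analysis date/time, sample size
            "body": card[32:80].rstrip(),     # columns 33-80: card-type specific fields
        }

    print(parse_card("W10023WA 624 LAB-A 801105 1300  P001 0012.0"))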
The following information was entered on Analysis Data Reporting Forms:
All A Card Types (AO-A4)
• Sample ID
• Medium
• Method of Analysis
• Laboratory ID
• Date of Analysis
• Time of Analysis
• Sample Size
Card Type AO; External QC
• Pollutant Code
• Observed Concentration
• Expected Concentration (true value from EPA)
• Comments about Pollutant
• Sequence Number of Card
Card Type Al; Analyte Concentration
• Pollutant Code
• Concentration
• Method of Analysis Specific to Pollutant
• Not Detected/Trace/Qualitative Indicator
• Confirmation Indicator
• Percent Recovery (Dioxin only)
• Minimum Detectable Concentration (Dioxin only)
• Comments about Pollutant
• Sequence number of Card
Card Type A2: Internal QC
• Pollutant Code
• Observed Concentration
• Expected Concentration
• Method of Analysis Specific to Pollutant
• Internal Standard/LCS/Surrogate/Spike Indicator
• Percent Recovery (Dioxin only)
• Comments about Pollutant
• Sequence Number of Card
Card Type A3; Qualitative Analysis
• CAS No. or Descriptive Identification of Compound
• Match Score
• Analyst's Judgment of Identification
• Verification of Identification
• Comments about Pollutant
• Sequence Number of Card
121
-------
Card Type A4: Comments
• Method Blank Associated with Sample
• Any other Relevant Comments about Sample
• Sequence Number of Card
The External QC Form (card type A0) was filled out for all external QC
samples. The Analyte Concentration Form (card type A1) was filled out for all
field samples and for internal QC method blank samples. The Internal QC
Report Form (card type A2) was filled out for all other QC samples generated
internally by the laboratory. It was also filled out to report the surrogates
and internal standards which the analytical laboratories introduced into field
samples. The Qualitative Analysis Form (card type A3) was not filled out for
internal QC samples; it was completed only for analysis of semivolatile
organic compounds in water, soil, and sediment samples. The Comment Form
(card type A4) was required for all field samples, except Tenax samples,
because it was used to report the method blank associated with each sample.
The Comments form was also used to make comments about any samples, as deemed
relevant by the analytical laboratories.
GCA Data Management coded the External QC data on Card Type A0, while the
analytical laboratories were generally responsible for coding card types
A1-A4. Analysis Reporting Forms were provided to the analytical laboratories
as appropriate to their analysis responsibilities in the Love Canal Study.
Details on the analysis information provided by the analytical
laboratories are contained in the "Love Canal Study Coding Manual for Analysis
Data Reporting Forms," which is reproduced in Appendix G. This manual also
specified which fields within each form or card type were mandatory under the
various analysis situations.
Verification Action Reporting Forms—
Five basic types of Verification Action Forms (card types V0-V4) were
developed, to correspond directly to the five basic types of Analysis Data
Reporting Forms (card types A0-A4). Because these forms were used to enter
corrections to analysis data, the format of the V cards was almost identical
to the format of the corresponding A cards. The V cards, however, had one
additional field; this was the Verification Action Code, in Column 72. This
indicated whether the values entered in other columns represented additions,
deletions, or changes.
Only those fields involved in the corrective action were coded on the
Verification Forms; thus it happened only very rarely that all fields in a row
of coding would be filled out. The following information was coded:
• Sample ID
• Pollutant Code on Card Types V0, V1, or V2;
or CAS No. on Card Type V3
122
-------
• New Value (in any field or fields on form)
• Action Code (for Add, Delete, or Change)
Any number of Verification Action cards could be coded for a specific
sample, and these might include all types of actions. The section of this
report on Verification Procedures provides details on GCA's coding and use of
these forms. It also describes coding for samples for which no corrective
action was necessary.
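The add/delete/change mechanism can be pictured with a short sketch, again in modern Python and not part of the original system. The one-letter action codes, the dictionary layout, and the field names are assumptions; the report states only that column 72 carried an action code and that corrected fields were keyed by Sample ID and pollutant code (or CAS number).

    # Illustrative sketch only: apply verification actions to raw analysis records
    # keyed by (Sample ID, pollutant code).
    def apply_actions(records, actions):
        for act in actions:
            key = (act["sample_id"], act["pollutant"])
            code = act["action"]          # assumed codes: A = add, D = delete, C = change
            if code == "A":
                records[key] = dict(act["fields"])
            elif code == "D":
                records.pop(key, None)
            elif code == "C":
                records.setdefault(key, {}).update(act["fields"])
        return records

    if __name__ == "__main__":
        raw = {("A00001", "V01"): {"concentration": "12.0"}}
        fixes = [{"sample_id": "A00001", "pollutant": "V01",
                  "action": "C", "fields": {"concentration": "1.20"}}]
        print(apply_actions(raw, fixes))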
Coding Manuals
GCA Data Management prepared coding manuals to guide the Sample Bank and
the analytical laboratories in coding the Love Canal Reporting Forms for which
they were responsible. Data Management staff consulted with GCA Sample Bank
and analytical personnel throughout the preparation of the manuals. These
manuals, which were issued first in August 1980, were updated periodically
during the course of the project. The final versions of the "Love Canal Study
Coding Manual for Sample Data Reporting Forms" and the "Love Canal Study
Coding Manual for Analysis Data Reporting Forms" are reproduced in Appendix G.
Coding Manual Revisions—
Reasons for altering the coding manuals included:
• Addition of coding fields (e.g., "date to lab," "analysis," and
"well number" on card type S3);
• Addition of codes for certain fields (e.g., addition of
subcontracting laboratory codes);
• Deletion of codes for certain fields (e.g., deletion of method codes
625BC and 625DW on Analysis Data Reporting Forms);
• Change in coding units (e.g., change of water temperature from
°F to °C);
• Change in coding requirements (e.g., number of meteorological
observations required).
In general, a change in coding instructions did not imply that data
coded according to the old instructions were incorrect. That is, alterations
in coding instructions were generally issued at or before the first occurrence
of the coding of the information involved in the change. For example, new
laboratory ID codes were added when subcontracts were signed with the
laboratories and before any samples had been sent to them; recording units for
water temperature were changed to correspond to the units being measured in
the field and recorded by the Sample Bank; and the "Well Number" field was
added to the S3 card when the first ground water sample was coded.
123
-------
In three instances, however, changes in coding procedures did necessitate
changes in data already submitted to GCA and entered into the GCA raw data
base. These changes were made by GCA Data Management through additional
coding and computer programming. The first problem occurred when the "date to
laboratory" and "analysis" fields were added to the S3 card after data
processing had commenced. GCA's corrective action for the problem is
described in the section below on preprocessing of sample data. The second
problem occurred when it was realized that method codes 625BC and 625DW were
ambiguous and would have to be altered to properly designate the methods used
by the laboratories in analyzing the samples involved. GCA's corrective
action for this problem is discussed in the section of this report on
verification procedures. The final problem occurred when several compounds
were deleted from the list of compounds which EPA wanted the laboratories to
measure in the samples. The section on the verification edit program
describes GCA's corrective action for this problem.
Use of the Coding Manuals—
The "Coding Manual for Sample Data" was designed to guide the GCA Sample
Bank in its coding of the "Sample Data Reporting Form" (Form E01, for card
types S1-S6), and the "QC Cross-Reference Reporting Forms" (Form E02, for card
type Q).
The "Coding Manual for Analysis Data Reporting Forms" was designed to
guide both EPA and subcontracting analytical laboratories in coding the
"Analysis Data Reporting Forms." Forms covered by the manual were: Analyte
Concentration Forms J01-J08; Internal QC Forms J11-J25; the Qualitative
Analysis Form J31; and the Comments Form J41. Analysis Reporting Forms not
covered in the coding manual were Special Analysis Analyte Concentration Forms
J09 and J10, Special Analysis Internal QC Form J26, and the External QC Form
J60, all of which were coded only by GCA Data Management. Similarly, no
coding manual was prepared for the Verification Action Reporting Forms, since
these also were coded only by GCA Data Management. Coding of all forms not
covered in manuals is discussed below in the section on coding.
As well as serving as a guide for those responsible for coding
information on sample data and analysis data, the coding manuals served as a
reference for GCA and EPA staff who examined the data reported out by GCA Data
Management. That is, the manuals provided the means for interpreting codes
and for explaining the presence or absence of certain information under
various circumstances.
For the Analysis Data Reporting Forms, the Analysis Data Coding Manual
first provided instruction for coding columns 1-32, which contained the
information which was keypunched on all Analysis cards (card types A0-A4).
Then coding instructions were provided for each card type in turn, with the
coding of each field being described for each card type. Similarly, for the
Sample Data Reporting Form, the Sample Data Coding Manual first provided
instructions for coding columns 1-10, which contained the information to be
keypunched on all Sample Data Cards (card types S1-S6). Again, instructions
were then given for each field on each card type. Finally, instructions were
provided for coding each field on the QC Cross-Reference Reporting Form (card
type Q).
124
-------
The following categories of information were provided for each field on
each card type:
• Applicability—specified the circumstances in which the code had to
be entered;
• Recorded by—indicated the category of person responsible for
recording or assigning the information;
• Description—described the field;
• Units—specified reporting units, when applicable;
• Coding Instructions—indicated the proper use of columns and
characters, and gave acceptable codes;
• Examples—provided column-by-column examples.
Coding Procedures
This section describes the coding of the Sample Data, QC Cross-Reference,
and Analysis Data Reporting Forms. Coding of the Verification Action
Reporting Forms is described in the discussion of Verification Procedures in a
later section.
Sample Data Coding—
GCA Data Management had the overall responsibility for Sample Data
coding. However, since most coding took place at the Sample Bank, a GCA
Sample Bank staff member supervised coding onsite. Coding was done at the
Sample Bank rather than at Bedford because log books and other written records
maintained by the Sample Bank were the primary source of data, and onsite
personnel with first-hand knowledge of field procedures were consulted
continuously during the coding process.
GCA Data Management prepared the Sample Data Reporting Form and Coding
Manual during August 1980, and one Data Management staff member traveled to
the Sample Bank at the end of August to instruct the onsite supervisor in
coding procedures. The supervisor then trained the coders, who were hired
locally for this task. One full-time coder began work at the Sample Bank at
the beginning of September, and two additional coders were hired in
mid-October and worked for the last few weeks of operation of the Sample Bank
at Love Canal. GCA Data Management conferred with the coding supervisor daily
by telephone during this coding period, and on three occasions Data Management
staff traveled to the Sample Bank to observe coding procedures, solve
problems, and assist with coding.
Conceptually, the coding of sample data for water, soil, sediment, and
biota samples was a straightforward process. All data to be coded, with the
exception of planar coordinates and the depth of ground water samples, were to
be coded directly from the Master Logs maintained by the Sample Bank. Planar
125
J
-------
coordinates and ground water depths were maintained on cross-reference
sheets. In practice, many exceptions to the normal process occurred. For
example, log book entries were frequently incomplete, either because required
data had not been supplied by the sampling technicians, or because Sample Bank
personnel were too pressed for time to make all entries while packing samples
for shipment to the laboratories. Particularly at the outset of the project,
the coders often found that planar coordinates had not yet been plotted for
the sites being sampled. In all these instances, coders had to set aside
partially-completed code sheets until the information could be obtained, and
the coding supervisor spent considerable time in locating the missing data.
Changes which occurred during the coding process also interfered with
efficient coding. For example, designations of site codes were altered for
some sites near the beginning of the program and coders were required to go
back through completed forms to make the necessary changes. (In some
instances, these codes were changed back to their initial values, requiring
yet another set of changes.)
Coding of air samples was more complex and more fraught with problems.
Many data items required on the coding forms were not entered in the log books
and had to be located from other sources. In addition to making one-for-one
entries from log book to code sheet, and looking up the planar coordinates
from a reference sheet as was done for other samples, the coders had to
perform these extra steps for air samples:
• Refer to incoming chain-of-custody records to obtain the preparer's
lot number and the Preparer-to-Bank (or QC-to-Bank) chain-of-custody
number.
• Refer to Bank-to-Sampler chain-of-custody records to obtain this
custody data item.
• Refer to flow data sheet (provided by sampler) to obtain the air
pump number, start flow, and end flow for the sample, based on the
sampling date, site ID, and location at site.
• For indoor samples, refer to the "sample data sheet" (one sheet per
sample provided by the sampler) to obtain meteorological readings.
• For outdoor samples (and for indoor samples with missing information
on the sample data sheet), refer to meteorological data sheet
(provided by the sampler) to obtain meteorological readings.
• Derive relative humidity from wet/dry temperature readings, if not
derived by sampler.
• Convert temperature readings from °F to °C, if not converted by
sampler.
• Identify duplicate samples by finding log entries for "livingroom 1"
(L1) and "livingroom 2" (L2) for the same date, stratum, site ID,
and sampling medium (see the sketch following this list).
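Two of these steps, the temperature conversion and the pairing of L1/L2 duplicates, are simple enough to sketch. The fragment below is a modern Python illustration only, and the log entry field names are assumptions; the relative humidity derivation is omitted because the psychrometric table used in the field is not reproduced here.

    # Illustrative sketch only: convert field temperatures and pair L1/L2 duplicates.
    def f_to_c(temp_f):
        """Convert a Fahrenheit reading to Celsius."""
        return (temp_f - 32.0) * 5.0 / 9.0

    def find_duplicates(log_entries):
        """Return (L1, L2) entry pairs sharing date, stratum, site ID, and medium."""
        by_key = {}
        for entry in log_entries:
            key = (entry["date"], entry["stratum"], entry["site_id"], entry["medium"])
            by_key.setdefault(key, {})[entry["location"]] = entry
        return [(locs["L1"], locs["L2"])
                for locs in by_key.values() if "L1" in locs and "L2" in locs]

    if __name__ == "__main__":
        print(round(f_to_c(68.0), 1))       # 20.0 degrees C
        log = [{"date": "801015", "stratum": "1", "site_id": "S01",
                "medium": "TENAX", "location": "L1"},
               {"date": "801015", "stratum": "1", "site_id": "S01",
                "medium": "TENAX", "location": "L2"}]
        print(len(find_duplicates(log)))    # one duplicate pair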
126
-------
Because of these extra procedures, the coding of each air sample was a
time-consuming matter. Efficient coding was further hampered by the fact that
flow data sheets and met data sheets were frequently submitted very belatedly
by the sampler, and individual sample data sheets were often missing. Also,
because logging of air data often lagged, due to the high volume of samples
processed, identification of duplicate samples was frequently difficult.
In the preceding section, we mentioned that the coding forms and manuals
were updated periodically during the project, and that the fields labeled
"Date to Lab" and "Analysis" were added to the Sample Data Coding Forms after
coding was well underway. This particular change did not adversely affect the
coding process at the Sample Bank, however, since the addition of these data
to previously-coded samples was handled entirely by Data Management in
Bedford. The Sample Bank sent copies of coded log book pages to Bedford,
where the two new pieces of information were coded with the Sample ID for
each sample. A simple computer program was then written to insert the new
data into the appropriate spaces in the raw sample data records in the
computer.
Over the course of the sampling program many of the coding problems
eased, primarily through the efforts of the coding supervisor, who worked
persistently to obtain complete and timely data from the samplers. However,
initial delays created backlogs which were difficult to overcome. Measures
taken to speed up the rate of coding included the hiring of the two additional
coders at the Sample Bank in mid-October, and the transfer of some coding to
GCA Data Management in Bedford. All the biota coding and over half the air
coding were done at Bedford, using copies of log book pages and other
necessary records.
By the middle of November (2 weeks after the conclusion of the sampling
program), about 90 percent of all sample data had been coded. The coding of
the remaining samples, many of which had been held out initially because of
missing information, was completed by mid-December. Sample Data Reporting
Forms were coded for a total of 8714 samples, with the breakdown by medium as
follows:
Medium       Number of Samples Coded
Air          3090
Water        3296
Soil         1503
Sediment      353
Biota         472
Total        8714
QC Cross-Reference Data Coding—
Entries were made on the QC Cross-Reference Reporting Form for every set
of duplicate samples collected by the Sample Bank for Quality Control
purposes. It was originally expected that these sheets would be completed by
127
-------
coders at the Sample Bank concurrently with the coding of the Sample Data
Reporting Forms. That is, it was intended that, as duplicate and triplicate
samples were encountered and coded, entries would immediately be made on the
QC form. However, because of the press of time and because incomplete log
book entries made it difficult to identify sets of samples, the coding of the
QC Cross-Reference sheets was postponed until all Sample Data coding was
complete. (In fact, these data were not needed until late in the project,
since they were used in a computer program which accessed the verified or
validated data files rather than the raw data file.)
Coding of the QC Cross-Reference Reporting Form was done by the Coding
Supervisor, who examined each log book to identify sets of duplicate and
triplicate samples. For water, soil, sediment, and biota samples, the
identification was made directly from notes in the "comments" column in the
log book. Here the duplicate and triplicate status of the sample was
recorded, along with the sample tag number of other members of the set. For
air samples, identification was made just as it was when the Sample Data
Reporting Forms were completed. That is, duplicates were identified by
finding "L1" and "L2" entries for the same sampling date, site ID, and
sampling medium.
Analysis Data Coding—
Coding Performed by the Analytical Laboratories—The analytical
laboratories were generally responsible for coding all analysis results for
the Love Canal Study onto the Analysis Data Reporting Forms supplied by GCA
Data Management, and in accordance with the instructions in the "Coding Manual
for Analysis Data Reporting Forms." This involved coding analysis results for
1 to 300 pollutants per sample, using anywhere from 1 to 25 coding sheets to
complete the task for a single sample.
The laboratories were expected to code and submit to GCA the results for
the following types of samples:
• All field samples and field blanks sent from the Sample Bank;
• External QC samples sent from the Sample Bank;
• Internal QC samples sent from the Sample Bank for Tenax analysis;
• Duplicates of field samples created by the laboratories by running
second aliquots of the samples;
• Method blanks generated by the laboratories;
• Laboratory Control standards (LCS) and spike samples generated by
the laboratories;
• Results for surrogates and internal standards added to the samples
sent from the Sample Bank.
128
-------
With only a few exceptions, these requirements were met. A few samples
were never analyzed (because of such reasons as laboratory accidents or
expiration dates exceeded), and therefore no results were submitted. Also,
the Sample Bank sent some Tenax samples to the laboratories for use as
calibration standards; results for some of these were reported in the standard
format, while for others, no results were submitted. Results for a few
external QC samples were submitted to the GCA Quality Control Officer in
nonstandard format; these were coded by Data Management onto the appropriate
forms.
Analysis results for the Performance Evaluation (PE) samples sent to the
laboratories by the Sample Bank were to be submitted to the GCA Quality
Control Officer (as well as to EPA) in EPA reporting format. Some
laboratories also submitted these results on standard GCA reporting forms,
although this was not required. Initially, it was planned that these samples
would not be entered into the GCA data base. However, this decision was later
reversed by the Project Manager, and, as a consequence, any PE results
submitted by the laboratories only in EPA format were recoded by Data
Management onto the appropriate Love Canal forms.
A final deviation from the standard submission process occurred in the
case of the EPA Laboratory at Las Vegas. This laboratory, which was
responsible for all radiation analyses, submitted results to GCA Data
Management on a computer tape in GCA standard format. For these samples, the
steps of keypunching and key verification were by-passed.
Laboratories identified results by Sample ID number according to the
following plan:
• All samples sent from the GCA Sample Bank were coded with the Sample
ID assigned by the Sample Bank.
• Duplicates created by the laboratories by running second aliquots
were labeled with the GCA Sample ID number, except that the first
character of that ID was changed from its initial value to "2" (see
the sketch following this list).
• Reports on internal standards and surrogates were coded with the
Sample Bank ID number of the sample involved.
• Samples created internally by the laboratories were assigned numbers
by the laboratories from a list of reserve numbers provided by GCA
Data Management. A list of the numbers assigned to each laboratory
is shown in Table 8.3.
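A minimal sketch of this labeling plan follows; it is a modern Python illustration, not part of the original system, and the helper names are invented. The "2" prefix rule and the idea of laboratory reserve blocks come from the list above and from Table 8.3.

    # Illustrative sketch only: form a duplicate ID and test a reserve-block ID.
    def duplicate_id(bank_id):
        """ID for a second aliquot run by a laboratory: first character becomes 2."""
        return "2" + bank_id[1:]

    def in_reserve_block(sample_id, low, high):
        """True if a laboratory-generated ID falls within that laboratory's block."""
        return low <= sample_id <= high      # works because IDs are fixed width

    if __name__ == "__main__":
        print(duplicate_id("A00123"))                          # 200123
        print(in_reserve_block("Q00042", "Q00001", "Q01000"))  # True (Acurex block)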
Shortly after the coding forms and manuals were distributed to the
laboratories in September 1980, GCA Data Management called each laboratory to
see if they had any questions about coding procedures. Many questions were
answered at this time, but most of the laboratories continued to contact GCA
Data Management as questions arose throughout the course of the project. In
addition, GCA Data Management called the laboratories if any data forms were
129
-------
TABLE 8.3. RESERVE LIST OF SAMPLE ID NUMBERS FOR SAMPLES
GENERATED INTERNALLY BY ANALYTICAL LABORATORIES

Reserve numbers     Laboratory
Q00001-Q01000       Acurex Corp.
Q01001-Q01500       Battelle Columbus Labs
Q01501-Q02000       Energy Resources Co.
Q02001-Q03000       Compuchem/Mead
Q03001-Q03200       Gulf South—Lafayette
Q03201-Q04300       Gulf South—New Orleans
Q04301-Q04500       (Not assigned)
Q04501-Q04600       Midwest Research Institute
Q04601-Q06000       PJB Laboratories
Q06001-Q07000       Southwest Research Institute
Q07001-Q07500       PEDCo Environmental
Q07501-Q07800       EPA-EMSL-Cincinnati
Q07801-Q08000a      EPA-EMSL-RTP-for Tenax
Q08001-Q08500       EPA-EMSL-RTP-for Hi-Vol
Q08501-Q08550a      EPA-ERL-Duluth
Q08551-Q08600a      EPA-HERL-RTP-for Dioxin
Q08601-Q09000       Advanced Environmental Systems
Q09001-Q09110       TRW, Inc.
Q09111-Q09149       Wright State University
Q09150-Q09199       GCA
Q09200-Q09500       EPA-Corvallis

aNo numbers were actually used from these sets.
130
-------
filled out incorrectly. These calls were made both to obtain necessary
information for the samples involved and to attempt to prevent similar coding
errors in subsequent submissions.
Most of the systematic coding problems were quickly cleared up, but a few
lingered. A major problem was that some laboratories persisted in failing to
identify the method blanks associated with samples being reported. (GCA's
coding system for method blanks was poorly designed and contributed to this
particular problem.) Some other problems stemmed from coders' apparent lack
of familiarity with data entry requirements. More than one character might be
squeezed into one box on the coding form, for example, or a decimal point
might be placed on the line between two boxes. Such entries could not be
keypunched. The use of subscripts, superscripts, and Greek characters was
also troublesome to keypunchers. Other errors in coding presented no trouble
in keypunching, but caused problems of interpretation once entered into the
computer. For example, comments which ran from one line to another on the
analyte form might be separated from each other when sorting of cards took
place; or a comment which the coder entered on one line on a form, but which
applied to all lines on the form, ended up, of course, on only one card.
GCA attempted to catch such errors through visual inspection of coding
forms before the forms were sent to keypunch, and we also attempted to
describe these problems, and ways of avoiding them, to the laboratory coders.
A coding training session for coders from all participating laboratories would
have been a valuable aid in preventing such problems.
Before laboratories submitted their coding forms to GCA, they made two
copies, one of which was submitted to EPA, and one of which was retained in
the laboratory files. The copies retained by the laboratory were to provide
back-up if loss occurred during transit (which, fortunately, it never did),
and they were to serve as a reference for the laboratories if questions arose
about the data they had submitted. GCA Data Management did, in fact, call
upon the laboratories to check certain file copies from time to time. This
was particularly important in the few instances in which the Sample ID number
was entered incorrectly on the reporting form, since in these cases the
laboratory log books could not have yielded the correct information required.
The analytical laboratories sent their Analysis Reporting Forms to GCA
via Federal Express for prompt and guaranteed delivery. (ERCO, situated
near GCA, used regular mail service.) All forms were sent to the attention
of GCA's Love Canal Document Control Officer, who logged them in by sender,
date, package number, and general contents, before handing them over to Data
Management. The Data Management log system for coding forms is described in a
separate section below.
Coding Performed by GCA Data Management—GCA Data Management assumed
responsibility for coding the following data:
• PE data submitted in EPA format only;
• Data for all samples analyzed by GCA;
131
-------
• A set of results submitted by PJB Laboratories for samples extracted
by New York State;
• All data submitted by the EPA Laboratories at Ada, Oklahoma, and at
Corvallis, Oregon;
• TSP data for HiVol samples;
• All data from Wright State University (WSU) and Advanced
Environmental Systems (AES);
• Dioxin data submitted by EPA-HERL at RTP;
• All data on the External QC Reporting Forms (Form J60, card type A0);
• Certain Comments Forms (card type A4).
We mentioned above that the laboratories were not required to submit PE
data on standard Love Canal Reporting Forms. If results were submitted in
nonstandard form, GCA Data Management coded them onto the appropriate Analysis
Reporting Forms. Since the true values were not available to GCA for these
samples, the standard reporting form used was the Analyte Concentration Form
(card type A1). GCA coded these forms in accordance with the "Coding Manual
for Analysis Data Reporting Forms."
Similarly, procedures in the coding manual were followed in the coding of
GCA samples, the PJB/NYS-extract samples, the samples from the EPA
Laboratories at Ada and Corvallis, and the TSP data for HiVol samples. GCA
assumed responsibility for coding the PJB/NYS-extract data because results for
these 21 samples were submitted in nonstandard form. This happened because it
was not clear from the outset whether or not these samples were to be entered
into the GCA data base. EPA resolved this by requesting GCA to enter these
data into the GCA raw data base, but not verify them. GCA Data Management
coded the samples from Ada and Corvallis as a time-saving measure. These
samples were ready for coding only a few days before the March 18, 1981,
deadline for data validation. Since these data had to be passed through the
raw data and verified data stages before validation could take place, haste
was necessary. Because the two laboratories were not acquainted with Love
Canal coding procedures, it seemed expedient to perform the coding at GCA.
Therefore, the results were sent to GCA in nonstandard form via Telex. TSP
data for HiVol samples were coded by GCA Data Management because these data
were submitted to GCA via Telex in nonstandard form after the Analysis
Reporting Forms had been submitted.
GCA's reasons for coding the WSU (dioxin), AES (TOX, TOC), and
EPA-HERL-RTP (dioxin) data were different. At the outset of the program we
were uncertain as to the data fields which would have to be coded for the
special analyses performed by these laboratories. As it happened, some extra
fields were required, and when this was established, the coding manuals had
already been issued. Rather than updating the analysis data coding manual to
132
-------
cover these special fields, the decision was made to code the data at GCA.
This decision seemed particularly reasonable from GCA's viewpoint, because
very little data (only one to four pollutants) were reported for each of the
samples involved.
The Analyte Concentration Form J09 (card type A1) and the Internal QC
Form J26 (card type A2) were designed specifically for reporting Total Organic
Carbon (TOC) and Total Organic Halide (TOX) data submitted by AES. The fields
on Form J09 were no different from the fields on Analyte Forms J01-J08, but
the columns were arranged differently on J09, so that more than one sample
could be reported on each sheet, provided that the date of analysis remained
constant. TOC and TOX (pollutant codes D01 and D02) were reported in µg/l
(ppb), as were other pollutants in water. Conductivity (pollutant code C02)
was reported in ohms, and pH values (pollutant code C01) were expressed in
standard form. The Internal QC Reporting Form J26 had one more field than did
the other Internal QC Forms (J11-J25). This field was "% Recovery," which was
used to report the percent recovery of the surrogate reported on Form J26.
This was coded in columns 66-69, with the decimal point indicator falling
between columns 68 and 69. More than one sample could be coded on this form,
and the analysis dates did not have to remain constant.
The Analyte Concentration Form J10 (card type A1) was designed for
reporting dioxin (TCDD) results as submitted by WSU and by EPA-HERL-RTP. Two
more fields appeared on this form than appeared on other A1 forms (J01-J09).
"Percent Recovery" of the internal standard (37Cl4-2,3,7,8-TCDD) was coded
as a whole number, right-justified, in columns 56-58. "Minimum Detectable
Concentration" was coded in parts per billion (ppb) in columns 59-63, with the
decimal point indicator falling between columns 60 and 61. Concentrations of
dioxin (pollutant codes D03 and D04) were also expressed in ppb. More than
one sample could be reported on Form J10, provided that the analysis date
remained constant.
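The implied decimal points on Form J10 can be made concrete with the following sketch, a modern Python illustration only. The column positions for the two fields are taken from the paragraph above; the surrounding card contents in the example, and the treatment of blanks, are assumptions.

    # Illustrative sketch only: decode the two extra Form J10 fields from a card image.
    def decode_j10_extras(card):
        card = card.ljust(80)
        recovery = int(card[55:58])          # "% Recovery": whole number, columns 56-58
        mdc_field = card[58:63]              # "Minimum Detectable Concentration", cols 59-63
        mdc_ppb = int(mdc_field) / 1000.0    # implied decimal point after column 60
        return recovery, mdc_ppb

    if __name__ == "__main__":
        card = " " * 55 + " 85" + "00125"    # invented card: recovery 85%, MDC 0.125 ppb
        print(decode_j10_extras(card))       # (85, 0.125)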
GCA's other major coding effort was the completion of the External QC
Reporting Form (card type A0) for all external QC samples. GCA assumed this
responsibility since the laboratories were not provided with the true values
for these samples. Coding of this form was not covered in the Analysis Data
Coding Manual, but it was completed in a manner similar to the completion of
the Internal QC Reporting Forms. The major difference between the two form
types was that the External Form required neither the "specific method"
designation nor the indicator for type of QC sample. Thus the comments field
on the A0 form extended from column 57 through column 72. True values were
coded in the "Expected Concentration" field. More details on GCA's coding and
processing of external QC samples are provided in a later section of this
report.
Finally, GCA coded Comments Forms (card type A4) in certain
circumstances. First, if a laboratory submitted information about method
blanks in separate lists, or over the phone, GCA Data Management coded the
required A4 forms. Second, GCA sometimes added comments about the results
submitted by laboratories. This was most frequent for External QC samples,
133
-------
where a comment was added if the GCA Quality Control Officer felt the QC
sample was inappropriate or if conversion of units was required to bring
reported results into conformity with the true values. Finally, GCA coded
comment cards to explain the absence of results for some samples. This was
done if the sample was voided by the Sample Bank or by the analytical
laboratory, or if no results were expected because the sample was a back-up
duplicate whose mate had already been analyzed. These comment cards were
added for two purposes: to provide information about the lack of results; and
to "close" the sample from the perspective of the data handling system.
(Closure was implied because once any "A" cards were attached to a sample, it
was assumed that analysis results for that sample were completed. In fact,
laboratories frequently submitted results in piece-meal fashion and, as a
result, the closure concept was violated.)
Any forms coded by GCA Data Management were subsequently treated in the
same manner as were the laboratory-coded forms. They were entered into the
raw data system, and raw data reports on the samples involved were sent for
verification to the laboratories which had originally submitted the results or
who had custody of the sample.
RAW DATA PROCESSING
Raw data processing for a sample was initiated by the receipt at GCA,
Bedford, of Sample Data or Analysis Data Reporting Forms. Raw data processing
activities included logging and inspecting the reporting forms, keypunching
and key-verifying them, edit-checking the punched cards in the computer,
solving any apparent problems, entering the new data into the master raw data
file, and generating computer reports on the new raw data. Each of these
activities is described below.
Log System for Data Reporting Forms
All Sample Data and Analysis Data reporting forms received by GCA Data
Management were individually logged in when they arrived, and their path to
the computer was also tracked in the log book. The Love Canal Data Clerk was
responsible for the entire log system for data forms. Figure 8.3 illustrates
the form which was designed for the study.
Rationale for the Log System—
The log procedure for data reporting forms was designed as part of the
document control program for the Love Canal Study. The code numbers on the
top left corner of the data forms were arranged in accordance with this
document control system. In addition to the required document numbers, each
form also carried a serial number which Data Management used in logging and
tracing the data sheet.
Each data sheet was logged individually, because during the data
preparation phase individual sheets were often separated from their original
packages. The system was designed so that these separated sheets could be
re-filed with their proper packages when desired, and so that the loss of
individual data sheets could be detected. It was important, for example, to
134
-------
LOVE CANAL DATA REPORTING FORM LOG
(Log form with columns for: Serial Nos.; Received from Lab (Date, Package No.); Sent to Keypuncher; Rec'd from Keypunch; Date to Computer; and Comments.)
Figure 8.3. Love Canal Data Reporting Form Log.
135
-------
be sure that every sheet sent out for keypunching was actually returned to
GCA. The system was also intended as a mechanism for tracking down individual
data sheets if questions about their contents arose at any point.
As Figure 8.3 illustrates, the serial number was used as the key
identifier in the log system. When the log system was designed, the
advantages and disadvantages of using the serial number as the key were
weighed against the advantages and disadvantages of using the Sample ID
number. The Sample ID number appeared preferable for many reasons,
particularly since all data in the computer system were referenced by this
number. We judged, however, that logging large numbers of data sheets by
Sample ID number would prove unmanageable. We imagined that a batch of analysis
results would include samples with widely varying ID numbers, each of which
would have to be entered individually. On the other hand, we imagined that in
a particular batch of forms, each type of form might be sequentially numbered,
and therefore we could enter the serial numbers as a range. Because we were
expecting to make multiple entries for approximately 50,000 data sheets over a
4-month period, the option of logging by range of numbers was judged as
critical. Hence, the data reporting form log system was keyed to the serial
number on the form, and, in most instances, we achieved the advantage we
sought. The Sample Bank and most laboratories did send their forms in batches
which were sequentially numbered by serial number.
The Log Entry System—
The data form log book was set up as a loose-leaf notebook with separate
sections for the Sample Bank and for each of the EPA and subcontractor
laboratories which submitted data forms to GCA. Within each laboratory's
section, a separate page (or set of pages) was maintained for each type of
"J-form" used (J01-J08, J10-J25, J31, J41).
When a package of forms was received, information entered into the log
book for each sheet included the serial number, the date received, and the
package number which had been assigned by the Love Canal Document Control
Officer.
The next entry was made when the forms were sent to a keypuncher; this
entry included an indication of which keypuncher (in-house or external) was
selected, the date the forms were sent to the keypuncher, and the batch number
of the forms shipment. (Generally the package number was used also as the
batch number.) Each data sheet was individually logged by date when it was
returned from the keypuncher, and the final entry was made in the log book
when the cards associated with the code sheets were entered into the
computer. When the data sheets had passed through the Data Management log
system, they were transmitted to the Love Canal Document Control officer for
filing by package number.
It should be emphasized that while we saved a great deal of time by
entering ranges of numbers in the log book we were careful to make sure that
every single data sheet included in that range was present at every step of
the logging process. Further, for multiple-page forms, we checked to be sure
that every page of each set was present at each logging step. If a sheet
136
-------
which was first entered as part of a range was later separated from its
package, an asterisk was placed in the "Serial No." column, and notes about
this sheet were made on the reverse side of the form.
Use of the Log System—
The log book was used routinely to keep track of the Love Canal work load
of the in-house and external keypunchers, so that decisions could be made on
which keypuncher should receive the next packages of data. Also, the log book
was reviewed regularly to make sure that batches of data had been returned
from the keypunchers within a reasonable period of time. When a package had
been out for a week, the data clerk contacted the keypuncher about it.
A very frequent and crucial use of the log book occurred when it was
necessary to check computer reports of data against the data sheets which were
submitted to GCA. From the computer report we could ascertain the date on
which the data were entered to the computer, as well as the laboratory and the
type of analysis performed. Using this information we could turn to the
correct laboratory and "J-Form" page in the log book, and use the "Date to
Computer" column to isolate the package (or packages) which could have
contained the desired sample. We could then go directly to the files of data
forms to look up these packages.
The log system was, of course, convenient to use when questions arose
about data sheets with particular serial numbers. This type of question was
most commonly raised by the originating laboratory. We were able to look up
the package number associated with the serial number, and go directly to the
package files to find the desired sheet.
We also learned that when we made the initial log entry, we should check
to make sure that the same serial numbers had not been submitted in a previous
package. Some laboratories had a habit of re-submitting some forms, either
because some data sheets "belonged" with both packages (e.g., method blanks),
or because they used this as a mechanism for submitting revisions to their
data. By using the log book to catch and analyze double submissions before
keypunch, we saved a great deal of confusion and effort later on.
Problems with the Log System—
The major problems with the data log system occurred when data sheets for
a single sample were distributed through more than one package, or when Data
Management processed sheets for the same sample on different dates. In either
case, it became quite difficult to track down a desired data sheet. A logging
(and filing) system keyed to the Sample ID Number would have alleviated this
problem, but as was pointed out above, such a system was not practical.
One improvement to the actual log system could also have helped somewhat
in alleviating the problems. Namely, if the serial number and/or the package
number had been keypunched and entered as part of the computerized data, then
tracking down data sheets would have been a much simpler matter.
137
-------
Computer Reference Files
Before discussing the details of the processing of raw data, it is
appropriate to describe three reference files which were accessed by the
system software during processing of sample data and analysis data. Two of
these (COORDS and POLL XREF) could be updated independently of the flow of raw
data, while the third (ID XREF) was generated during raw data processing. The
coordinate reference file (COORDS) was used to update the planar coordinates
for sample data, and the pollutant code reference file (POLL XREF) was used to
check the validity of pollutant codes on analysis data and to label output
reports. The ID cross-reference file (ID XREF) was used to make sure that
sample data were not entered into the system more than once, that sample data
were present before analysis data were present, and that each sample was
verified only once.
Coordinate Reference File—
The sample data coding plan called for coders to enter planar coordinates
on the SI card for each sample coded. For ground water samples, coordinates
which were accurate to a tenth of a foot were also entered on the S6 card.
This system was problematic for two reasons: first, coordinates were not
always available at the time of coding; and second, some coordinates were
changed during the course of the study. Therefore, we created the coordinate
reference file (shown as Item 2 on Figure 8.1) so that coordinates could be
added to sample data that had already been entered into the raw data system.
After this file was created, coders were no longer required to enter the
planar coordinates on the sample data report forms.
The coordinate reference file is listed in Appendix H. The data fields
in the file include:
• Stratum
• Site ID
• Easterly coordinate
• Northerly coordinate
• Well number for ground water samples or medium for biota samples or
for soil samples taken from house sites (this field is blank for all
other samples).
The medium or well number field was necessary because at some sites sampling
took place at more than one location, and thus more than one set of
coordinates was available. For instance, at a home site, sampling for air and
water inside the house would require one set of coordinates, while soil
sampling in the yard would require a different set of coordinates. For ground
water, two wells were typically located on the same site, and each had its own
set of coordinates.
138
-------
When a file of sample data was run through the program which inserted the
coordinates into the coordinate field(s), the first requirement was that the
stratum and site ID on the sample card had to match the stratum and site ID of
an entry in the coordinate file. Further, for ground water, coordinates were
applied only if the well number of the sample matched a well number in the
coordinate file. For biota samples, coordinates were applied only if the
medium in the coordinate file specified "BIO1". For soil samples, if a set of
coordinates with the medium specified as "SOIL" were available, these were
applied; otherwise, coordinates with the final field left blank were used.
For air samples, sediment samples, and water samples which were not ground
water, this final field was required to be blank. If the proper combination
of stratum, site, and medium could not be made, no coordinates were inserted,
and an error message was printed.
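The selection rules just described can be summarized in a short sketch, written in modern Python for illustration only; the record layout, the dictionary keys, and the medium code strings other than those quoted above are assumptions.

    # Illustrative sketch only: look up planar coordinates for a sample following
    # the matching rules described in the text. A None result corresponds to the
    # case in which no coordinates were inserted and an error message was printed.
    def match(coords, stratum, site_id, extra):
        for c in coords:
            if (c["stratum"], c["site_id"], c["extra"]) == (stratum, site_id, extra):
                return c["east"], c["north"]
        return None

    def find_coordinates(coords, sample):
        s, site, medium = sample["stratum"], sample["site_id"], sample["medium"]
        if medium == "GROUND WATER":                 # well number must also match
            return match(coords, s, site, sample.get("well", ""))
        if medium == "BIOTA":                        # only a medium-tagged entry applies
            return match(coords, s, site, "BIO1")
        if medium == "SOIL":                         # prefer a SOIL entry, else a blank one
            return match(coords, s, site, "SOIL") or match(coords, s, site, "")
        return match(coords, s, site, "")            # air, sediment, other water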
The coordinate file was first used after a large portion of sample data
had already been entered into the system and was residing in the temporary
verification (TEMPVER) file (step 13 in Figure 8.1). Therefore the entire
TEMPVER file was processed with the coordinates file so that all samples
processed up through that date would have correct coordinates. After that,
all sample data were processed through the coordinates application program at
the time of data editing (step 3 in Figure 8.1).
The coordinates file was updated as necessary during the remainder of the
study. Coordinates were added for new sites, and some coordinates were
changed for existing sites. When coordinates were changed, this required
changing the coordinates in the sample data for samples taken at these sites
and already entered into the system. This was done through the sample data
verification process (step 14 in Figure 8.1) if the sample had not already
been verified, or it was done through the verification correction process
(step 32 of Figure 8.1) if the sample had been verified.
Pollutant Code Reference File—
The pollutant code reference file was established at the outset of the
study, and it was updated as needed throughout the study. It is listed in its
final form in Appendix H. The file contained the following information for
each pollutant identified by EPA for the study:
• Pollutant Name
• CAS Number
• Three-character ID code assigned by GCA
• Alert level concentration.
The first character of the three-character code assigned by GCA was
designed to indicate the type of medium and/or pollutant involved. The
characters were assigned as follows:
139
-------
A - Acid portion of semivolatiles
B - Base portion of semivolatiles
C,D - Conductivity, dioxin, and other special analyses
F - Foam plugs; pesticide analyses
I - Inorganics
P - Pesticides in water, soil, sediment and biota
S,H - Surrogates, spikes, internal standards, LCS compounds
T - Tenax; volatile analyses
V - Volatiles in water, soil, sediment and biota.
The pollutant reference file, which is item 6 on Figure 8.1, was accessed by
four computer programs in the data handling system. It was used by the
analysis data edit program (ALEDIT) and the verification action data program
(VEREP) (steps 7 and 24 in Figure 8.1) to check that the pollutant codes
entered on the analysis data and verification action report forms were valid.
If pollutant codes on these forms could not be located in the POLL XREF file,
the edit programs printed an error message.
The POLL XREF file was used for two purposes by the RAWSUM and by the
VERFILE programs (steps 15 and 27 on Figure 8.1). When these programs were
preparing the raw data listings and verified data listings, they searched the
POLL XREF file for the CAS number and pollutant name of each pollutant code
encountered and printed all three pieces of information on the output
reports. Finally, the alert levels in the POLL XREF file were used by the
RAWSUM and VERFILE programs to prepare the Alert Reports for raw and verified
data. Each quantified concentration for each sample was checked against the
alert level for the appropriate pollutant in the POLL XREF file. Entries were
made on the alert report for each concentration exceeding the specified alert
level.
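The alert check lends itself to a brief sketch, shown below in modern Python for illustration only; the dictionary structure of the reference file and the example pollutant entry are assumptions.

    # Illustrative sketch only: flag quantified concentrations that exceed the
    # alert level carried in the pollutant code reference file.
    def alert_report(results, poll_xref):
        """results: iterable of (sample_id, pollutant_code, concentration)."""
        alerts = []
        for sample_id, code, concentration in results:
            ref = poll_xref.get(code)
            if ref and ref["alert"] is not None and concentration > ref["alert"]:
                alerts.append((sample_id, code, ref["name"], concentration))
        return alerts

    if __name__ == "__main__":
        xref = {"V01": {"name": "EXAMPLE VOLATILE", "cas": "00-00-0", "alert": 5.0}}
        print(alert_report([("A00001", "V01", 12.0)], xref))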
ID Number Cross-Reference File—
The Sample ID Number cross-reference file (ID XREF) was built up during
the course of raw data processing. Every ti
-------
referred to the ID XREF file to make sure that sample data for a single sample
were entered into the raw data system only once. The analysis data edit
program (ALEDIT) checked the ID XREF file to be certain that sample data were
already present for all analysis data being edited. (If sample data were not
present, then processing of the analysis data had to be postponed.) The
program which applied verification changes to the raw analysis data (VERAP;
Item 26 on Figure 8.1) also accessed the ID XREF file, and changed the initial
character of the Sample ID to a "V" to indicate that the sample had been
verified. Finally, the edit program for verification action cards (VEREP)
referred to the ID XREF to make sure that each sample was only verified once.
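The three guards supported by the ID XREF file can be pictured with the following sketch, again a modern Python illustration rather than the original implementation; the in-memory set and the method names are assumptions.

    # Illustrative sketch only: the three checks supported by the ID cross-reference file.
    class IdXref:
        def __init__(self):
            self.ids = set()

        def add_sample(self, sample_id):
            """Sample data edit check: a sample may be entered only once."""
            if sample_id in self.ids:
                return False
            self.ids.add(sample_id)
            return True

        def sample_present(self, sample_id):
            """Analysis data edit check: sample data must already be present."""
            return sample_id in self.ids

        def mark_verified(self, sample_id):
            """Verification step: flag the ID so the sample is verified only once."""
            self.ids.remove(sample_id)
            verified = "V" + sample_id[1:]
            self.ids.add(verified)
            return verified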
Pre-Processing of Sample and Analysis Data
Pre-processing of sample data and analysis data was comprised of a set of
activities designed to make the data as complete and correct as possible
before they were merged into the master raw data file. Sample data and
analysis data passed through separate but parallel pre-processing paths.
Visual Inspection and Correction—
After the data reporting forms were logged in by the data clerk, they were
visually inspected by the raw data staff for omissions and errors. On sample
data sheets, QC samples were checked for accompanying QC comment cards (S5
card); air samples were checked for accompanying flow data and met data; all
samples were checked for the presence of the stratum, site ID, sample date,
and other common information. On analysis sheets, analyte forms were checked
for the presence of a concentration or a results indicator, internal QC forms
were checked for the presence of a QC-indicator; qualitative analysis sheets
were checked for the presence of CAS numbers; and comment sheets were checked
for serial numbers. A check was also made for valid codes on the data sheets;
it was particularly common for lab names and analysis methods to be mis-coded,
shifted, or missing.
In many instances the errors detected during this rough inspection could
be corrected by GCA Data Management without contacting the originator of the
data form. Shifted codes, mis-coded media, or missing card sequence numbers,
for example, could be easily fixed by GCA. For missing fields, however, calls
to the Sample Bank or the laboratories were necessary. Sheets with problems,
or packages with numerous errors, were set aside until answers to questions
could be obtained.
Keypunch and Key-Verification—
Once the data sheets were cleaned up, they were returned to the data
clerk for logging to keypuncher. Keypunching was done either in-house at GCA
or by an external agency which was selected for the Love Canal work. Any data
sheets with special instructions were routed to the in-house keypuncher so
that verbal instructions could be given, and questions could readily be
answered. The external keypuncher also called GCA if any questions in
interpretation arose, and they flagged any data sheets which were illegible or
which had obvious errors (e.g., missing Sample ID). Thus the keypunch stage
itself served as a further net for error detection. Computer cards were
141
-------
key-verified as well as keypunched before they were returned to Data
Management. Data forms were logged in by the data clerk as the next activity
in the raw data pre-processing phase.
Selection of Analysis Data for Processing—
The next step for sample data was the entry of the cards into the computer,
but analysis data went through one more step before they were submitted to the
computer. This step was to select for processing those analysis data for
which sample data were already present in the computer. The data handling
system was set up on the premise that sample data would always precede
analysis data into the raw data system. This assumption was based on prompt
sample data coding, and a 30-day analysis period before analysis data were
submitted. In fact, the coding of sample data was very delayed in many cases,
and laboratory analysis was sometimes completed quite swiftly. Therefore it
was extremely common for analysis data to arrive at GCA prior to the arrival
of the sample data. To prevent mismatching of sample data and analysis data
it was necessary to make the manual search for acceptable analysis data. The
search was quite time-consuming, since it involved checking the Sample ID
cross-reference list against the analysis data reporting forms, and then
pulling out the appropriate computer cards for selected samples. We tolerated
this system for some time, but in December, when some sample data were still
not coded and the volume of incoming analysis data ballooned, we made
adjustments in the computer programs to handle the problem. When the raw data
editing program was run, analysis data which could not be matched with sample
data were set aside in a separate file, where they were held until the next
raw data run (step 9 in Figure 8.1, "Data Flow Diagram").
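The automated replacement for the manual search can be sketched as a simple partition, shown below in modern Python for illustration only; the card layout assumption (Sample ID in columns 1-6) follows the earlier description, and the variable names are invented.

    # Illustrative sketch only: hold back analysis cards whose sample data are not
    # yet in the raw master file; held cards are re-read on the next run.
    def partition_analysis(cards, known_sample_ids):
        accepted, rejected = [], []
        for card in cards:
            sample_id = card[0:6].strip()      # columns 1-6 carry the Sample ID
            if sample_id in known_sample_ids:
                accepted.append(card)
            else:
                rejected.append(card)
        return accepted, rejected

    if __name__ == "__main__":
        known = {"A00001"}
        ok, held = partition_analysis(["A00001H2O ...", "A00002H2O ..."], known)
        print(len(ok), "accepted;", len(held), "held for the next run")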
Data Editing—
Analysis data selected for processing, as well as all sample data, were
next submitted to the computer for processing in their respective edit
programs (ALEDIT and SAMEDIT). These are shown as steps 7 and 3 in Figure
8.1. A more detailed illustration of the edit portions of the data flow
system is also shown in Figure 8.4.
The edit programs checked to make sure that required information was
present, and that all information present was acceptable. For example, the
Sample ID and medium were mandatory on all cards, and error messages were
printed if these were missing. The ID number, the medium, and many other
fields were allowed only specific ranges or values, and error messages were
printed if unacceptable data were present. Error messages were also printed
if data fields were in conflict with one another (e.g., if a concentration was
supplied on an analyte form and the "Not Detected" column was also marked).
In some cases the edit programs made changes in the data. For example, the
correct code for the medium "Water" was "H2O," where the final character was
alphabetic. Frequently this character was keypunched as a zero. This
particular error was automatically corrected by the edit program. A complete
list of edits and checks made by the ALEDIT and SAMEDIT programs is given in
the Appendix.
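The flavor of these edit checks is captured by the sketch below, written in modern Python as an illustration only; the field names, the list of valid media, and the exact messages are assumptions, while the zero-for-O repair in the water code follows the example in the text.

    # Illustrative sketch only: a few edit checks of the kind made by the edit programs.
    VALID_MEDIA = {"AIR", "H2O", "SOIL", "SED", "BIOTA"}

    def edit_check(record):
        errors = []
        if not record.get("sample_id"):
            errors.append("SAMPLE ID MISSING")
        medium = record.get("medium", "")
        if medium == "H20":                   # zero keypunched in place of the letter O
            medium = "H2O"                    # automatic correction
            record["medium"] = medium
        if medium not in VALID_MEDIA:
            errors.append("INVALID MEDIUM: " + repr(medium))
        if record.get("concentration") and record.get("not_detected"):
            errors.append("CONCENTRATION PRESENT BUT NOT DETECTED ALSO MARKED")
        return errors

    if __name__ == "__main__":
        rec = {"sample_id": "A00001", "medium": "H20", "concentration": "1.2"}
        print(edit_check(rec), rec["medium"])   # [] H2O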
142
-------
Figure 8.4. Data editing portion of the raw data flow system.
143
-------
The printout from each of the edit programs included:
• List of cards as submitted to the program
• List of error messages
• Cumulative summary of the number of cards of each type in the run
• Cumulative list of cards in the edit run, sorted by Sample ID, card
type, and sequence number, with each card identified by line number
in the temporary data file.
The raw data team examined each error message and made any changes necessary.
As with the visual inspection, many errors could be fixed immediately, while
others required tracking down necessary information. Frequently the data team
traced back to the data reporting form to see if the error message resulted
from a keypunching error. If the data had been correctly punched, then
telephone calls to the data originators were necessary to resolve the problems.
Corrections were made to the data through a PANVALET update process.
Correction cards were prepared on the basis of the line number of the
incorrect data card in the temporary data file. When these cards were
submitted to the computer, additional raw data cards could be added to the
deck. When the edit program was run again the error list would be based on
the new cards as well as on the cards previously entered. This cycle was
repeated until all data cards were thoroughly edited.
All data cards were accumulated in temporary data files (see Figure 8.4)
until the raw data manager elected to proceed to the next phase of raw data
processing (usually a cycle lasted about one week). All edited sample data
were accumulated in one temporary file. At the outset of the study, all
edited analysis data were accumulated in one temporary file, but in a
modification designed to replace manual selection of analysis data, the
analysis data were later separated into two files. One of these held data for
which sample data were present in the raw master file, and the other file held
"rejected" analysis data, for which sample data were not present yet. The
"rejects" file was reprocessed each time the ALEDIT program was run. This is
illustrated in Item 2 of Figure 8.4b.
A slight exception to the usual processing pattern occurred when the EPA
lab at Las Vegas submitted radiation results. In this instance data were
submitted on computer tape, so that the input to the sort program was the
computer tape instead of the data cards. A special editing procedure was also
carried out on these data by the LCFIXRD program. This program converted all
concentration measurements from the normalized form in which they were
reported to the standard Love Canal reporting format.
Problems Encountered during Pre-Processing—
The major problem experienced during pre-processing of data was already
described above, namely, the problem of trying to identify analysis data for
which sample data were already in the system. This problem was eventually
solved through additional computer programming.
144
-------
An additional problem, and one for which there was no simple solution,
was identified during pre-processing. This was that the laboratories
frequently did not send all analysis results for a particular sample in one
package. Results were often sent in different packages over a period of
weeks. Sometimes we identified missing data, such as specification of a
method blank. Other times, as when a second extraction was run, we had no way
of knowing that the first set of results received did not present a complete
set of results for a sample.
If we identified specific data as missing, and could obtain them readily
over the telephone, we included them in the run. Otherwise, we let the sample
go through, but made notes which we passed on to the verification team.
The staggered-results syndrome slowed down the pre-processing a little,
but it caused no other difficulty at this stage. It was perfectly simple to
accumulate analysis data in the raw master file over a period of time. The
real difficulties with the syndrome were to come at the verification stage.
For example, it was possible to verify a sample without realizing that the
laboratory had not even submitted all results for that sample. The ways in
which these problems were handled are fully discussed in the section on
verification procedures.
Missing sample data also posed problems. Again, if information could be
obtained readily, it was included in the run. Otherwise, notes were made, and
data were added through sample data verification procedures. Data most
commonly missing were comment cards for QC samples, and meteorological data
cards for air samples. Additional sample cards could not be added through the
raw data system. When a sample was first entered into the raw data system,
its ID number was placed in the cross-reference list. Software was
specifically designed to reject any cards with this ID in future runs, to
prevent double entry of sample data. Therefore, complete cards, as well as
data updates, were added through verification procedures.
Raw Data Final Processing and Report
At intervals of about 1 week from October 1, 1980, through the end of
January 1981, and at somewhat less frequent intervals during February and
March, the edited raw data which had been accumulating in the temporary sample
data and analysis data files were merged into the raw data master file.
Reports based on this updated file were issued, and a tape of new raw data was
generated for EPA. These activities are shown as steps 10 to 13, and 15 to 19
on Figure 8.1. These steps are reproduced in Figure 8.5 for convenient
reference.
Final Processing of Raw Data—
The raw data sort/merge program read as input the edited sample data and
analysis data files, sorted them, and wrote a tape (Item 2 of Figure 8.5) in
card-image format which contained the data from these two files. The sample
data were rewritten as the first file on the tape, and the analysis data were
written as the second file. Tapes were sent from GCA's computer site at Oak
Brook, Illinois, via Federal Express to EPA at RTP on the day on which they
were generated. A log of these transmittals was maintained by GCA Data
Management (see Figure 8.6).
The sort/merge program (RAW1) also merged all new sample data into the
temporary verification file (TEMPVER; see Item 4 of Figure 8.5). Finally, the
sort/merge program merged both the sample data and the analysis data into the
raw data master file (Item 3 of Figure 8.5). This file was sorted by Sample
ID, card type, and sequence number. The card types were sorted so that all
sample cards (S1-S6) preceded all analysis cards (A0-A4) for a given sample.
Table 8.4 summarizes the growth of the raw data master file over the course of
the study.
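The sort order just described can be pictured with the small Python sketch below. The record fields are hypothetical, and the original program was a mainframe sort rather than anything like this; only the ordering rule is illustrated.

    # Sketch of the raw master file ordering: Sample ID, then card type with
    # all sample cards (S1-S6) ahead of all analysis cards (A0-A4), then
    # sequence number.
    CARD_ORDER = ["S1", "S2", "S3", "S4", "S5", "S6",
                  "A0", "A1", "A2", "A3", "A4"]

    def card_sort_key(card):
        return (card["sample_id"],
                CARD_ORDER.index(card["card_type"]),
                card["sequence"])

    cards = [{"sample_id": "W21073", "card_type": "A1", "sequence": 2},
             {"sample_id": "W21073", "card_type": "S3", "sequence": 1},
             {"sample_id": "S10455", "card_type": "S1", "sequence": 1}]
    for c in sorted(cards, key=card_sort_key):
        print(c["sample_id"], c["card_type"], c["sequence"])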
Generally, the only alterations made to the raw master file were through
the raw sort/merge program, which made additions to the tape, but no changes
or deletions. For a period of time near the beginning of the study, however,
certain fields of information were added to the sample data in the master file
via a special program. The fields "date to lab" and "analysis" were added to
the S3 cards in the raw master file for the first set of samples, which had
been coded without this information. The S3 cards were matched with the new
data fields by Sample ID number. After the decision was made to add these
fields to the S3 cards, the data were coded directly onto the coding forms,
and the raw master file did not have to be updated for these later samples.
Raw Data Reports—
Running of the raw sort/merge program completed the actual processing of
raw data; the data next passed through the verification process. However,
after each run of the raw sort/merge program, a set of report-generating
programs was run, using the updated raw master file as input. Recipients of
all of these reports were cautioned that the reports were based on raw data
and that, therefore, some inaccuracies should be expected.
Raw Data Summary Program—The Raw Data Summary (RAWSUM) program (Item 5
in Figure 8.5) produced three reports. One of these was the cumulative ID
Number Cross-Reference Report, which listed for each sample in the raw master
file all the cross-reference numbers supplied on the sample cards. The report
contained seven separate reports, each sorted by one of the following
identification numbers:
• Sample ID
• Preparer's ID Lot
• Sample Bank to Field Chain-of-Custody Number
• QC to Sample Bank Chain-of-Custody Number
• Sample Bank to Analytical Lab Chain-of-Custody Number
• Preparer to QC Chain-of-Custody Number.
147
-------
LOVE CANAL TAPE LOG
Tape No.    Date sent    Sent to    Use tape    Contents    Date back
COMMENTS:
Figure 8.6. Love Canal Tape Log.
148
-------
TABLE 8.4. GROWTH OF RAW DATA MASTER FILE
(IN NUMBER OF CARD-IMAGE RECORDS)

Raw data            No. records added     Cumulative size
processing date     to raw master         of raw master
10/01/80                  1,536                 1,536
10/06/80                  2,346                 3,882
10/09/80                  2,905                 6,787
10/16/80                  4,096                10,883
10/24/80                  9,074                19,957
10/31/80                 11,442                31,399
11/07/80                 10,996                42,395
11/12/80                 15,906                58,301
11/20/80                 22,149                80,450
11/28/80                 29,833               110,283
12/06/80                 60,442               170,725
12/12/80                 76,548               247,273
12/19/80                 42,900               290,173
12/27/80                  6,635               296,808
1/03/81                   6,612               303,420
1/11/81                   2,215               305,635
1/22/81                   1,447               307,082
2/18/81                  16,523               323,605
3/06/81                   4,157               327,762
3/10/81                     992               328,754
3/11/81                     960               329,714
3/13/81                     250               329,964
4/30/81                     677               330,641
149
-------
A copy of this report was sent to the Sample Bank for their use in cross-
referencing samples.
RAWSUM also routed the ID Cross-Reference information to an online disk
file (Item 6 in Figure 8.5) which was used for checking the ID Numbers of all
raw data entered into the ALEDIT and SAMEDIT programs.
The other two reports generated by RAWSUM were based on new analysis data
just merged into the raw master file. Each time the program was run, the
current date was preserved by the program. Since all data in the raw master
file were labeled with the date they entered the computer system, the RAWSUM
program was able to compare dates and select for output only those samples
added to the file since the last RAWSUM run. It should be noted here again
that the system expected all analysis data for a particular sample to be
submitted in one run, so that reports on an individual sample would be
generated only once by RAWSUM. Analysis data for individual samples were
often submitted in piece-meal fashion. RAWSUM reported these out as they were
received, so sample reports were frequently incomplete.
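The date comparison used by RAWSUM to pick out newly added samples can be sketched along the following lines. The field names and date handling are hypothetical; only the selection rule (report only records entered since the previous run) is illustrated.

    from datetime import date

    # Sketch: every record in the raw master file carries the date it entered
    # the system; RAWSUM reports only samples added since the previous run.
    def new_samples_since(master_records, last_run_date):
        return [rec for rec in master_records
                if rec["date_entered"] > last_run_date]

    master = [{"sample_id": "W21073", "date_entered": date(1980, 11, 12)},
              {"sample_id": "S10455", "date_entered": date(1980, 10, 24)}]
    print(new_samples_since(master, date(1980, 11, 7)))   # only W21073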
While the RAWSUM program could not detect whether or not analysis data
for a sample were complete, it did check to make sure that sample data were
present for all samples for which analysis data had been added to the raw
master file. As discussed above, selecting analysis data for processing was a
manual operation in the early stages of the study, and sometimes analysis data
were entered prematurely. We depended on RAWSUM to alert us to these
slip-ups. When they occurred, we made appropriate notations by hand on the
reports generated by RAWSUM.
One RAWSUM report based on new analysis data was the Alert Report, which
contained an entry for every pollutant which exceeded the alert level
specified in the pollutant cross-reference file. The Alert Report was
transmitted to GCA and EPA for their use in identifying potentially hazardous
sites. The Alert Report was also used by Data Management as one aid in
identifying potentially erroneous information.
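In outline, building the Alert Report amounts to comparing each reported concentration against the alert level carried in the pollutant cross-reference file, as in the sketch below. The pollutant codes, alert levels, and record layout are invented for illustration.

    # Sketch of Alert Report selection: flag every result whose concentration
    # exceeds the alert level carried in the pollutant cross-reference file.
    ALERT_LEVELS = {"V07": 10.0, "B12": 5.0}   # hypothetical alert levels

    def alert_entries(results):
        alerts = []
        for r in results:
            level = ALERT_LEVELS.get(r["pollutant"])
            if level is not None and r["conc"] > level:
                alerts.append(r)
        return alerts

    results = [{"sample_id": "W21073", "pollutant": "V07", "conc": 42.0},
               {"sample_id": "W21073", "pollutant": "B12", "conc": 0.3}]
    print(alert_entries(results))   # only the V07 entry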
The second RAWSUM report based on new analysis data was the Raw Data
Listing, which was produced in two parts. The first part, which was
transmitted to EPA and GCA personnel, contained most sample data
(chain-of-custody information was not included), and all new analysis data,
along with pollutant names and CAS numbers for each pollutant code. The
second part of the report, which was separated by analytical laboratory, was
sent to the appropriate labs for verification. All analysis data were present
on these reports, but only a limited set of sample data were included.
Information about the sample which might have biased the laboratories in their
review was excluded.
Raw data listings and alert reports were sent to EPA and the
subcontractor laboratories via Federal Express (an exception was nearby ERGO,
for which we used regular mail). Cover letters were included with all
150
-------
packages sent to subcontracting laboratories. The letter which was initially
sent to all laboratories requested them to verify the accompanying data
listings and to return them to GCA via Federal Express. Some laboratories
declined to verify the reports; in subsequent shipments to these labs, we
substituted a cover letter which acknowledged their position on verification,
but urged them to check any items which GCA had flagged on the sheets.
Procedures used by Data Management in reviewing and flagging all raw data
listings before transmitting them to laboratories are described in the section
on verification procedures.
A log of the raw data listings which were sent to laboratories (including
EPA labs) for verification was maintained by the Data Management data clerk.
This log is exhibited in Figure 8.7. It was maintained in loose-leaf format,
with one page for each laboratory. As well as entering the processing date of
the report and the date on which it was sent out from GCA, the clerk entered
the number of pages in each report. This was done so that when the verified
reports were returned by the laboratories, we could check to make sure that we
had received the entire report. As the Figure shows, this same form was also
used to log in the reports when they were received back at GCA from the
laboratories. Information logged at that time included the date received and
the package number assigned to it by the Love Canal Document Control Officer.
Raw Data Quality Control Programs—Two quality control programs (Item 10
of Figure 8.5) were routinely run on the raw data, and both of these included
the entire file in their analysis each time they were run. One program
(LCQC1) analyzed recovery rates of the laboratories' internal QC samples and
of external QC samples, while the other program (LCQC2) compared analysis
results for laboratory-generated duplicates and their mates.
LCQC1 presented its findings on internal QC samples in sections by
medium, where the samples created internally by the laboratories (and labeled
from the reserve "Q" list) were treated as a separate category. Within each
category, findings were separated by type of QC sample (LCS, surrogate, or
spike), and then by analysis method, laboratory, and pollutant. The results
then presented for each pollutant included:
• Pollutant name and code
• Number of samples (for that pollutant, lab method, type of QC sample
and medium)
• Average recovery rate (mean ratio of observed to expected
concentration)
• Standard deviation of recovery rate.
If a sample were processed through more than one analysis method, or if more
than one extraction of the sample was analyzed by the same method, each
analysis was treated as a separate sample by the LCQC1 program. Error
messages were presented by LCQC1 when recovery rates could not be calculated.
This happened when expected concentrations were absent or coded as zero, or
151
-------
LOVE CANAL STUDY
LOG OF RAW DATA REPORTS
SENT TO LABORATORIES
LABORATORY:
DATE SENT    SENT VIA (F.E. / REG.)    PROCESSING DATES    REPORT PAGES    DATE RECEIVED BY GCA    PACKAGE #
Figure 8.7. Love Canal Study log of raw data reports sent to laboratories.
152
-------
when observed concentrations were absent. These error messages enabled GCA
Data Management to detect errors not found by the edit programs. The most
frequent cause of error messages was a coding of zero for both the observed
and the expected concentrations, in instances in which the compounds were not
actually used for QC purposes by the laboratories. GCA Data Management
deleted the entries for these compounds from the data base during verification.
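A compact sketch of the recovery-rate summary and its error handling is given below. The data layout is hypothetical; the statistics follow the description above (recovery as the ratio of observed to expected concentration, with an error message when that ratio cannot be computed).

    from statistics import mean, stdev

    # Sketch of the LCQC1 recovery-rate summary: recovery = observed/expected,
    # with an error message (instead of a rate) when the expected value is
    # missing or zero, or the observed value is missing.
    def recovery_summary(qc_results):
        rates, errors = [], []
        for r in qc_results:
            obs, exp = r.get("observed"), r.get("expected")
            if exp in (None, 0) or obs is None:
                errors.append("cannot compute recovery for %s / %s"
                              % (r["sample_id"], r["pollutant"]))
            else:
                rates.append(obs / exp)
        avg = mean(rates) if rates else None
        sd = stdev(rates) if len(rates) > 1 else None
        return {"n": len(rates), "mean": avg, "sd": sd, "errors": errors}

    qc = [{"sample_id": "Q001", "pollutant": "V07", "observed": 9.0, "expected": 10.0},
          {"sample_id": "Q002", "pollutant": "V07", "observed": 11.0, "expected": 10.0},
          {"sample_id": "Q003", "pollutant": "V07", "observed": 0.0, "expected": 0.0}]
    print(recovery_summary(qc))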
The final section of the LCQC1 report was a summary of recovery rates for
external QC samples. This report was also separated into sections by medium,
and within each medium the findings were presented by analysis method,
laboratory, and pollutant. The same data were presented for external QC
samples as for internal QC samples, and error messages were printed if
recovery rates could not be calculated. (Towards the end of the study, a
program was written to analyze the accuracy of external QC results through an
analysis of differences. This program (QSPECA) used the validated data as
input.)
The LCQC2 report first printed pollutant-by-pollutant results for each
laboratory-generated sample and its mate. The information printed included:
• Laboratory
• Sample ID
• Pollutant Code
• Concentration for original sample
• Concentration for duplicate aliquot (sample ID beginning with Z)
• Difference between the two concentrations.
Entries were made in the report only if a quantified (nonzero) concentration
was reported for at least one member of the pair.
The second part of the LCQC2 report was a statistical analysis of these
intralab duplicates, summarized for each medium by laboratory and pollutant.
Information for each pollutant included:
• Laboratory
• Pollutant name and code
• Number of observations
• Mean difference between observations
• 95 percent probability limits
• Standard deviation divided by the square root of 2.
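The duplicate-pair statistics listed above can be sketched as follows. The pair layout is hypothetical, and the form of the 95 percent probability limits is an assumption (mean difference plus or minus roughly two standard deviations); the report text does not spell out the exact formula LCQC2 used.

    from math import sqrt
    from statistics import mean, stdev

    # Sketch of the LCQC2 duplicate summary for one laboratory/pollutant:
    # differences between original samples and their "Z" duplicate aliquots.
    def duplicate_summary(pairs):
        diffs = [orig - dup for orig, dup in pairs]
        n = len(diffs)
        mean_diff = mean(diffs)
        sd = stdev(diffs) if n > 1 else 0.0
        sd_over_sqrt2 = sd / sqrt(2)
        # Assumed form of the 95 percent probability limits (mean +/- 1.96 s).
        limits = (mean_diff - 1.96 * sd, mean_diff + 1.96 * sd)
        return {"n": n, "mean_diff": mean_diff,
                "limits_95": limits, "sd_over_sqrt2": sd_over_sqrt2}

    pairs = [(1.2, 1.0), (0.8, 0.9), (2.0, 1.7)]   # (original, duplicate)
    print(duplicate_summary(pairs))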
153
-------
Results of the LCQC2 program were reviewed by EPA and GCA Quality Assurance
staff to monitor laboratory performance while the study was ongoing. When
substantial amounts of data had been verified, another program (QSPECZ) was
written to analyze the validated (or verified) data in the same way LCQC2
analyzed the raw data.
Sorted Lists of Sample Data—A variety of sets of selected sample data,
sorted on different fields, were generated by a set of programs called the
"KM" Programs (Item 12 of Figure 8.5). These programs were written in
response to requests from GCA staff (and named for the first person who made a
request), who used them in a variety of ways during the study. Typically, the
reports were used to monitor and review sample data, and the only information
printed about analysis data was the Julian date on which they were entered
into the computer (or this field was left blank if no analysis results had
been received). A summary of the KM Reports and their uses is given in Table
8.5.
Processing of External QC Samples
External QC samples were handled in a special way, since the observed
values were reported by the analytical laboratories, while the true values had
to be added to these data by GCA. The Sample Bank supplied Data Management
with lists of QC samples and their true values. To expedite processing of QC
samples as they were sent in by the laboratories, Data Management prepared
External QC Reporting Forms (card type A0) ahead of time. A sheet was
pre-coded with pollutant codes and true values for each of the QC samples
which was due to arrive at GCA.
The next step was to identify the QC samples when they arrived at GCA, so
that their observed concentration values could be coded on the A0 sheets next
to the pre-coded expected concentrations. Some results were sent from the
laboratories directly to GCA's Quality Control Officer. These results were
copied and the copy passed on to Data Management, where the observed
concentrations were coded on the A0 forms.
The remainder of the QC sample results were submitted to GCA on standard
Analyte Concentration Report Forms (card type A1), with no special
identification. It was impractical to search for QC samples among incoming
data sheets. Instead, we compared our list of QC samples against the reports
produced after each update of the raw data master file. Thus, these QC
samples were identified after they had already been entered into the raw data
base. The observed concentrations were then copied from the raw data listings
onto the A0 sheets with the pre-coded true values.
Whether the QC samples were coded from the raw data listings or from
sheets provided by the Quality Control Officer, all A0 sheets were transmitted
to the Quality Control Officer for review before they were keypunched. This
step was taken to ensure that true values were correctly entered for each
sample and that observed values were reported in the same units as were the
true values. Procedures used during this quality control review are described
in the Quality Control section of this report.
-------
TABLE 8.5. KM REPORTS AND THEIR USES

Report   Samples included       Sort field                          Use
KM1      All                    Stratum, Site ID, Sample ID         Monitor sampling program by site
KM2      All                    Medium, Sample Date, Stratum,       Monitor sampling program by date
                                Site ID
KM3      All                    Medium, Lab Name, Sample Date,      Monitor analysis load of laboratories,
                                Date to Lab                         keep track of accounts with labs
KM4      All                    Medium, Sample ID                   Look up information by Sample ID
KM5      Ground water           Well No., Analysis                  Monitor results for each well
KM6      All                    Sample tag                          Verify sample data
KM7      Ground water           Sample ID                           Look up ground water information by
                                                                    Sample ID
KM8      QC (Q in Col 72,       Medium, Sample ID, Stratum,         Monitor receipt of external QC
         S1 Card)               Site ID                             analysis data
155
-------
A0 sheets were returned to Data Management with the signature of the
Quality Control Officer on each sheet, indicating that the review was
complete. The sheets were then checked by Data Management for any changes or
additions required. Frequently, the QC officer requested the addition of
comments about the sample, for example. Once such adjustments had been made,
the A0 sheets were submitted for keypunching. From that point on, the A0 data
traveled through the raw data processing system along with other analysis
data. For those QC samples which Data Management had identified through raw
data listings, the end result was a set of "A0" data and a set of "A1" data,
with the observed concentrations on the A0 cards being the same as the
observed concentrations on the A1 cards for the corresponding pollutants.
Data coded on A0 sheets appeared on the raw data listings under the heading of
"External QC Results."
A total of 322 external QC samples were processed, of which 284 were
one-part samples, 33 were two-part samples and five were three-part samples.
For an additional 17 samples (13 one-part and 4 two-part samples) which were
on the Sample Bank's list of External QC samples, no results were submitted to
GCA by the analytical laboratories. Table 8.6 gives a breakdown by medium and
type of analysis of the external QC samples for which results were received
and processed. Table 8.7 gives a similar breakdown for samples for which
analysis results were not received.
Several types of problems were associated with the processing of external
QC sample results; the consequences of these problems were usually felt during
the verification process. The first type of problem which occurred was that
results for some samples were submitted both on standard forms to Data
Management and on EPA forms to GCA's Quality Control Officer. In these cases,
A0 sheets were generally coded twice, and the results entered twice into the
raw data system. The verification team had the task of spotting duplicate
entries and deleting one set.
The second type of problem occurred when laboratories changed the
observed concentrations submitted for the QC samples. This happened several
times when "TRACE" values were changed to quantified values. (This is
discussed thoroughly in the Quality Control section of this report.) When
changes were made before the A0 sheets were entered in the raw data master
file, then the verification team still had to change the values already
submitted on A1 cards. When labs made changes after the A0 cards had been
submitted to the computer, the verification team had to change the values on
both the A0 cards and the A1 cards.
A third problem with the processing of external QC samples stemmed from
the fact that A0 and A1 cards for a sample were typically submitted to the
computer on different dates. There was the constant danger that the sample
would be verified before the A0 cards were submitted to the computer. This
did, in fact, happen in several instances, and the A0 data had to be added
through special corrections to the verified data (Step 32 on Figure 8.1).
-------
TABLE 8.6. QC SAMPLES WITH ANALYSIS DATA
Sample
type
TENAX
Foam
Hi-Vol
RAD
TCDD
FL/NT
Vol
Vol-Purge
PCB
Pest/625
Chlor Hydro
Acid + B/N
Phenol + B/N
B/N only
Phenol only
ICP379,
ICP476 and
AA476
AA476 and
AA379
AA476 only
Water Soil/sediment
Soil/
One- Two- Three- Water One- Two- sediment
Air part part part total part part total
76
75
6
.3 355
12 12
13 13
8 8 12 12
5 i 5 5
1 ill
13 13 12 12
52 73 3
8 8 11
5 5 56
2 2
2 211
5 5
4 4 77
8 8
(continued)
Grand
total
76
75
6
8
12
13
20
10
2
25
10
9
11
2
3
5
11
8
157
-------
TABLE 8.6 (continued)
Sample
type
AA379 only
ICP379 only
ICP476 only
Total
One-
Air part
9
6
157 73
Water
Two- Three- Water
part part total
9
6
19 5 97
Soil /sediment
Soil/
One- Two- sediment
part part total
1 1
54 14 68
Grand
total
1
9
6
322
Total one-part samples - 284
Total two-part samples - 33
Total three-part samples - 5
158
-------
TABLE 8.7. QC SAMPLES WITH NO ANALYSIS DATA
Sample
type
TENAX
Foam
Hi-Vol
RAD
TCDD
FL/NT
Vol
Vol-Purge
PCB
pest/625
Chlor Hydro
Acid + B/N
Phenol + B/N
B/N only
Phenol only
ICP379,
ICP476 and
AA476
AA476 and
AA379
AA476
Water Soil /sediment
Soil/
One- Two- Three- Water One- Two- sediment Grand
Air part part part total part part total total
1 1
1 1 1
444
11 1
11 1
11 1
1 1334
1 1 112
112
(continued)
159
-------
TABLE 8.7 (continued)
Sample
type
AA379
ICP379
ICP476
Total
Water Soil/sediment
Soil/
One- Two- Three- Water One- Two- sediment
Air part part part total part part total
1 1
1310 4 93 12
Grand
total
1
17
Total one-part—no results - 13
Total two-part—no results - 4
160
-------
The final problem associated with the processing of the QC samples was
the vulnerability of the system to human error. An automated system could
have ensured that all external QC samples were identified when they were first
entered into the raw data processing system, that no QC samples were verified
without the A0 data, and that the A0 data were identical to the A1 data. The
manual system was instituted because external QC results were already in the
computer when the QC sample list and true values were transmitted to Data
Management, and the potential for difficulty was not fully appreciated at the
outset.
DATA VERIFICATION
Verification of sample data and of analysis data were handled separately,
and verification of sample data was a far simpler process than that of
verifying analysis data. This was true partly because the volume of sample
data was much smaller than the volume of analysis data. But the verification
of analysis data was also more complex because the analytical laboratories
were involved in the process, whereas GCA verified all sample data
internally. The two verification processes are described below.
Verification of Sample Data
Verification of sample data appears as step 14 on Figure 8.1. The steps
related to sample data verification are shown also on Figure 8.8.
The first and most critical step in sample verification was the review of
the data by the Sample Bank. The "KM6" report was produced for this purpose.
This report listed selected data for all samples in the raw master file, and
it was sorted in order of the sample tag number. This enabled the Sample Bank
to proceed systematically through their files of sample tags to verify the
information on the KM6 report. The data fields which the Sample Bank verified
for each tag number were:
• Sample ID
• Stratum
• Site ID
• Sample Collection Date
• QA/QC Status of Sample
• Well Number (for ground water).
Any errors which were detected were written directly on the KM6 report,
and the report was transmitted back to Data Management when the review was
completed.
The first such review was conducted by the Sample Bank at the end of
November, and reviews were conducted on a weekly basis until the end of
December, when all sample data had been entered into the raw master file.
161
-------
-------
When errors were discovered in the raw sample data, no changes were made
to the raw data master file itself. Rather, changes were made to the sample
data residing in the temporary verification file (TEMPVER). Data Management
used the update process available in the PANVALET file management system
to make the required changes to the TEMPVER file (see Item 1 on Figure 8.8).
Changes were accomplished as follows:
• A listing of the TEMPVER file was printed on the computer.
• Errors marked on the KM6 report were transferred to the TEMPVER
listing.
• PANVALET update cards were coded, identifying changes,
additions, and deletions by line number in the TEMPVER file and by
column number in the line (sketched below).
• The PANVALET update cards were keypunched and submitted to the
computer.
• A printout of the updated records was reviewed by Data Management.
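Stripped of PANVALET specifics, the effect of the update cards can be pictured as applying (line number, column range, replacement text) corrections to the card-image records, as in the hypothetical sketch below; the actual update syntax and file handling are not reproduced.

    # Generic sketch of applying line/column corrections to card-image records
    # (the actual updates were made with PANVALET update cards; this only
    # illustrates the idea of correcting by line number and column position).
    def apply_corrections(lines, corrections):
        """corrections: list of (line_no, start_col, end_col, new_text),
        with 1-based line and column numbers as on the coding sheets."""
        fixed = list(lines)
        for line_no, start, end, text in corrections:
            old = fixed[line_no - 1]
            fixed[line_no - 1] = old[:start - 1] + text + old[end:]
        return fixed

    records = ["W21073 SOIL 102180", "S10455 H20  102480"]
    print(apply_corrections(records, [(2, 8, 10, "H2O")]))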
This same update method was used when errors in the raw sample data were
detected through means other than the formal verification review by the Sample
Bank. Errors were discovered at various times by EPA and GCA staff.
A special program, rather than the PANVALET update process, was used
on one occasion for a one-time update of coordinates for all samples in
TEMPVER. This action was taken when a sizeable number of errors was
discovered during verification of the coordinates, at a time when a large
amount of sample data had been entered into the data system. The coordinate
update procedure, which is shown as Item 2 in Figure 8.8, is discussed more
fully in the section above on the Coordinate Reference File.
Verified sample data were retained in TEMPVER until the corresponding
analysis data had also been verified and placed in TEMPVER. The analysis data
verification procedures and the ensuing creation of the verified data file are
described in the following section.
Verification of Analysis Data
Laboratory Review of Raw Data Listing Sorted by Lab—
The Raw Data Listing Sorted by Lab (hereafter referred to as Raw Data
Lists) and the corresponding Alert Reports were two of the listings generated
weekly by GCA Data Management in a program designed to report Love Canal
sampling data to the EPA. Selected sample information and complete analysis
information were listed for every sample on the Raw Data List.
The sample data printed on the Raw Data Lists included the sample ID,
medium, collection date, sample time, source and analysis lab. Other types of
sample information, such as coordinates, depth, and sample location (which
163
-------
were printed on the Complete Raw Data Listings), were not provided on the Raw
Data Lists, so as not to bias the laboratory review. Exhibit 8.1 provides an
example of the sample information included in the Raw Data Lists.
All analysis data submitted to GCA by the laboratories were included in
the Raw Data Lists. The Lists also included GCA's coding of true values for
external QC samples, and GCA's coding of Performance Evaluation results which
were submitted to the Quality Assurance Manager at GCA. In addition, the
Lists included full reports on laboratories for which GCA coded all analysis
data (AES, WSU, EPA at Corvallis, EPA at Ada, Oklahoma, dioxin analysis by EPA
at RTP, and NYS extractions analyzed by PJB Laboratories).
On the Raw Data Lists which were generated during the first seven raw
data runs of the study, the analysis method was used to identify the analysis
results. When it was realized that the method did not provide a complete
identification of results, the analysis date was added as a further
identifier. The date of analysis was included on the Raw Data Lists beginning
with the run of November 12, 1980.
The purpose of the Alert Report, in which all concentrations above a
predetermined level were printed, was two-fold. First, the report was used by
GCA and EPA to identify potentially hazardous sites at Love Canal. The report
was also used by the GCA Data Management Group to identify outliers* in the
form of unusually high concentrations which resulted from transcription or
keypunch errors.
For instance, if the Alert Report identified compounds that had
consistently high concentrations throughout the Raw Data List, Data Management
assumed that the unusually high concentrations were specific to that group of
samples and no action was taken. If a concentration were unusually high in
comparison to other concentrations for that compound, then Data Management
compared that concentration to the concentration on the original coded sheet.
As a result of this comparison, one of two things occurred. If the
concentrations differed, then the unusually high concentration was attributed
to a keypunch error. The Raw Data List was corrected, and the sample ID for
which there was an incorrect concentration was listed in the verification
problem book** to be corrected by the GCA verification coordinator during the
verification process. If the concentration on the Raw Data List was the same
as that on the original code sheet, then the concentration was flagged in red
on the Raw Data List. This indicated to the laboratory that they should
scrutinize that sample ID. In addition to marking potential errors identified
through the Alert Report, the verification coordinator flagged other potential
problem data and missing data.
*Outliers which were in the form of unusually low concentrations could not be
identified through the Alert Report.
**The verification problem book was a record, listed by lab, of all samples for
which there were data processing errors which were to be corrected during
verification.
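The outlier-screening logic described above can be sketched as a small decision rule. The data layout and the numeric test for "consistently high" concentrations are assumptions made for illustration; the sketch only mirrors the decision rules stated in the text, not the actual procedure followed by Data Management.

    # Sketch of the Alert Report screening rules: if a compound is high across
    # the whole Raw Data List, take no action; if a single value is unusually
    # high, compare it with the original coded sheet and either correct a
    # keypunch error or flag the value in red for laboratory review.
    def screen_alert_entry(alert_conc, other_concs, coded_sheet_conc):
        typical = max(other_concs) if other_concs else 0.0
        if other_concs and alert_conc <= 10 * typical:   # assumed "consistent" rule
            return "no action"
        if alert_conc != coded_sheet_conc:
            return "keypunch error: correct list, log in verification problem book"
        return "flag in red for laboratory scrutiny"

    print(screen_alert_entry(5000.0, [4800.0, 5100.0], 5000.0))   # no action
    print(screen_alert_entry(5000.0, [3.0, 4.5], 50.0))           # keypunch error
    print(screen_alert_entry(5000.0, [3.0, 4.5], 5000.0))         # flag in red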
164
-------
After the Raw Data Lists were reviewed by the verification coordinator,
they were logged into the Love Canal Verification Form Log (illustrated in
Exhibit 8.2) by the data clerk and sent to the laboratories via Federal
Express. Information recorded in the log book included the date that the raw
data listing was sent out and the number of pages in the report.
The Raw Data Lists were sent out to all the labs for their review.
However, not all labs reviewed or verified their Raw Data Lists. The GCA Data
Manager requested, through correspondence (see Exhibit 8.3), that the
verifying laboratories compare the Raw Data Lists item-by-item with the
information originally used to prepare the coding sheets. If the information
were correct, then a check mark was to be placed beside the entire sample. If
the information were incorrect, then a line was to be drawn through the
incorrect information, and the correct information was to be written on the
Raw Data List.
Raw Data Lists were also sent to the laboratories that were not
verifying; accompanying these lists was a request that they review any data
items that had been marked in red ink (see Exhibit 8.4). The laboratories
were also urged to indicate any additional corrections to the data on the Raw
Data Lists. Gulf South Research Institute at Lafayette was the only
nonverifying lab that responded to this request.
; i
The following list specifies the subcontractor laboratories which
performed analysis on Love Canal samples; laboratories which did not verify
their Raw Data Lists are marked with an asterisk:
• Acurex Corporation
• Advanced Environmental Systems
• Battelle Columbus Laboratories*
• Compuchem/Mead Technology Laboratories*
• Energy Resources Company, Inc.
• Gulf South Research Institute at Lafayette
• Gulf South Research Institute at New Orleans*
• ITT Research Institute
• Midwest Research Institute*
• PEDCo Environmental, Inc.*
• PJB Laboratories
166
-------
VERIFIED DATA FORM LOG
Date Sent Out    Pages    Date Revd.    Pkg. No.    Date To Coder    Comment
Exhibit 8.2. Page from Love Canal verification form log.
167
-------
GCA Corporation
Technology Division
TO: ANALYTICAL LABORATORIES
The Quality Assurance Program for the Love Canal Monitoring Program
requires active verification of all information by the originating party.
For analytical data, this involves review of a raw data report of all
results submitted by the analytical laboratory. The goal of this task is
assurance that the data in the system accurately reflect the results of
the laboratory analysis.
The procedure to be used in verifying results is to compare the raw
data report item-by-item with the information originally used to prepare
the coding sheets. If all information is correct, then place a check mark
(✓) next to the entire sample. Draw a line through incorrect information
and write the corrected value on the raw data report.
GCA will make notations in the form of written comments or arrows
pointing to certain observations. These values should be more diligently
checked as they appear to be anomalous. Air samples will be reviewed by
checking the column marked "REPORTED CONC".
This information has already been edited and verified several times;
however, each laboratory is ultimately responsible for the accuracy of
results which they have provided. To maintain the tight time schedule
required by EPA, please return the verified raw data report to Diana Timlin
at GCA using Federal Express within 48 hours of receipt of the data.
Mary C. Havelock
Exhibit 8.3. Letter to verifying laboratories.
168
-------
Technology Division
I am enclosing a report on the raw data from your laboratory which
we most recently processed for the Love Canal Study. I understand
that you feel verification of these data is beyond the scope of your
contract with GCA. Nevertheless, you may want to review this and
subsequent raw data reports for your own information, and we would
appreciate your reviewing any data items that we have marked in red
ink.
If you should want to make any changes in the data you have submitted
to GCA, please indicate these changes on the raw data reports. Please
return the reports to Diana Timlin at GCA as soon as possible, even if
you do not wish to make any changes, so that we may complete the
verification process for these samples.
Sincerely,
Mary C. Havelock
MCH/dlt
Exhibit 8.4. Letter to non-verifying laboratories.
169
-------
• Southwest Research Institute
• TRW, Inc.
• Wright State University/Brehm Labs
Five EPA Laboratories performed analyses on Love Canal samples:
• EMSC/EPA Cincinnati, Ohio
• EPA/Corvallis, Oregon
• EPA/Ada, Oklahoma
• EPA/EMSL/RTP
• EPA/HERL/RTP
All these EPA laboratories verified their analysis results either in
writing or by telephone. (The labs at Ada, Oklahoma, and at Corvallis,
Oregon, verified their results by telephone at the end of the verification
process.)
GCA Data Management requested all the laboratories to return their Raw
Data Lists via Federal Express within 48 hours of receipt of the data. The
verifying laboratories which did not return their Raw Data Lists within 1 week
were contacted by telephone to determine when the Lists would be returned. To
expedite the verification process during the period of heaviest volume of
verification (the week of January 4-11, 1981, when 8244 samples were
verified), some of the Raw Data Lists were verified by telephone and later
returned to GCA via Federal Express.
GCA Review of Raw Data Lists—
The verifying laboratories returned the Raw Data Lists to the Document
Control Officer at GCA via Federal Express. At GCA, the Document Control
Officer assigned package numbers to each raw data listing, as well as to all
other incoming documents related to the Love Canal Project. These packages
were logged into the Love Canal Incoming Log prior to their distribution to
the Data Management data clerk.
Before routing a package to the Verification Coordinator, the Data Clerk
first used the Love Canal Verification Form Log (see Exhibit 8.2) to record
the package number, the date the package was received, and the date it was
given to the Verification Coordinator. The Verification Coordinator, referred
to as the "coder" in the Love Canal Verification Form Log, then logged the
package number onto the appropriate page (designated by laboratory) in the
Love Canal Study Log of Raw Data Verification (refer to Exhibit 8.5).
After the log-in process, the Raw Data Lists that had been verified by
the laboratories were reviewed by the Verification Coordinator and her
assistants. Changes, additions and deletions indicated by the verifying
laboratories were coded by the verification team onto the appropriate
170
-------
LOVE CANAL STUDY LOG
OF RAW DATA REPORT VERIFICATION
LABORATORY:
PKG. NO.    CODER    DATA CODED    DATA KEYPUNCHED    TO DATA PROCESSING    COMMENTS
Name provided when many people working on same lab.
Exhibit 8.5. Log sheet used by Verification Coordinator during
raw data list review.
171
-------
Verification Action Forms. In addition to reviewing the raw data lists for
corrections indicated by the verifying laboratories, the verification team
reviewed the raw data lists for accuracy and completeness. If the raw data
lists appeared complete, and no additional problems were identified, the coded
verification forms were logged into the Love Canal Log of Raw Data
Verification (see Exhibit 8.5) and routed to keypunch. Problem samples, which
were not resolved until after the major verification effort in mid-January,
were listed on the front of the Raw Data Lists.
After being returned to the verification coordinator, the coded
verification forms, along with their keypunched cards and their corresponding
Raw Data Lists, were logged into the Love Canal Study Log of Raw Data
Verification as having been keypunched and key-verified.
The review of Raw Data Lists from laboratories that were not verifying
was similar to the review of Raw Data Lists from labs that were verifying,
except for two differences. First, laboratories that were not verifying did
not always return their Raw Data Lists. In these cases, the GCA copy of the
Raw Data List was reviewed. This list was logged into the Love Canal Study
Log of Raw Data Report Verification by the Verification Coordinator, as was
done with Raw Data Lists verified by labs. In cases in which nonverifying
laboratories did return their Raw Data Lists, these lists were entered into
the verification log books and used for the verification review.
The second difference between review of verified data listings and the
review of unverified data listings was that unverified listings were
scrutinized particularly carefully to ensure accuracy and completeness. This
procedure turned up many questions and problems, and telephone calls were made
more frequently to the nonverifying laboratories than to the verifying
laboratories. Otherwise, the unverified data listings were treated similarly
to the verified data listings in terms of coding and computer processing.
While more problems were identified on unverified Raw Data Lists than on
verified Raw Data Lists during the verification review, the types of problems
encountered among each group were similar. The problems identified included:
• Missing information, such as method blanks*, CAS numbers (on A3
cards), analysis card ID block information, and expected
concentrations for internal QC information;
• Incorrect labeling of internal QC information and analyte results;
• Incorrect sample ID numbers;
• Incomplete data for samples;
• Duplicate data for samples.
*EPA laboratories were not required to submit method blanks with each sample,
whereas the subcontractor laboratories were required to do so.
172
-------
Identifying and correcting these types of problems required particular
vigilance on the part of the verification team. A further set of problems
required systematic editing by GCA, some of which was done manually, and some
of which was done by computer programs. Finally, a special editing problem
occurred as a result of GCA's having provided two ambiguous method codes: all
samples coded with methods 625DW and 625BC had to be changed.
For all of the problems except those of ambiguous method codes, the
unavailable CAS numbers, and the specific problems that were corrected through
editing, solutions were sought by referring to the original coded sheets
(j-forms), KM4 Report, Sample Bank Log Books, and other summary listings
prepared by Data Management. If these references failed to help solve the
problem, the laboratory was telephoned. The laboratories submitted the
information to GCA either by telephone or by sending the information via
Federal Express. The corrections were coded on verification forms and
processed in the fashion described above.
The problem involving method codes 625DW and 625BC was handled in a
special way. Three labs, PJB/Jacobs Engineering Group, Inc., Gulf South
Research Institute at Baton Rouge, and Acurex Corporation, were affected by
the method code problem. The 625DW method code problems encountered by PJB
and Gulf South Research Institute were corrected by the VERED Program during
verification data processing. This will be discussed at length in the section
entitled Data Processing of Verified Data.
The verification team identified the problematic method code 625BC as it
was used by Acurex, and corrected it during the verification coding stage.
For this lab, method code 625BC was changed to 625BS when code 625CS was
already used for that sample, and to 625CS when it was not already used for
that sample.
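As described above, the Acurex correction rule reduces to a small conditional, sketched here with a hypothetical record structure and invented method codes other than 625BC, 625BS, and 625CS.

    # Sketch of the 625BC correction for Acurex data: within one sample, 625BC
    # becomes 625BS if method 625CS is already used for that sample, otherwise
    # it becomes 625CS.
    def fix_625bc(sample_methods):
        """sample_methods: list of method codes reported for one sample."""
        has_625cs = "625CS" in sample_methods
        return ["625BS" if m == "625BC" and has_625cs
                else "625CS" if m == "625BC"
                else m
                for m in sample_methods]

    print(fix_625bc(["625CS", "625BC"]))   # ['625CS', '625BS']
    print(fix_625bc(["625BC", "379AA"]))   # ['625CS', '379AA']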
A related problem occurred for data analyzed by Southwest Research
Institute using the 625 method. In this case the problem was not that SWRI
was using an invalid method code, but that it was using a valid method code,
625BS, improperly. Method Code 625BS was being used for samples having double
listings for acid and base neutral fractions. When this occurred, the
verification team combined the double-listed acid fraction and double-listed
base neutral fraction, and changed the method code from 625BS to 625CS.
Missing CAS numbers were another problem that was given special attention
by GCA. With assistance from GCA's laboratory personnel, a search was made to
identify CAS numbers for compounds for which the numbers had not been provided
by the laboratories. In most cases this lengthy search proved successful. In
other cases no CAS numbers could be identified, usually because a specific
isomer of the compound had not been identified by the laboratory. In these
cases the verification team entered an abbreviation of the compound name in
the CAS number field, or, if this was impractical, they assigned a special
number to the compound and entered this number in the CAS field. This number
173
-------
(typically the Sample ID number with a hyphenated numerical suffix)* was then
referenced and explained in a comment card which the verification team added
to the sample. In this way, compounds were reported which did not have NBS
assigned CAS numbers. The CAS Number problem was treated similarly for both
the verified and unverified data.
As mentioned above, specific problems were identified during the
verification program which required simple editing. That is, the problem and
its solution were known and the problem was corrected either by making a
deletion or by combining two concentrations, or both. The specific problems
included inappropriate reporting of acid and base neutral extractions by
Southwest Research Institute, Gulf South Research Institute, and Acurex
Corporation; inappropriate reporting of pollutant code H01 (hexafluorobenzene)
by Battelle Columbus Laboratories; coding of pollutant code V13 (this
pollutant was eliminated by EPA after its initial inclusion in the study); and
the reporting of "0" for expected concentrations in the Internal QC Reports.
The first edit was made to semivolatile samples analyzed by Gulf South
Research Institute, Acurex Corporation and Southwest Research Institute.
These labs analyzed the samples by extracting both the acid and base neutral
fraction for method codes 625BW or 625BS. In doing so, Gulf South Research
Institute and Acurex Corporation reported both the acid and base neutral
fraction results for each pollutant in the Internal QC Report analyzed by
method 625BW. Southwest Research Institute reported the acid and base neutral
fraction results for all compounds on the Analyte Concentration Form as well
as for those on the Internal QC Report, for all samples analyzed by method
625BS.
The inappropriate reporting of semivolatile analysis data was rectified
by the verification team during the review of the Raw Data Lists. The Acurex
samples analyzed by method 625BW were corrected by creating a new card which
combined the observed concentrations of the acid and base neutral fractions,
while keeping the expected concentrations the same (i.e., the expected
concentrations were not combined). The cards representing the individual acid
and base neutral fractions were deleted.
The editing for Gulf South Research Institute was different from the
editing for Acurex. The Gulf South Research Institute Internal QC Report
included pollutant codes S32-S37. Pollutant codes S32, S33 and S34 were
combined in the same manner that the Acurex samples were combined. However,
pollutant codes S35, S36 and S37, which represented internal standards
specifically designated as acid or base neutral fractions, were not combined.
That is, the fractions were analyzed separately and the internal standards
were added to each fraction individually. Therefore, they were listed
separately.
*For instance, W21073-001, W21073-002, W21073-003, etc.
174
-------
As mentioned above, Southwest Research Institute listed the acid and base
neutral fractions separately on the Analyte Concentration Form and on the
Internal QC Report. The Internal QC Report data were corrected similarly to
those of Acurex. That is, the observed concentrations were combined and the
expected concentrations remained the same. The cards representing the
original acid and base neutral fractions were deleted. Analyte Concentration
Reports were corrected in a different manner. The correct way of reporting
analyte concentration data was to list the acid fraction analysis data for
pollutant codes A01-A14, and the base neutral fraction results for pollutant
codes B1-B54 and B56. Instead of reporting the analysis data in this way,
Southwest Research Institute reported both the acid and base neutral fractions
for each pollutant. To rectify this situation, the acid fractions for
pollutant codes B1-B54 and B56 and the base neutral fractions for pollutant
codes A01-A14 were deleted. The verification team determined the correct
fractions for each pollutant code by referring to the original code sheets.
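The cleanup of the Analyte Concentration Reports just described can be pictured as the filter sketched below. The row layout and fraction labels are hypothetical; only the rule of keeping acid-fraction rows for acid codes (A01-A14) and base/neutral rows for base/neutral codes (B1-B54, B56) is shown.

    # Sketch of the SWRI analyte-report cleanup: keep acid-fraction rows only
    # for acid pollutant codes and base/neutral rows only for base/neutral
    # codes; the mismatched duplicates are dropped.
    def keep_row(row):
        code, fraction = row["pollutant"], row["fraction"]
        if code.startswith("A"):
            return fraction == "acid"
        if code.startswith("B"):
            return fraction == "base/neutral"
        return True

    rows = [{"pollutant": "A03", "fraction": "acid", "conc": 1.0},
            {"pollutant": "A03", "fraction": "base/neutral", "conc": 0.9},
            {"pollutant": "B12", "fraction": "acid", "conc": 2.0},
            {"pollutant": "B12", "fraction": "base/neutral", "conc": 2.1}]
    print([r for r in rows if keep_row(r)])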
Simple deletions were another type of editing performed during the
verification program. For instance, the verification team deleted reports on
pollutant code V13. This pollutant, known as 1,2-dichloroethene, had two
isomers, cis-1,2-dichloroethene and trans-1,2-dichloroethene, which were being
reported as V07 and V09. The reporting of V13 constituted a duplication and
consequently V13 was deemed invalid. (Unfortunately, some samples were
verified before this duplication was realized. Therefore, the verified data
base includes some samples for which pollutant code V13 was not deleted.)
Pollutant code H01 constituted another simple deletion. Battelle
Columbus Laboratories reported hexafluorobenzene, or pollutant code H01, as an
area count rather than as a concentration for all samples. While their
reporting method was a valid method, it was not the method eventually chosen
for the Love Canal Study. As a result of this decision, midway through the
project, all of Battelle Columbus Laboratories' reports of the H01 compound
were deleted from the data base.
Finally, the verification team was instructed to delete, from the
internal QC data, every pollutant that had a "0" for an expected
concentration. This task was performed because only the pollutants that had
an expected concentration greater than "0" were actually used as internal QC
compounds.
There were three groups of data which were not necessarily problematic,
but which were handled differently from other data during the verification
process. The three groups were GCA samples, Tenax internal QC samples, and
external QC samples. The GCA Raw Data Lists were verified by Data Management
by comparing the sample and analysis data to the GCA laboratory records.
Corrections and processing were then handled in the standard manner.
The majority of Tenax internal QC information was coded by the
laboratories. In a few instances, however, the laboratories supplied only the
observed concentrations, and in these cases the verification team supplied the
175
-------
expected concentrations. These true values* were coded from a list provided
by Research Triangle Institute both to GCA and to the two laboratories that
were participating in the Tenax analysis: PEDCo Environmental, Inc. and
Battelle Columbus Laboratories.
The review of Tenax internal QC samples during verification was different
from the norm. During GCA verification review, every true value for each
sample designated as a Tenax internal QC sample was compared to the list
provided by Research Triangle Institute. If incorrect true values appeared,
they were corrected by using the verification action form and then processed
similarly to the other samples.
The other group of QC samples that were handled differently during
verification were the external QC check samples. In this case, each external
QC sample was reviewed by the verification team to make certain that the true
values had been coded, that the observed concentrations on the external QC
cards (A0) matched the observed concentrations for the same pollutants on the
analyte concentration cards (A1), and that external QC information was not
present in duplicate. These measures were necessary because of the way the
external QC data were submitted to GCA and handled during the raw data
processing.
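The checks made on each external QC sample can be sketched as follows; the card representation and field names are hypothetical, and the sketch covers only the three conditions listed above.

    # Sketch of the external QC review: true values must be coded on the A0
    # cards, observed A0 concentrations must match the A1 concentrations for
    # the same pollutants, and no pollutant may appear in duplicate.
    def check_external_qc(a0_cards, a1_cards):
        problems = []
        a1_by_pollutant = {c["pollutant"]: c["observed"] for c in a1_cards}
        seen = set()
        for c in a0_cards:
            p = c["pollutant"]
            if p in seen:
                problems.append("duplicate A0 entry for %s" % p)
            seen.add(p)
            if c.get("true_value") is None:
                problems.append("missing true value for %s" % p)
            if p in a1_by_pollutant and c["observed"] != a1_by_pollutant[p]:
                problems.append("A0/A1 mismatch for %s" % p)
        return problems

    a0 = [{"pollutant": "V07", "observed": 9.0, "true_value": 10.0}]
    a1 = [{"pollutant": "V07", "observed": 9.5}]
    print(check_external_qc(a0, a1))   # ['A0/A1 mismatch for V07']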
As was described earlier, some external QC data were submitted on
standard reporting forms, some were submitted in nonstandard format, and some
were submitted in both forms, causing duplicate (but not necessarily
identical) data. If two sets of data were submitted, one set had to be
deleted during verification data processing. When there were discrepancies
between the two data sets, the correct set was identified by consulting with
GCA's QC Officer, by contacting the laboratory, or both.
Other problems arose when laboratories submitted revised measurements for
the external QC samples. In these instances, the data submitted on analyte
concentration forms (A1), as well as the data coded on the external QC
reporting forms (A0), had to be corrected. If the sample had not been
submitted to raw data processing, then the corrections were made at this
point. If the samples were in the raw data file but had not been verified,
then the correction was made during the verification process. If the sample
had been verified, then the corrections were made at a later date. Whichever
was the case, handling of the external QC data tended to be cumbersome.
Verification Coding—
As mentioned in the section of this report on Love Canal Coding Forms,
the format of the Verification Action Forms was similar to that of the
Analysis Data Reporting Forms. Table 8.8 gives further information about the
Verification Action Forms. Copies of each form can be found in Appendix F.
*The true values were referred to as the expected concentrations on the coding
sheets.
176
-------
TABLE 8.8. VERIFICATION ACTION FORMS
-------
The Verification Action Form used by the Verification Team to make
corrections was dependent upon the type of data that was being corrected. The
Verification Data Action Forms corresponded directly to the analysis cards so
as to facilitate any additions, deletions or changes that may have been
necessary to attain the goal of data accuracy and completeness.
The sample ID number, which could not be changed during verification, was
the only piece of information from the ID block (columns 1 through 32 of each
"A" card) that was necessary to code when corrections were being made to the
analysis data. The VERAP1 Program, which will be discussed in the section of
the report entitled Data Processing of Verified Data, copied the ID block
information from the analysis data already present in the system. This
facilitated the verification coding process.
An addition or change to the ID block was achieved by listing the sample
ID, the change or addition (action code C or A), a V in column 76 and nothing
in column 77. That is, a change or addition to the medium, method, Lab ID,
date, time or sample size of the analysis data for an entire sample was
achieved by using the format shown in Exhibit 8.6.
As is seen from looking at Exhibit 8.6, an altered J51, 52, 54 or 55
Verification Action form was used to code the additions and changes made to
the analysis card ID block. Since this occurred for only a small proportion
of samples, a special form was not created. By not specifying the card type
number found in column 77, global changes or additions were applied to all of
the analysis cards for the sample. For instance, the medium in the first
example in the figure was missing for all the analysis data. The above format
was utilized to correct the situation. The Action was "C" for "change"
instead of "A" for "add" because a field was being changed; the entire line
was not being added. In example two, an incorrect Lab ID was being changed to
the correct one. It is necessary to note that the method, date, time and
sample size for samples with more than one method were not changed or added as
an ID Block correction unless the change or addition applied to all the
methods. One or more additions or changes to the ID Block were allowed.
Deletions were not possible as ID Block corrections.
A change action code was used on J51, 52, 53 and 55 forms when
information was being added to an existing line (or card) or when a field on
that line was being changed. In the first case, the sample ID, pollutant
code, new data and action code "C" were coded on the Verification Action
Form. The latter case entailed the coding of the sample ID, pollutant code or
CAS Number, the new data, and action code "C". The only way to change the
comment cards was through the use of action codes "D" for "delete" and "A" for
"add". That is, if a change was necessary, all existing comment cards had to
be deleted, and then all comments appropriate for the sample were added.
A "D" for "delete" was used when an entire line was to be removed. The
coding requirements for deletions were sample ID, pollutant code or CAS
number, and action code.
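These action codes lend themselves to a simple update loop. The following sketch, written in Python purely for illustration (the original programs were card-oriented batch programs, and the record and field names here are hypothetical), shows how "A", "C" and "D" actions might be applied to a set of analysis lines keyed by sample ID and pollutant code.

    def apply_actions(lines, actions):
        """Apply verification actions to analysis lines.

        lines   -- list of dicts, each with 'sample_id', 'pollutant' and data fields
        actions -- list of dicts with 'sample_id', 'pollutant', 'action' and new data
        (field names are hypothetical; the real system worked on 80-column cards)"""
        for act in actions:
            key = (act['sample_id'], act['pollutant'])
            if act['action'] == 'A':                     # add a new line
                lines.append({k: v for k, v in act.items() if k != 'action'})
            elif act['action'] == 'D':                   # delete an existing line
                lines[:] = [ln for ln in lines
                            if (ln['sample_id'], ln['pollutant']) != key]
            elif act['action'] == 'C':                   # change fields on an existing line
                for ln in lines:
                    if (ln['sample_id'], ln['pollutant']) == key:
                        ln.update({k: v for k, v in act.items()
                                   if k not in ('action', 'sample_id', 'pollutant')})
        return lines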
178
-------
Exhibit 8.6. Examples (Example 1 and Example 2) of coding ID Block changes and additions on a Verification Action Form; only the ID Block (columns 1 through 32) and columns 72 through 80 are shown, the intervening columns being irrelevant for the tasks being discussed.
179
-------
The action "A" for "add" was used when a new line was to be entered. The
requirements for additions included the coding of the sample ID, pollutant
code or CAS number, action code, and specific fields on forms J51, 52, 53 and
55. On the J51 forms, either the concentration field had to be coded, or one
of the three fields, "not detected", "trace", or "qualitative" had to be
checked. On the J52 form, the observed and expected concentrations, as well
as one of the internal QC descriptions, had to be coded. The judgment and
match scores on the J53 form had to be coded. It was necessary to code the
observed and expected concentrations on the J55 form. If these fields were
not coded, the VERED program (to be discussed in the following sections) would
identify and flag those samples as being inaccurate.
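The requirement that specific fields be coded for each form type can be expressed as a small table of required fields. The Python fragment below is illustrative only; the form numbers come from the preceding paragraph, but the field names and the dictionary representation of a card are assumptions made for the example, not the actual VERED implementation.

    # Required fields for an "add" action, by verification form (per the text above).
    # Each inner tuple lists alternatives, any one of which satisfies the requirement.
    REQUIRED = {
        'J51': [('concentration', 'not_detected', 'trace', 'qualitative')],
        'J52': [('observed',), ('expected',), ('qc_type',)],
        'J53': [('judgment',), ('match_score',)],
        'J55': [('observed',), ('expected',)],
    }

    def flag_incomplete(card):
        """Return a list of messages for missing required fields on an add action."""
        problems = []
        for group in REQUIRED.get(card['form'], []):
            if not any(card.get(field) not in (None, '') for field in group):
                problems.append('missing one of: ' + '/'.join(group))
        return problems

    # Example: a J51 add with neither a concentration nor a checked qualifier is flagged.
    print(flag_incomplete({'form': 'J51', 'concentration': None}))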
Analysis data were added to a sample during verification only when other
analysis data for that sample had been entered into the raw data base through
the raw data process. That is, no type of analysis data could be added to the
system during verification unless other analysis data for that sample were
already present in the system. If no analysis data had been entered into the
raw data base through raw data processing, the sample was deleted from the
verification process until the data were processed through the raw data system.
Verification Data Processing—
Verification computer processing was the next step after verification
review and keypunch. This process was scheduled to start at the beginning of
November, when the review of Raw Data Lists began. Due to unforeseen
circumstances, however, verification data processing did not begin until
December 20, 1980, when the first verified data tape was sent to EPA.
Nevertheless, most of the samples
were verified by mid-January. Table 8.9 presents a complete timetable of
verification data processing.
The delay in verification data processing from the beginning of November
to December 20 was due primarily to the laboratories' submitting incomplete
analysis information to Data Management. For instance, Analyte Report Forms
were frequently submitted without the corresponding method blank and/or
Internal QC Report. A more ambiguous data flow problem occurred when the
results of analyses by different methods were submitted by the laboratories to
GCA on different dates. Data Management had no mechanism for determining
which samples would receive multiple shipments of results. The lag time
between the receipt of the first pieces of data for a sample and receipt of
the data that would make the sample complete was up to 6 weeks. Therefore,
the verification data process was delayed until it appeared that most of the
samples in the data base were complete. Any samples which appeared to be
incomplete were held out from verification computer processing until they
appeared complete.
The keypunched verification cards and Raw Data Lists for samples which
appeared to be complete were submitted to the Verification Processor for
computer processing. Before being routed to the Verification Processor,
180
-------
TABLE 8.9. TIMETABLE OF VERIFICATION DATA PROCESSING

Date of           No. of samples    Cumulative No. of
verification      verified          samples verified
12-20-80              772                 772
12-27-80              775               1,547
 1-2-81             1,054               2,601
 1-8-81             1,577               4,178
 1-9-81               859               5,037
 1-10-81              672               5,709
 1-11-81            1,080               6,789
 1-12-81              715               7,504
 1-14-81              191               7,695
 1-15-81              240               7,935
 1-19-81               70               8,005
 1-21-81               67               8,072
 1-23-81              172               8,244
 1-29-81              141               8,385
 2-9-81               429               8,814
 2-25-81              494               9,308
 3-11-81              547               9,855
 3-11-81               73               9,928
 3-16-81              161              10,089
181
-------
though, the Raw Data Lists and corresponding verification forms and cards were
checked against the problem book* and were then logged into the Love Canal
Study General Log of Data Processing of Verified Samples (see Exhibit 8.7). A
batch number (starting with 1 and ending with 49) was assigned to every group
of Raw Data Lists forwarded to the Verification Processor. The Verification
Coordinator recorded the groups of Raw Data Lists from batch 1 to batch 25.
The batches after batch 25 were small and consisted of individual samples that
had been identified as problematic during data processing. The
Verification Processor recorded the batch numbers, but did not record the
individual problem sample IDs.
Once the Raw Data Lists were logged in by the Verification Processor, the
verification computer processing began. The verification process was a series
of computer programs designed to apply any changes, deletions or additions
identified during the verification review and to generate verified data lists
which were the updated versions of the sample and analysis information. A
corresponding Alert Report was also printed and a tape of verified data was
generated for EPA. In addition, a cumulative file of all processed verified
data was created for GCA. Four major computer programs, VERED, VERAP1,
VERAP2, and VERFILE were used to accomplish this task. Figure 8.9 shows in
detail the portion of Figure 8.1 which pertains to the verification of
analysis data and the generation of verified data files and reports.
The first step in verification computer processing was to run the
verification action cards through the VERED program. VERED was a three-part
program designed to edit the cards. The first part of the program listed the
actions being performed. The second section reviewed these actions, performed
editing on the cards, and printed error messages if any problems were
identified. The third section of the program generated a printout of the
card images.
The VERED program checked the verification action cards for the validity
and presence of various parameters. This program flagged invalid or missing
parameters. Table 8.10 summarizes the parameters checked by the VERED
program. The VERED program also performed editing; invalid
pollutant codes (B55 and B57 through B62) were deleted. A complete list of
checks and edits performed by the VERED program is provided in Appendix D-4.
The verification processor reviewed the VERED printout and compared the
verification action forms to the card image section. If any errors were
identified, new cards were keypunched and all the cards were run through the
program again, repeating the process of review and editing. This process was
repeated as many times as necessary to make the cards correct. If the cards
appeared to be correct, the VERAP1 program was run.
*The verification problem book contained notes on all samples for which any
corrections were necessary. If the corrections were easily accomplished, then
the corrections were completed and the sample was submitted to verification
data processing through the appropriate channels.
182
-------
LOVE CANAL STUDY GENERAL LOG OF DATA PROCESSING OF VERIFIED SAMPLES

LABORATORY NAME    PACKAGE NUMBER    BATCH NUMBER    TAPE    COMMENTS
Exhibit 8.7. Verification data processing log kept by
Verification Coordinator.
183
-------
Figure 8.9. Detail of the portion of Figure 8.1 pertaining to the verification of analysis data and the generation of verified data files and reports.
-------
TABLE 8.10. PARAMETERS OF VERED PROGRAM REVIEW

                                           Verification action forms
                                         J51   J52   J53   J54   J55
Presence and Validity of Parameters:
  Sample ID number                        X     X     X     X     X
  Medium                                  X     X     X     X     X
  Method codes                            X     X     X     X     X
  Laboratory ID                           X     X     X     X     X
  Action command                          X     X     X     X     X
  Card type                               X     X     X     X     X
  Pollutant code                          X     X                 X
Presence of Parameters:
  Concentration or not detected/
    trace/qualitative                     X
  Observed concentration                        X                 X
  Expected concentration                        X                 X
  Type of QC sample                             X
  CAS Number                                          X
  Match score                                         X
  Judgment                                            X
185
-------
The VERAP1 program searched through the master raw data file for the
analysis cards of samples being verified. Without changing the raw file, it
copied these data into a temporary file named "LASTVER." It then applied to
this data any changes, deletions, or additions specified on the verification
action forms. The VERAP1 program also did some editing; the problem of method
code 625DW, mentioned in the previous section, was solved by this program, as
shown in Table 8.11.
TABLE 8.11. CONVERSION OF METHOD CODE 625DW

               Fields as coded                       Fields changed to:
Laboratory    Method    Column 1                  Method    Column 1
                        (sample ID prefix)                  (sample ID prefix)
PJB           625DW     W                         625BW     W
PJB           625DW     Z                         625CW     W
PJB           625DW     Q                         625CW     Q
GSLA          625DW     ANY                       625CW     No change
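Table 8.11 can be read as a small rule table keyed on laboratory and the first character of the sample ID. The Python fragment below sketches that logic for illustration; it is not the VERAP1 code itself.

    # (laboratory, sample-ID prefix) -> (new method, new prefix), per Table 8.11
    RULES_625DW = {
        ('PJB', 'W'): ('625BW', 'W'),
        ('PJB', 'Z'): ('625CW', 'W'),
        ('PJB', 'Q'): ('625CW', 'Q'),
    }

    def convert_625dw(lab, method, sample_id):
        """Apply the Table 8.11 conversion; returns (method, sample_id)."""
        if method != '625DW':
            return method, sample_id
        if lab == 'GSLA':                      # any prefix: method changes, ID does not
            return '625CW', sample_id
        new_method, new_prefix = RULES_625DW.get((lab, sample_id[0]),
                                                 (method, sample_id[0]))
        return new_method, new_prefix + sample_id[1:]

    print(convert_625dw('PJB', '625DW', 'Z20062'))   # -> ('625CW', 'W20062')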
Finally, the VERAP1 program generated a printout that listed all samples
in the run and printed out the analysis data for those samples having changes,
deletions and/or additions made to them, or whose raw analysis data were
processed on different Julian Dates. The verification processor reviewed
these samples for completeness and accuracy.
Any problems identified while running the VERED and VERAP1 programs were
solved by referring to the Raw Data Lists or by using the references utilized
by the Verification Coordinator. Once the information needed to solve the
problem was acquired, a LASTVER printout was generated and corrections were
applied using the PANVALET update process. These corrections were keypunched
and run through a PANVALET update program which applied the corrections to the
LASTVER file. The problem samples that were not resolved were deleted from
LASTVER and were processed at a later date.
When all the samples appeared to be correct, the VERAP2 program was run.
The VERAP2 program closed the run by merging the analysis data that was
located in LASTVER into the TEMPVER file, where the verified sample data were
located.
VERAP2 generated a short printout, which was reviewed by the Verification
Processor to ensure that the merging action was satisfactorily completed.
186
-------
The last program run in the verification processing sequence was the
VERFILE program. This program selected from the TEMPVER file the following
samples:
• Samples with Sample Bank ID numbers for which sample data and
analysis data were both present;
• Analysis data for laboratory-generated internal QC samples (Sample
IDs beginning with QO);
• Analysis data for duplicate samples generated by the laboratories
(Sample IDs beginning with Z).
(These data were deleted from TEMPVER, leaving in TEMPVER only verified sample
data for which analysis data had not yet been verified).
In its next step, the VERFILE program converted all selected data to a
basic format of one record per sample.* It then merged these data into a
cumulative disk file of all verified samples, which was maintained on the GCA
computer. The same selected data were also written to a tape for EPA in a
basic format of two records per sample.** The formats of the GCA and EPA
verified data are shown in Table 8.12. A log of verified data tapes sent to
EPA was maintained in the notebook which also housed the log of raw data tapes
sent to EPA.
Finally, the VERFILE program generated verified data listings and an
Alert Report, which were similar to the reports generated during Raw Data
Processing. It also produced a Delinquency Report which listed samples for
which sample data had resided in TEMPVER for over 30 days and for which no
analysis results had as yet been verified.
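The Delinquency Report amounts to comparing each sample's entry date against the run date. The sketch below (Python, illustrative only) assumes a simple record structure; only the 30-day threshold comes from the text.

    from datetime import date

    def delinquency_report(tempver, run_date, max_days=30):
        """List sample IDs whose sample data entered TEMPVER more than max_days ago
        and for which no analysis results have yet been verified.

        tempver -- list of dicts with 'sample_id', 'entered' (a date), and
                   'analysis_verified' (bool); the structure is hypothetical."""
        return [rec['sample_id'] for rec in tempver
                if not rec['analysis_verified']
                and (run_date - rec['entered']).days > max_days]

    example = [{'sample_id': 'S40001', 'entered': date(1981, 1, 2),
                'analysis_verified': False}]
    print(delinquency_report(example, date(1981, 2, 15)))   # -> ['S40001']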
After the verification process was completed and a tape was made, the Raw
Data Lists and coded verification forms were returned to the Verification
Coordinator, who filed them by laboratory in Raw Data List folders. The Raw
Data Lists which had problem lists attached to them were not filed in the Raw
Data List folder until the problem samples were resolved. The verification
forms for miscellaneous problem samples were filed in a separate folder.
When the verification process terminated on March 17, there were only six
samples remaining which still could not be verified because of unresolved
problems. These were all Compuchem Mead samples for which method blanks were
never submitted. On instruction from EPA, another group of samples were not
*For samples with more than 81 cards (including sample data and analysis
data), additional verified records were formed.
**The first record contained sample data, and the second record contained
analysis data. For samples with more than 81 analysis cards, additional
verified records were formed.
187
-------
TABLE 8.12. FORMAT OF VERIFIED DATA RECORDS

Data fields                                                Column in verified record

A. GCA's Disk File
   1. Sample ID and Medium                                 1-10
   2. No. of sample cards (NSAM)                           11-12
   3. No. of analysis cards (NAL)                          13-15
   4. Date verified                                        16-18
   5. Continuation character(a) (blank if only one
      record for sample, "A" if first record of
      multi-record sample, "B" if second record of
      multi-record sample, etc.)                           19
   6. Sample data (columns 11-72, 76-77 from each
      sample card)                                         19 through PTS, where PTS = 19 + 64 x NSAM
   7. Analysis data
      a. Sample ID, lab, time, size                        (PTS + 1) through PTA, where PTA = PTS + 12
      b. Repeating data: cols. 33-72, 76-77, 11-15,
         and 21-24 from each analysis card                 (PTA + 1) through (51 x NAL + PTA)

B. EPA's Tape File
   1. Sample ID and Medium                                 1-10
   2. Number of data records (N)                           11-13
   3. Verification date                                    14-16
   4. Continuation character(a) (always "S" for
      sample data, "A" for first analysis data
      record, "B" for second analysis data record,
      etc.)                                                17
   5. Outfield: 64 characters - columns 11-72, 76-77
      from each sample card or analysis card               18 through (17 + 64 x N)

(a) A maximum of 81 raw data cards (sample or analysis) were included in a
single verified data record.
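The fixed-length layout of Table 8.12 can be illustrated by packing a record from 80-column card images. The Python sketch below follows the GCA disk-file layout in outline; the three-character date field and the example card contents are assumptions, and the analysis portion of the record is simplified.

    def card_image_64(card):
        """Columns 11-72 and 76-77 of an 80-column card image (64 characters)."""
        card = card.ljust(80)
        return card[10:72] + card[75:77]

    def gca_verified_record(sample_id_medium, date_verified, sample_cards, analysis_cards):
        """Assemble one GCA verified record following Table 8.12 (sketch only).
        The date is assumed to be a three-character field (e.g. a Julian day); the
        analysis portion (a 12-character header plus 51 characters per analysis card
        in the table) is simplified here to whole 64-character card images."""
        rec = (sample_id_medium.ljust(10)[:10]          # cols 1-10: sample ID and medium
               + '%2d' % len(sample_cards)              # cols 11-12: NSAM
               + '%3d' % len(analysis_cards)            # cols 13-15: NAL
               + str(date_verified).rjust(3)[:3]        # cols 16-18: date verified
               + ' ')                                   # col 19: continuation character
        for card in sample_cards + analysis_cards:
            rec += card_image_64(card)                  # sample data, then analysis data
        return rec

    r = gca_verified_record('S40001W', 355, ['S40001'.ljust(80)], [])
    print(len(r))   # 19 + 64 = 83 characters for one sample card and no analysis cards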
188
-------
verified; these were samples which were extracted by New York State and
analyzed by PJB Laboratories. Table 8.13 lists all samples for which analysis
data were received but which were not verified.
On the whole, verification data processing went extremely smoothly.
There were, however, two problems which required changes in computer
programs. These included the failure to provide for more than one date of
analysis (this occurred in the first three verification runs) and a problem
which occurred during the first VERFILE run, when analysis cards for
laboratory-generated duplicate samples were matched with sample cards. (These
sample cards should have been matched with analysis results for the original
samples.)
When the VERFILE program was originally designed and coded, the
programmer understood that only one date of analysis would be associated with
each sample. Thus, when the VERFILE program restructured the record for a
sample, it saved the analysis date only from the first analysis card it
encountered. In fact, a different date of analysis was possible for different
methods and for each pollutant measured in the sample. After the first
verification run, EPA identified this problem, and the program was immediately
corrected. In addition, the following steps were taken to "fix" the data
already verified.
• A list of samples which had already been verified and which had more
than one date of analysis in the Raw Master File was generated.
• For the samples so identified, a data file was constructed from the
Raw Master File which contained dates of analysis for each measured
pollutant. This file was transmitted to EPA for correction of the
erroneous verified data on their tape.
• A correction program was written by Data Management which read the
correction file (described above) and GCA's verified data file, and
which applied the corrections to produce an updated verified data
file (a sketch of this step follows the list).
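A minimal sketch of the correction step appears below (Python, for illustration only; the record structure and field names are hypothetical). It reads corrections keyed by sample ID and pollutant and overwrites the analysis date in the verified records.

    def apply_date_corrections(verified, corrections):
        """verified    -- dict: sample_id -> {'pollutants': {code: {'analysis_date': ...}}}
        corrections -- iterable of (sample_id, pollutant_code, correct_date) tuples.
        Returns the number of fields corrected (structure is hypothetical)."""
        fixed = 0
        for sample_id, pollutant, correct_date in corrections:
            rec = verified.get(sample_id)
            if rec and pollutant in rec['pollutants']:
                rec['pollutants'][pollutant]['analysis_date'] = correct_date
                fixed += 1
        return fixed

    data = {'S40001': {'pollutants': {'B01': {'analysis_date': '80-12-01'}}}}
    print(apply_date_corrections(data, [('S40001', 'B01', '80-12-15')]))   # -> 1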
The other problem which arose during verification action processing
concerned the mismatching of sample cards with duplicate "Z" analysis cards
passing through the VERFILE program. When the VERFILE program was coded, an
assumption was made that the duplicate "Z" samples would not be verified
before their mates. When this assumption turned out to be untrue, the result
was that the analysis cards for the "Z" samples were matched with the sample
cards for the original sample. In consequence, no sample cards remained to be
matched with the analysis results for the original sample.
Only about 30 samples were processed in the first verification run, but
some of these did happen to be duplicate "Z" samples which were mismatched
with sample cards which did not belong to them. The data processed in this
run were corrected as required and a change was made in the VERFILE program to
prevent recurrences of this event. The new version of the VERFILE program
189
-------
TABLE 8.13. SAMPLES NOT VERIFIED

Sample ID          Reason not verified
Q05500-Q05506      NYS extraction, PJB analysis.
Z21817             Compuchem: No method blank provided.
S40619             Compuchem: No method blank provided.
S45088             Compuchem: No method blank provided.
Z45214             Compuchem: No method blank provided.
S45509-S45510      NYS extraction, PJB analysis.
S45514-S45515      NYS extraction, PJB analysis.
S45523-S45524      NYS extraction, PJB analysis.
S45528-S45529      NYS extraction, PJB analysis.
Z45529             NYS extraction, PJB analysis.
S45538-S45539      NYS extraction, PJB analysis.
S45543             NYS extraction, PJB analysis.
Z45543             NYS extraction, PJB analysis.
S45544             NYS extraction, PJB analysis.
S45547-S45548      NYS extraction, PJB analysis.
S50348-S50349      Compuchem: No method blank provided.
190
-------
checked to ensure that sample and analysis cards matched perfectly. Also, the
amended program assumed that the duplicate "Z" analysis cards did not require
sample cards and therefore did not match them with any.
Finally, the verification team had to be alert to the timing of
verification of certain samples analyzed by PJB Laboratories. As previously
mentioned, PJB Laboratories had used the ambiguous method code 625DW. For
some parts of the analysis report, the first character of the sample ID was
changed from "W" to "Z" when this method was used, while the "W" was retained
for other parts of the report for this same sample. The VERAP1 program was
rewritten to correct this situation by changing the method code and the Sample
ID as appropriate (see Table 8.11). This meant that all PJB water samples
having duplicate "Z" samples had to be verified in the same VERFILE run as the
original "W" samples in case the change and merging had to be performed.
The following duplicate "Z" samples were merged with the original sample
(as described above):
• Z20062 • Z20181
• Z20069 • Z20194
• Z20070 • Z20387
• Z20074 • Z20408
• Z20075 • Z20410
• Z20076 • Z20411
• Z20111 • Z20680
• Z20174 • Z20853
• Z20177 • Z20954
Corrections to Verified Data
Unfortunately, not all samples which were processed through the
verification programs were totally correct, and an additional set of
procedures had to be developed to make corrections to the verified data. Some
isolated errors were discovered by EPA during their review of the data (for
example, the mislabeling of the type of internal QC data), and some errors
were discovered which applied to several samples (e.g., the miscoding of the
sampling data for air sample collection extending over a 2-day period). GCA
identified several QC samples which had been verified without their true
values (the data supplied to GCA on AO cards), and some laboratories submitted
revisions to some of their QC results. However, the majority of changes in
verified data were made to the TOX data submitted by Advanced Environmental
Systems (AES), and to the Hi-Vol analyses submitted by EPA-EMSL at RTP. The
changes to the TOX data were necessary because AES had made a systematic error
191
-------
in calculating these results; thus over 240 AES samples required changes.
Corrections to the Hi-Vol data involved additions, rather than changes. In
this case, results for arsenic and chromium, which were measured by the
lengthy neutron activation method, were submitted to GCA after the Hi-Vol
samples had been verified in January. Consequently, about 220 Hi-Vol samples
had to pass through the verification correction procedures.
Two types of procedures were used in making corrections to the verified
data base. In a few instances "global" changes were made; for example, for
all sediment samples whose source was originally coded as "RIVER," the source
was changed to "STRM" (in response to a request by EPA). Such changes were
made by a simple computer program which tested each sample in the entire data
base and which made the desired corrections when the specified conditions were
met.
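Such a global change reduces to a conditional test applied to every sample. The fragment below is a sketch only (Python, with hypothetical field names), not the program actually used.

    def global_change(samples, medium, field, old_value, new_value):
        """Change 'field' from old_value to new_value on every sample of the given medium."""
        changed = 0
        for s in samples:
            if s.get('medium') == medium and s.get(field) == old_value:
                s[field] = new_value
                changed += 1
        return changed

    samples = [{'sample_id': 'S61234', 'medium': 'SEDIMENT', 'source': 'RIVER'}]
    global_change(samples, 'SEDIMENT', 'source', 'RIVER', 'STRM')
    print(samples[0]['source'])    # -> STRM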
Most corrections, however, were made to specific samples, which were
identified by Sample ID Number. This verification correction process is shown
as step 32 on Figure 8.1, and it is illustrated in more detail in Figure
8.10. Once a group of verified samples had been identified as needing
corrections, their Sample ID numbers were punched on cards. These cards,
along with the verified data file, served as input to a program (item 1 of
Figure 8.10) which searched the verified data file for the required samples,
reformatted them into card-image records, and wrote them onto a temporary disk
file of "FIXCARDS." Using a listing of the FIXCARDS file, Data Management
prepared PANVALET updates (item 2 in Figure 8.10) in a manner similar to the
process used in updatng the TEMPVER file. The records in the updated FIXCARDS
file were then reformatted to the verified record format and merged back into
the verified data file (item 3 in Figure 8.10.).
Data listings and alert reports for the updated samples, in the same
format as those produced by the VERFILE program, were also generated during
this final step in the correction process. Table 8.14 shows the dates on
which the verification correction process took place, and the number of
samples processed on each date.
Because the data which were undergoing the verification correction
process had already been submitted to EPA as verified data, it was important
to keep EPA informed of the changes being made. The complete sample listings
and alert reports produced during the process were sent to the same EPA
personnel who had received the original verified data listings. Also, written
lists of corrections (with details on precise changes by field and column
number) were sent to the EPA Data Processing group, so that they could update
their computerized data base. (EPA chose this update method in preference to
receiving tapes of updated samples from GCA.) Corrections made to verified
data were, of course, included in the tapes of validated data which were sent
to EPA. Each time GCA submitted a tape of validated data to EPA, this tape
contained all verification corrections performed through that date.
192
-------
Figure 8.10. Verification correction process (detail of step 32 of Figure 8.1).
-------
TABLE 8.14. SCHEDULE OF VERIFICATION CORRECTIONS

Date         No. of samples    Cumulative No. of samples
2/11/81           211                  211
3/03/81           344                  555
3/10/81           291                  846
3/16/81            22                  868
3/16/81            15                  883
6/02/81           182                1,065
11/03/81           17                1,082
DATA VALIDATION
Invalidation Instructions
EPA was responsible for identifying invalid data in the Love Canal
verified data base, and for providing invalidation instructions to GCA. These
instructions took the form of lists of samples to be invalidated, or lists of
criteria for invalidating data. Invalidations by the following sets of
criteria were included in the instructions:
• Sample ID
• Lab
• Well number
• Lab name, date, method
• Medium, method, compound
• Sample ID, compound
• Lab name, date, source, compound
• Medium, compound
• Medium, compound, concentration.
194
-------
When any of the first three sets of criteria above were given, the entire
sample was to be invalidated. For the other sets of criteria, only the
specified compounds, or the compounds associated with specific methods, were
to be invalidated.
Validation Data Processing
Validation was conducted by GCA at four points in the study: on January
13, 1981; on March 17, 1981; on June 2, 1981; and on November 3, 1981. The
first three dates were specified by EPA to meet their schedule requirements,
while the fourth validation was performed when all validation instructions had
been submitted. In January, GCA assumed responsibility only for invalidation
of total samples by Sample ID number, while EPA was responsible for applying
the remaining invalidation instructions. In later validations, GCA was
responsible for applying all invalidation instructions.
The steps involved in GCA's January validation effort were very simple,
as shown in Figure 8.11. The Sample ID numbers of samples to be invalidated
were punched on cards; these cards, along with the updated verified data file,
served as input to the validation program. This program simply searched the
input file for the invalid samples, and passed all samples except these to the
output files. The validated tape generated for EPA by this program was in the
same format as the EPA verified data tape, while the GCA validated disk file
was in the same format as GCA's verified data file.
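In outline, the January validation program was a filter on Sample ID. The sketch below (Python, illustrative only) passes every record except those whose IDs appear on the invalidation card deck.

    def january_validation(verified_records, invalid_ids):
        """Drop whole samples listed in invalid_ids; all other records pass through.
        verified_records is assumed to be an iterable of (sample_id, record) pairs."""
        invalid = set(invalid_ids)
        return [(sid, rec) for sid, rec in verified_records if sid not in invalid]

    records = [('S40001', '...'), ('S45088', '...')]
    print(january_validation(records, ['S45088']))   # only S40001 remains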
The validations conducted by GCA in March, June, and November were more
complicated, with the first step being to interpret EPA's invalidation
instructions. When the entire sample was to be deleted (by Sample ID, by lab,
or by well number), the instructions were clear and specific. However, for
the remaining criteria, the Project Manager and the Quality Control Officer
made some interpretations of the EPA instructions. In accordance with these
interpretations, GCA proceeded as follows:
• For invalidations by lab, date, source, and compound; and by lab,
date, and method: invalidations were applied to field samples, to
field blanks, and to the internal QC information supplied under the
original sample ID, but not to PE samples, QC samples, or internal
QO samples (with QO ID numbers).
• For invalidation by sample ID and compound: invalidations were
applied to the internal QC information supplied under the original
Sample ID.
• For invalidation by medium and compound: invalidations were applied
to all pollutant codes associated with that compound, and to all
samples of the specified medium, including PEs, QCs, and "QOs".
For invalidations by lab name, date and method, and by lab name, date,
source and compound, GCA identified all sample IDs to be involved in the
invalidation by using listings of verified samples sorted by laboratory and
195
-------
Figure 8.11. January validation data processing.
196
-------
date. Decks of cards listing Sample ID numbers of partially invalid samples
and of wholly invalid samples were prepared, and invalidation activities
proceeded through three stages, as diagrammed in Figure 8.12.
The complete file of verified data served as input to the first stage,
which handled invalidations by compound and medium. Conditions for
invalidation were specified in the program "Valid 1" (Item 1 of Figure 8.12).
This program simply searched through the verified data file and withheld all
specified combinations of compound and medium from the data which it passed on
to a temporary validation file (Validl). It should be noted that the verified
data file was not altered during validation.
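The first-stage screen can be thought of as a set of medium/compound combinations to withhold. The following Python sketch is illustrative only; the actual Validl program operated on fixed-format records rather than on the structures shown here, and the example pairs are hypothetical.

    # Hypothetical (medium, pollutant code) pairs to be invalidated.
    INVALID_PAIRS = {('AIR', 'B07'), ('SW', 'B12')}

    def valid1(sample):
        """Return a copy of the sample with invalidated compounds withheld.
        sample['results'] maps pollutant code -> result; the structure is hypothetical."""
        kept = {code: res for code, res in sample['results'].items()
                if (sample['medium'], code) not in INVALID_PAIRS}
        return {**sample, 'results': kept}

    s = {'sample_id': 'S40001', 'medium': 'AIR', 'results': {'B07': 1.2, 'B09': 0.4}}
    print(valid1(s)['results'])    # -> {'B09': 0.4}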
The temporary validation file produced by the first validation stage
served as input to the second stage, in which partial invalidations by Sample
ID were handled. The procedures used in this second stage of validation were
exactly like those employed earlier in correcting the verified data base.
That is, selected samples were reformatted into card-image records (Item 2 of
Figure 8.12), and PANVALET updates were used to delete the card-image records
of invalid compounds (Item 3 of Figure 8.12). Updated samples were
reformatted to the verified format (Item 4 of Figure 8.12) and were merged
with the remaining verified records into a second temporary validation file
(VALID.FINAL1).
In the final validation stage, the samples which were being wholly
invalidated were handled. The procedure used here was the same as the
invalidation procedure used in January for this same purpose, except that in
this case the input data file was the temporary validated data file generated
by the second validation stage (VALID.FINAL1).
Corrections to Validated Data
As happened with the verified data, so it happened with the validated
data that some errors were detected after the validated data tape had been
generated for EPA in March. Exactly the same procedures were followed in
correcting the validated data as were followed in correcting the verified
data. That is, PANVALET updates were applied to the samples involved, which
had been written in a separate file in card-image format. The updated cards
were then re-formatted and merged with the remaining validated data. A new
tape of validated data was produced for EPA at this time. The validation
correction procedures took place in June and in November, at the time of the
validation runs. (Identical corrections were made to these same samples in
the verified data base.)
QA/QC PROGRAMS
Earlier, in the section which described reports based on the raw data
file, we discussed two programs (LCQC1 and LCQC2) which generated some QC
information. LCQC1 generated reports on recovery rates of internal and
external QC samples, while LCQC2 compared analysis results for the two
aliquots of laboratory-generated duplicate samples. In the present section we
describe a set of programs which generated QC reports based on the validated
(or verified) data file.
197
-------
Figure 8.12. Validation data processing.
-------
199
-------
The QA/QC programs which generated reports based on the validated data
file are grouped together as step 38 of Figure 8.12. In fact, several
separate QA/QC programs were designed, as summarized in Table 8.15.
TABLE 8.15. QA/QC PROGRAMS RUN ON VALIDATED DATA

Program    Purpose               Analysis performed
QSPECAO    Accuracy              Analysis of differences between observed and
                                 expected results of external QC samples.
QSPECZ     Intralab precision    Comparison of results of laboratory-generated
                                 duplicate samples.
SPREPD     Intralab precision    Comparison of results of duplicate samples
                                 supplied by the Sample Bank.
SPREP      Interlab precision    Analysis of results of triplicate samples
                                 supplied by the Sample Bank.
The QSPECAO program, designed to measure analysis accuracy, analyzed the
differences between observed and expected concentrations for each pollutant
reported for each external QC sample (as reported on "AO" cards). Input to
the program was a file of validated (or verified) data which had been
re-formatted to card-image records. This is illustrated in Figure 8.13.
QSPECAO searched through the card-image file for all data submitted on AO
cards, and reported these in a list of individual pairs. For each pollutant
in each sample, the following information was listed:
• Sample ID
• Laboratory
• Pollutant name and code
• Observed concentration
• Expected concentration
• Difference between observed and expected concentrations.
Pollutants with zero values for either the observed concentration or the
expected concentration were listed separately; they were not included in the
list of pairs or in the subsequent analysis of the AO data.
200
-------
Figure 8.13. Data flow for the QSPECAO program.
201
-------
The second part of the QSPECAO report was a statistical analysis of
external QC results for each medium, by laboratory and pollutant. Information
listed for each such combination included the following (an illustrative
computation is sketched after the list):
• Laboratory
• Pollutant name and code
• Number of observations
• Mean difference between observed and expected concentrations
• Standard deviation
• Upper and lower 95 percent probability limits
• Upper and lower 95 percent confidence limits
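These quantities can be computed from the per-sample differences between observed and expected concentrations. The sketch below (Python) is illustrative only: the report does not state the exact formulas used, so the 95 percent limits shown here use the usual normal-approximation expressions (probability limits for an individual difference, confidence limits for the mean) and should be read as an assumption.

    from statistics import mean, stdev
    from math import sqrt

    def qc_accuracy_summary(differences):
        """Summarize observed-minus-expected differences for one lab/pollutant/medium.
        Uses 1.96-sigma normal approximations for the 95 percent limits (assumed)."""
        n = len(differences)
        m = mean(differences)
        s = stdev(differences)                  # requires n >= 2
        prob = (m - 1.96 * s, m + 1.96 * s)     # limits for an individual difference
        conf = (m - 1.96 * s / sqrt(n), m + 1.96 * s / sqrt(n))   # limits for the mean
        return {'n': n, 'mean_diff': m, 'std_dev': s,
                'probability_limits': prob, 'confidence_limits': conf}

    print(qc_accuracy_summary([0.4, -0.1, 0.2, 0.0, 0.3]))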
The operation of the QSPECZ program, which compared the analysis results
of laboratory-generated duplicate samples, was similar to the operation of the
LCQC2 program, which performed these analyses on the raw data file. The data
flow for the QSPECZ program is shown in Figure 8.14. Like the QSPECAO
program, the QSPECZ program operated on a file of validated data which had
been reformatted into card-image records. The program searched through the
file to identify samples with ID numbers beginning with "Z," and printed a
listing of the observed concentrations of each pollutant found in the Z sample
and its mate. Information listed included:
• Original Sample ID
• Laboratory
• Pollutant code
• Observed concentration for original sample
• Observed concentration for duplicate aliquot (Z sample)
• Difference between the two observed concentrations.
Entries were made in the listing only if a quantified (non-zero) concentration
was reported for at least one member of the pair.
The program then performed a statistical analysis on these results, and
generated a report which was summarized for each medium by laboratory and
pollutant. Information listed for each such combination included:
• Laboratory
• Pollutant name and code
202
-------
Figure 8.14. Data flow for the QSPECZ program.
-------
• Number of observations
• Mean difference of concentrations
• 95 percent probability limits
• Standard deviation divided by 2.
The SPREPD program was also designed to measure intralab precision, but
in this case the measurements were based on duplicate samples supplied by the
Sample Bank, rather than on laboratory-generated duplicates. The Sample ID
numbers of each pair of duplicates were provided to the SPREPD program on the
QC Cross-Reference Cards (Q-Cards). These cards, which had been prepared by
the Sample Bank, included all pairs of duplicates in the duplicate program,
and the pair of each samples sent to the same laboratory during the triplicate
program. The data'flow for the SPREPD program is shown in Figure 8.15.
The SPREPD program worked directly on the validated (or verified) data
file, searching through the file to identify each of the samples specified in
the "Q-Cards" file. One part of the report generated by this program listed
the observed concentration of each pollutant found in each pair of
duplicates. Each entry in this list contained the following information (an
illustrative pairing sketch follows the list):
• Laboratory
• Sample ID of each sample in pair
• Pollutant code
• Observed concentration for each member of the pair
• Difference between the two observed concentrations.
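Pairing the Sample Bank duplicates is essentially a lookup of both IDs from each Q-Card in the validated data, followed by a per-pollutant difference. The sketch below (Python) is illustrative only; the record structures are hypothetical.

    def duplicate_pairs(validated, q_cards):
        """validated -- dict: sample_id -> {'lab': ..., 'results': {pollutant: conc}}
        q_cards   -- iterable of (id_a, id_b) duplicate pairs from the Sample Bank.
        Returns one row per pollutant reported for either member of a pair."""
        rows = []
        for id_a, id_b in q_cards:
            a, b = validated.get(id_a), validated.get(id_b)
            if not a or not b:
                continue                        # one member missing from the file
            for code in sorted(set(a['results']) | set(b['results'])):
                ca = a['results'].get(code, 0.0)
                cb = b['results'].get(code, 0.0)
                rows.append({'lab': a['lab'], 'ids': (id_a, id_b), 'pollutant': code,
                             'conc_a': ca, 'conc_b': cb, 'difference': ca - cb})
        return rows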
A statistical analysis of these results was performed, and a report was
generated which summarized findings for each medium by laboratory and
pollutant. This report was identical in format to that of the statistical
analysis report of the QSPECZ program.
The final QA/QC program which was run on the verified or validated data
file was the SPREP program, which was designed to measure interlab precision.
Samples involved in this measurement were generated by the Sample Bank's
triplicate sample program. Each set of triplicates included one sample which
was sent to an EPA laboratory for analysis, and two samples which were sent to
the same sub-contractor laboratory. Triplicates were identified to the SPREP
program through the QC Cross-Reference cards file. The data flow for the
SPREP program, which worked directly on the validated (or verified) data file,
is shown in Figure 8.16.
204
-------
-------
Figure 8.16. Data flow for the SPREP program.
-------
Three types of reports were generated by the SPREP program. One was a
listing of individual pairs, with a separate entry in the list for each of the
two sets of pairs comprising the set of triplicates. (The EPA sample and one
subcontract lab sample formed one pair; the EPA sample and the other
subcontract lab sample formed the second pair). Each entry in the listing
contained the following information:
• Subcontract laboratory
• Subcontract lab Sample ID
• EPA Sample ID
• Pollutant code
• Concentration of subcontract sample
• Concentration of EPA sample
• Difference between subcontract and EPA concentrations
• Identification of set ("Set 1" or "Set 2").
The second SPREP report was a summary of a statistical analysis by medium,
lab, and pollutant, in which the findings were reported separately for each
set of pairs. The following information was given in each entry:
• Subcontract laboratory
• Pollutant name and code
• For Set 1:
- number of observations
- mean difference between concentrations
- standard deviation
• For Set 2:
- number of observations
- mean difference between concentrations
- standard deviation.
The final part of the SPREP report presented the results of a statistical
analysis by medium, lab, and pollutant, in which the findings for the two sets
207
-------
of pairs in each set of triplicates were combined, giving a pooled estimate of
interlab precision. The following information was given for each entry in the
listing (a partial computational sketch follows the list):
• Subcontractor name and code
• Pollutant name and code
• Number of observations in Set 1
• Number of observations in Set 2
• Mean difference between the mean differences of the separate sets
• Standard deviation
• Control limits.
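The counts and mean differences can be computed directly from the two sets of pairs; the report does not spell out how the standard deviation and control limits were derived, so those are omitted from the illustrative Python sketch below.

    from statistics import mean

    def pooled_interlab_summary(set1_diffs, set2_diffs):
        """set1_diffs, set2_diffs -- subcontract-minus-EPA differences for the two
        pairs in each set of triplicates (one value per triplicate set).
        Returns counts, the per-set mean differences, and the difference between
        them; the standard deviation and control limits used in the report are
        not reproduced here."""
        m1, m2 = mean(set1_diffs), mean(set2_diffs)
        return {'n_set1': len(set1_diffs), 'n_set2': len(set2_diffs),
                'mean_diff_set1': m1, 'mean_diff_set2': m2,
                'difference_of_means': m1 - m2}

    print(pooled_interlab_summary([0.2, 0.1, -0.3], [0.0, 0.4, -0.1]))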
Appendix I provides a listing of the GCA Love Canal Software utilized in
the Data Management Segment of the study.
208
-------