United States Environmental Protection Agency
Office of Research and Development
Washington, D.C. 20460

EPA/600/R-22/065
July 2022

www.epa.gov/emergency-response-research

Data Management for Wide-area
Responses: Technology Evaluation and
Operational Expert Feedback

by

Timothy Boe1, Erin Silvestri1, Jamie Falik1, Matt Blaser2, Jim Mitchell2, Brian Cooper2, Leroy Mickelsen3,
Lieutenant Commander Clifton Graham4, Katrina McConkey5, Molly Rodgers5

1U.S. EPA Office of Research and Development (ORD)

Center for Environmental Solutions and Emergency Response (CESER)

Homeland Security and Materials Management Division (HSMMD)

Durham, NC 27709

2U.S. EPA Region 5

3U.S. EPA Office of Land and Emergency Management (OLEM)

Office of Emergency Management (OEM)

Consequence Management Advisory Division (CMAD)

4U.S. Coast Guard (USCG)

5Eastern Research Group, Inc. (ERG)

Morrisville, NC 27560

Contract EP-C-16-015 to Eastern Research Group, Inc.


-------
DISCLAIMER

The U.S. Environmental Protection Agency (EPA) through its Office of Research and
Development (ORD) directed and managed this work. This study was funded through the
Analysis for Coastal Operational Resiliency (AnCOR) Project by the U.S. Department of
Homeland Security Science and Technology Directorate under interagency agreement IA
70RSAT18KPM000084. This report was prepared by Eastern Research Group, Inc. under EPA
Contract Number EP-C-16-015. This report has been reviewed and approved for public release
in accordance with the policies of the EPA.

Mention of trade names or commercial products does not constitute endorsement or
recommendation for use of a specific product. The contents are the sole responsibility of the
authors and do not necessarily represent the official views of EPA, DHS S&T, or the United
States Government.

Questions concerning this document or its application should be addressed to:

Timothy Boe

U.S. Environmental Protection Agency
Office of Research and Development

Center for Environmental Solutions and Emergency Response
109 T.W. Alexander Dr. (MD-E-343-06)

Research Triangle Park, NC 27711
Phone 919.541.2617


-------
FOREWORD

The U.S. Environmental Protection Agency (EPA) is charged by Congress with protecting the
Nation's land, air, and water resources. Under a mandate of national environmental laws, the
Agency strives to formulate and implement actions leading to a compatible balance between
human activities and the ability of natural systems to support and nurture life. To meet this
mandate, EPA's research program is providing data and technical support for solving
environmental problems today and building a science knowledge base necessary to manage our
ecological resources wisely, understand how pollutants affect our health, and prevent or reduce
environmental risks in the future.

The Center for Environmental Solutions and Emergency Response (CESER) within the Office of
Research and Development (ORD) conducts applied, stakeholder-driven research and provides
responsive technical support to help solve the Nation's environmental challenges. The Center's
research focuses on innovative approaches to address environmental challenges associated with
the built environment. We develop technologies and decision-support tools to help safeguard
public water systems and groundwater, guide sustainable materials management, remediate sites
from traditional contamination sources and emerging environmental stressors, and address
potential threats from terrorism and natural disasters. CESER collaborates with both public and
private sector partners to foster technologies that improve the effectiveness and reduce the cost
of compliance, while anticipating emerging problems. We provide technical support to EPA
regions and programs, states, tribal nations, and federal partners, and serve as the interagency
liaison for EPA in homeland security research and technology. The Center is a leader in
providing scientific solutions to protect human health and the environment.

Through this effort, candidate tools were exercised and evaluated to assess the current state of
technologies to enhance the United States Coast Guard (USCG) and EPA's ability to respond to
and recover from a chemical, biological, radiological and nuclear (CBRN) incident. The
technologies and software recommended were exercised through a complete data management
workflow during the Analysis for Coastal Operational Resiliency (AnCOR) field study held in
May 2022. The operational considerations illuminated through this study provided invaluable
information to ensure increased preparedness and, ultimately, more efficient and successful
field data acquisition and management activities.

Gregory Sayles, Director

Center for Environmental Solutions and Emergency Response



-------
ACKNOWLEDGMENTS

Contributions of the following individuals and organizations to this report are acknowledged:

U.S. EPA Technical Reviewers of Report

John Archer (EPA/ORD/CESER/HSMMD)

Elise Jakabhazy (EPA/OLEM/OEM/CBRN CMAD)

U.S. EPA Quality Assurance

Eletha Brady Roberts
Ramona Sherman



-------
TABLE OF CONTENTS

Disclaimer	ii

Foreword	iii

Acknowledgments	iv

List of Tables	vi

List of Figures	vi

Acronyms and Abbreviations	viii

Executive Summary	ix

1	Introduction	1

2	Quality Assurance/Quality Control	3

3	Data and Technology Assessment Demonstration Overview	3

4	Tools for Supporting Sampling Design and Implementation	7

4.1	MicroSAP	7

4.2	Trade-Off Tool for Sampling (TOTS)	7

4.3	Simple QUIck REad Label (SQUIREL)	8

5	Field Data Acquisition Software Evaluation	9

5.1	Esri's ArcGIS Field Apps Suite	10

5.1.1	ArcGIS Pro Linear Referencing (Create Routes)	10

5.1.2	ArcGIS Field Maps	11

5.1.3	ArcGIS Dashboards	12

5.1.4	ArcGIS Tracker	13

5.2	CBRNResponder	14

5.2.1	Event Management and Configuration	15

5.2.2	Field Data Collection	17

5.2.3	Event Dashboard and Map	19

5.2.4	Reports	20

5.3	Android Team Awareness Kit (ATAK)	21

5.4	EPA Scribe	22

6	Technology Evaluation	22

6.1	Global Positioning System (GPS) Devices	22

6.2	Mobile Devices	27

7	Data Management and Real-time Quality Control Measures	28

7.1	ArcGIS Field Maps Data Storage	28

7.2	CBRNResponder Data Storage	29

7.3	Real-Time Quality Control Measures	29

7.3.1 ArcGIS Field Maps QC	29



-------
7.3.2 CBRNResponder QC	30

8	DATA DAY Demonstration Participant Observations and Findings	30

8.1	General Observations and Participant Feedback	30

8.1.1	Demo Implementation	30

8.1.2	Operational Feasibility	31

8.1.3	Technology Evaluation	31

8.1.4	Mobile Devices	32

8.1.5	Software	33

8.2	Data Manager Observations	37

8.2.1	Operational Feasibility Considerations	37

8.2.2	ArcGIS Field Apps Suite	38

8.2.3	CBRNResponder	38

8.2.4	Other Noted Observations	39

9	Conclusions and Recommendations	39

10	References	44

LIST OF TABLES

Table 1. GPS Test Locations	23

Table 2. Average Horizontal Error (ft) by Location and Unit	27

Table 3. Additional Observations and Feedback	42

LIST OF FIGURES

Figure 1. Map of sampling, control, and test points throughout the RTP campus	4

Figure 2. Progression of stations	4

Figure 3. Priority evaluation criteria	5

Figure 4. Example TOTS screen illustrating a random sample design	8

Figure 5. Example TOTS screen illustrating a 20x20 foot grid-based sample design	8

Figure 6. Simple QUIck REad Label (SQUIREL) interface	9

Figure 7. Assigned team sampling routes	10

Figure 8. AnCOR DATA Demo Field Maps field data capture form	12

Figure 9. AnCOR DATA Demo ArcGIS dashboard	13

Figure 10. Example of misplaced samples	13

Figure 11. Example tracking route	14

Figure 12. CBRNResponder data fields	15

Figure 13. CBRNResponder sampling location import template	16

Figure 14. CBRNResponder assignment import template	17

Figure 15. CBRNResponder App screens for field data capture - Part 1	18

Figure 16. CBRNResponder App screens for field data capture - Part 2	19

Figure 17. CBRNResponder App event map for monitoring activities	20



-------
Figure 18. CBRNResponder App supports generating sample labels	20

Figure 19. CBRNResponder chain of custody form example	21

Figure 20. Screenshot of ATAK viewer showing samples and route with aerial image overlay. 22

Figure 21. Leica Nova MS50 multi-station and associated prism marker	23

Figure 22. National Geodetic Survey benchmark	24

Figure 23. Horizontal error for test location A	25

Figure 24. Horizontal error for test location B	25

Figure 25. Horizontal error for test location C	26

Figure 26. Horizontal error for test location D	26

Figure 27. Horizontal error for test location E	27

Figure 28. Table showing a sample of ArcGIS Field Maps data collection results	28

Figure 29. Table showing a sample of CBRNResponder data collection results	29

Figure 30. AnCOR Data management tasks and supporting tools	41



-------
ACRONYMS AND ABBREVIATIONS

AnCOR	Analysis for Coastal Operational Resiliency
API	application programming interface
ATAK	Android Team Awareness Kit
CBRN	chemical, biological, radiological, or nuclear
CESER	Center for Environmental Solutions and Emergency Response
CMAD	Consequence Management Advisory Division
COTS	commercial off-the-shelf
COVID	coronavirus disease
CSV	comma-separated value
DHS	Department of Homeland Security
DoD	U.S. Department of Defense
DOE	U.S. Department of Energy
EPA	U.S. Environmental Protection Agency
ERG	Eastern Research Group, Inc.
ERT	Emergency Response Team
FEMA	Federal Emergency Management Agency
FFR	full-face respirator
GIS	geographic information system
GOTS	government off-the-shelf
GPS	Global Positioning System
HSMMD	Homeland Security and Materials Management Division
ICS	incident command structure
OEM	Office of Emergency Management
OLEM	Office of Land and Emergency Management
ORD	Office of Research and Development
PPE	personal protective equipment
QR	quick response
RCRA	Resource Conservation and Recovery Act
S&T	Science and Technology Directorate
SQUIREL	Simple QUIck REad Label
TOTS	Trade-off Tool for Sampling
USCG	U.S. Coast Guard



-------
EXECUTIVE SUMMARY

In the event of a chemical, biological, radiological, and/or nuclear (CBRN) wide-area incident,
the U.S. Environmental Protection Agency (EPA) has the authority to take actions to respond to
releases of hazardous substances, pollutants, and contaminants, including leading the response.
This response includes cleanup and waste management, which require data collection and data
quality checks to inform decision-making. The U.S. Coast Guard (USCG) shares this
responsibility for certain incidents in the maritime domain. This research aims to streamline and
improve the capabilities of USCG and EPA responders to a wide-area incident. Specifically, the
research aimed to identify and recommend user-friendly tools that more easily facilitate the
acquisition and subsequent management of field sampling data following a wide-area incident.
Tools identified through this research were then further evaluated during a technology
demonstration day hosted by EPA's Homeland Security Research Program, in association with
the Department of Homeland Security (DHS)/EPA-sponsored Analysis for Coastal Operational
Resiliency (AnCOR) Data Project.

Phase 1 of this project evaluated currently available commercial off-the-shelf (COTS) or
government off-the-shelf (GOTS) tools that appeared to have features that would meet the largest
number of needs. A subset of tools evaluated were recommended for further evaluation,
including Esri's Survey 123/Collector/Field Maps Suite, Android Team Awareness Kit (ATAK),
EPA's Scribe, and RadResponder. While RadResponder was specifically identified during Phase
1, the Federal Emergency Management Agency (FEMA) sponsors a suite of "responder apps"
collectively referred to as CBRNResponder that includes access to both RadResponder and
ChemResponder (and soon, BioResponder).

Phase 2 of this project focused on exercising and evaluating the candidate tools recommended in
Phase 1. Specifically, this study evaluated the current state of technologies through a
demonstration event and documented observations and recommendations to enhance the USCG
and EPA's ability to respond to and recover from a CBRN incident. Through this project, EPA
gained invaluable experience in understanding how to apply advances in technologies and
software to improve field data acquisition tasks. Important technological issues were identified to
inform future planning and training efforts. Based on the expressed needs of EPA and
DHS/USCG and the experiences of participants in the AnCOR DATA Demo, the project team
recommends using Esri's suite of tools and ArcGIS Field Maps to support field data acquisition
efforts for the AnCOR program. Consistent with the findings from a related effort to assess data
visualization and analysis tools, the Esri suite has the most features that meet the largest number
of needs, is familiar to and accepted by target stakeholders, and is generally viewed as easy to
customize and tailor to meet the specific needs of the operation.

CBRNResponder, and a forthcoming BioResponder, offer many promising features. At present,
however, several key requirements for EPA's AnCOR program cannot be met, namely alignment
with required data fields/types that will be collected and integration with real-time geospatial
assets. The project team recommends that EPA continue engagement with FEMA to convey
EPA's needs regarding biological sampling (and other agents), and



-------
closely monitor FEMA's progress and tool enhancements to determine whether the tool could
better meet EPA's needs in the future.

The AnCOR project will conclude with a Wide Area Demo (WAD) that consists of a field-level
biological remediation study. The primary purpose of the AnCOR WAD is to operationally test
and evaluate options for decontamination, sampling, data management, and waste management
for areas impacted by a wide-area biological agent release in a USCG or urban environment. A
secondary goal of this project was to document a repeatable, transparent, and stable workflow to
support the WAD data management needs. To address this need, the project team developed a
Data Management Task/Workflow that identifies when and how various data management tools
can be used across the response. Specific tasks that have a related data management component,
the various tools that are available to support activities, and the established workflow among the
tasks and tools were documented. Important additional considerations resulting from experiences
gained through completing Phase 2 of the project were also identified and generally centered on
the following topics:

•	Quality Control Procedures/Objectives,

•	Field Data Capture Form,

•	Training,

•	Operational Logistics, and

•	Managing Devices.

Through this effort, candidate tools were exercised and evaluated to assess the current state of
technologies to enhance the USCG and EPA's ability to respond to and recover from a CBRN
incident. The technologies and software recommended were exercised through a complete data
management workflow during the AnCOR WAD field study held in May 2022. The operational
considerations illuminated through this study provided invaluable information to ensure
increased preparedness and, ultimately, more efficient and successful field data acquisition and
management activities.



-------
1 INTRODUCTION

The U.S. Environmental Protection Agency (EPA) is designated as a coordinating agency, under
the National Response Framework,1 to respond to discharges or releases of oil and hazardous
substances. As such, EPA's role is to prepare for, respond to, and recover from threats to public
health, welfare, or the environment posed by oil and hazardous materials incidents. Hazardous
materials incidents can include accidental or intentional releases of chemical, biological, and
radiological or nuclear (CBRN) substances. EPA can also have responsibilities to address debris
and waste through decontamination, removal, and disposal operations.

Following a wide-area CBRN incident, from initial characterization sampling to evaluate the
contamination event through clearance sampling and waste disposal processes, a substantial
amount of data will need to be collected, checked for quality, and maintained to support
decision-making. Depending on the size and scope of the hazardous contamination, data
management could result in a significant technological undertaking that could continue for many
years. Types of data collected during the response could include:

•	Sample location,

•	Sample matrix,

•	Sampling method,

•	Time and date of sample collection,

•	Image of sample location or sampling surface,

•	Sample collection personnel or team,

•	Laboratory processing the analysis,

•	Analysis results,

•	Mapping data (e.g., Global Positioning Systems [GPS], light detection and ranging
[LiDAR], photogrammetry),

•	Documentation of quality assurance activities, and

•	Decontamination method.
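The data types above can be sketched as a single structured sample record. The field names and types below are illustrative only and do not represent an EPA schema:

```python
# Hypothetical sketch of one field-sample record covering the data types
# listed above. Field names and types are illustrative, not an EPA schema.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SampleRecord:
    sample_id: str                    # unique label, e.g., from a QR code
    latitude: float                   # sample location (decimal degrees)
    longitude: float
    matrix: str                       # sample matrix, e.g., "soil"
    method: str                       # sampling method, e.g., "sponge"
    collected_utc: str                # time and date of collection (ISO 8601)
    team: str                         # sample collection personnel or team
    photo_uri: Optional[str] = None   # image of sample location or surface
    lab: Optional[str] = None         # laboratory processing the analysis
    result: Optional[str] = None      # analysis result, filled in later
    qa_notes: list = field(default_factory=list)  # QA documentation

rec = SampleRecord("S-0001", 35.88, -78.87, "concrete", "sponge",
                   "2021-09-15T14:02:00Z", "Team 2")
```

A record like this starts mostly empty in the field and is completed as laboratory results and quality assurance documentation arrive.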

Tools and technologies that might be used during characterization and clearance sampling
include computers or tablets, software applications, mobile devices, databases, data models,
geographic information system (GIS) applications, laboratory reporting tools, and auxiliary tools
such as GPS. Understanding the capabilities of these tools and technologies, identifying how
they connect and work together, and evaluating the usability of various technologies is critical to
advancing EPA's and the Department of Homeland Security's (DHS's) data management
capabilities. Data management frameworks are plans that are developed to help address
practically every part of the data management process including the individual tools,
technologies, and processes that are used to collect, store, retrieve, and visualize data. Integrating
a suite of technologies to support a comprehensive data management framework is necessary to
effectively organize, document, quality assure, and communicate data during a wide-area CBRN
incident.

1 The National Response Framework is a guide to how the Nation responds to all types of disasters and
emergencies.



-------
This project supports the Analysis for Coastal Operational Resiliency (AnCOR) program.
AnCOR is a multi-agency program with the purpose of developing and demonstrating
capabilities and strategic guidelines to prepare the U.S. for a wide-area release of a biological
agent, including mitigating impacts to United States Coast Guard (USCG) facilities and assets
[1]. This project evaluated the current state of technologies for conducting site surveys and
managing sampling data following a wide-area incident and correlated supportive technologies
with specific field sampling activities to describe how tools and technologies are applied within
an overall decision framework.

This project aims to streamline capabilities and identify improved data management tools to
better fit the needs of DHS, USCG, and EPA responders following a wide-area contamination
incident. The project does so by evaluating the current state of tools and technologies that
facilitate the acquisition and subsequent management of field sampling data.

This project had four (4) primary objectives:

1.	Conduct a literature review and market research to identify relevant articles, reports,
and other information describing research, ongoing initiatives by regional and state
partners, and available commercial-off-the-shelf products that streamline and modernize
field data collection activities;

2.	Solicit subject-matter expert feedback from the response and research community on
important functionality that field data acquisition and/or data management tools and
technologies should have for responding to a wide-area incident;

3.	Identify and evaluate technology to support response personnel based on
recommendations provided by the response community; and

4.	Conduct a field-scale demonstration to further evaluate operational aspects of selected
technologies for the potential to enhance preparedness.

Due to COVID-19 (coronavirus disease 2019) restrictions postponing field exercises, this project
was divided into two phases. Phase 1 addressed objectives 1 and 2 and part of objective 3 where
candidate tools were identified for further evaluation. From Phase 1 of this project, EPA gained a
better understanding of users' needs and identified available candidate tools to evaluate [2].

Phase 2, the subject of this report, addressed objectives 3 and 4 in which candidate tools were
exercised and evaluated. Specifically, this study evaluated the current state of technologies
through a demonstration event and documented observations and recommendations to enhance
the USCG and EPA's ability to respond to and recover from a CBRN incident. The findings that
resulted are described in the remainder of this report, which is structured in the following
manner:

•	Chapter 2 discusses quality assurance/quality control activities,

•	Chapter 3 provides an overview of the demonstration event,

•	Chapter 4 describes several additional tools that are available to support sampling design
and implementation,

•	Chapter 5 describes the field data collection software that was evaluated and used during
the demonstration,



-------
•	Chapter 6 summarizes the technology that was evaluated, including GPS and mobile
devices,

•	Chapter 7 discusses data management and additional real-time quality control measures
that were evaluated,

•	Chapter 8 presents participant observations and findings that resulted from the
technology demonstration and evaluation, as well as observations related to configuring
technology and software to support the demonstration, and

•	Chapter 9 summarizes the conclusions and recommendations resulting from Phase 2 of
this research.

2	QUALITY ASSURANCE/QUALITY CONTROL

The purpose of this study was to synthesize existing knowledge and research related to data
management applications that could be used following a wide-area CBRN incident. The work
and conclusions presented as part of this study were empirical and observational—no scientific
experiments were performed. Technical area leads evaluated candidate tools and provided
feedback on their experiences.

3	DATA AND TECHNOLOGY ASSESSMENT
DEMONSTRATION OVERVIEW

EPA conducted the AnCOR Data And Technology Assessment (DATA) demonstration (AnCOR
DATA Demo) at EPA's campus in Research Triangle Park (RTP), North Carolina, on September
15th-17th, 2021. The AnCOR DATA Demo evaluated and operationally exercised tools and
supportive technologies for use in supporting sampling activities and data management workflow
and processes following a contamination incident. The study took place outdoors (over a 200-
acre test area) and involved approximately 20 volunteers from EPA and DHS/USCG.

Participants completed a series of controlled tests using a variety of technologies to identify,
locate, and document mock biological surface samples. As shown in Figure 1, 200 sampling
points (point color indicates team association), five control points, and one test point were
established to support exercising technology and software.



-------
Figure 1. Map of sampling, control, and test points throughout the RTP campus.

Below is a list of the events that were completed throughout the two-day period:

•	Welcome, Objectives, and Methods Briefing,

•	Health and Safety Briefing,

•	Technology Overview Presentation,

•	Sampling Method Demonstration (sponge and microvac),

•	Stations/Teams Processing, and

•	Phases 1 through 3 of Field Testing.

Teams exercised different technology and field data collection software (consisting of three
separate iterations) over two days. During each iteration, individuals were paired and assigned to
teams to test different configurations of technology/software. Prior to beginning a sample
collection phase, each team proceeded through a series of stations, as shown in Figure 2, and
returned to report feedback following completion of the sampling activity.

Figure 2. Progression of stations.



-------
Following the completion of an iteration, participants reported their observations and feedback.
Participants evaluated key criteria that were established as high priority during Phase 1 of the
study. During Phase 1, project team members (federal responders, data management subject
matter experts, and researchers) were asked to rank the importance of each criterion to
prioritize essential attributes. The rankings were analyzed to identify the top evaluation
criteria designated as high priority for consideration (see Figure 3) [2].
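One simple way such rankings can be aggregated is by mean rank, where a lower average rank indicates a higher priority. The sketch below is purely illustrative; the criteria and reviewer ranks are invented, and the project team's actual analysis method is not detailed here:

```python
# Illustrative mean-rank aggregation: lower average rank = higher priority.
# Criteria names and the per-reviewer ranks below are made up for demonstration.
from statistics import mean

rankings = {  # each list holds one reviewer's rank for that criterion
    "Web-based System":  [1, 2, 1],
    "Rugged Device":     [2, 1, 3],
    "Long Battery Life": [3, 3, 2],
}

# Sort criteria by their average rank across reviewers.
priority = sorted(rankings, key=lambda c: mean(rankings[c]))
# priority[0] is the criterion with the best (lowest) average rank
```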

1.	Web-based System
2.	Ease of Use In Personal Protective Equipment
3.	Easy to Maintain
4.	GPS Enabled/Capable
5.	Functions with or without Wireless Connectivity
6.	Image Capture Capability
7.	Few Clicks for Data Entry
8.	Ease of Configurability
9.	Includes GIS Capabilities
10.	Capable of Accurately Locating Indoor Sampling Locations (X, Y, Z)
11.	Mobile Compatible
12.	Allows Data Capture of Notes
13.	Ease of Customization
14.	Database Compatible Export Format
15.	Long Battery Life
16.	Rugged Device
17.	Flexible Data Export Formats
18.	Flexible Export Formats to Support Easily Formatting Data for Representational State Transfer (REST) Services
19.	Compatible with Other EPA Systems
20.	Current Availability
21.	Minimal Components with Capability to Expand (USB/Bluetooth)



Figure 3. Priority evaluation criteria.

In addition to general observations and feedback, participants (including data managers) were
asked to observe and report on their experiences using the technologies evaluated during the
demonstration. For the Phase 2 evaluation, priority evaluation criteria were organized into the
following four over-arching categories: Software, Hardware, Technology Configuration, and
Operational Feasibility:

Software

•	Use of touch-sensitive data capture forms while wearing personal protective equipment
(PPE); note: garden gloves were used as a substitute during testing,

•	Toggling online or offline mode,

•	Capturing an image using the camera feature,

•	Capturing a video,

•	Entering text using a finger and/or stylus,

•	Using onscreen maps,

•	Scanning a quick response (QR) code (for tracking samples),

•	Synchronizing data from mobile application to centralized data storage, and

•	Using navigational and GIS features.

Hardware

•	GPS performance,

•	Battery capacity,

•	Capability to expand (USB/Bluetooth), and

•	Ruggedness of device (i.e., performance related to excessive heat, light).



-------
Technology Configuration

In advance of the exercise, AnCOR DATA Demo project team members were responsible for
acquiring devices and configuring field data collection software. Observations related to the ease
of implementing the acquired technologies and field data collection software were also captured
and documented, including:

•	Compatibility with other EPA systems,

•	Database-compatible export formats,

•	Ease of configurability,

•	Ease of customization,

•	Ease of maintenance, and

•	Flexible data export formats.

Operational Feasibility

The AnCOR DATA Demo project team also evaluated the feasibility of several important
considerations related to technologies and data management operations in the field. The
following considerations were evaluated:

•	Span of Control: The number of individuals or resources that one person can effectively
manage during an incident according to the Incident Command System (ICS).

•	Just-in-Time Training: Training personnel only when it is needed rather than in advance
or on a predetermined frequency. These training opportunities are typically used as
refresher courses prior to emergency response teams utilizing procedures or technologies.

•	Offline Operation: Communications might be inoperable following a large-scale
biological incident; therefore, the AnCOR DATA Demo evaluated the use of data
acquisition tools without access to the internet.

•	PPE Limitations: The use of PPE can limit the dexterity of personnel, especially when
using touch-sensitive tablets. Teams were randomly selected (via injects) to wear thick
garden gloves (to mimic PPE), were equipped with a stylus, and were then asked to
interface with the software.

•	Protecting Sensitive Equipment: The decontamination of expensive electronic
equipment for reuse is essential. Previous field studies have successfully demonstrated
that electronic tablets (e.g., Apple iPads) can be successfully decontaminated by
encapsulating them in a water-resistant case and dunking them in a bleach solution [3].
The use of external GPS equipment, however, complicates this process.

Chapters 4 and 5 describe the software tools and technology that were evaluated during the
AnCOR DATA Demo, including several tools that EPA created to support designing and
implementing a sample plan. Chapters 6 through 8 summarize observations from the project
team and demonstration participants.



-------
4 TOOLS FOR SUPPORTING SAMPLING DESIGN AND
IMPLEMENTATION

As discussed, following a wide-area CBRN incident, from initial characterization sampling to
evaluate the contamination event through clearance sampling and waste disposal processes, a
substantial amount of data will need to be collected, checked for quality, and maintained to
support decision-making. Depending on the size and scope of the incident, sampling activities
might require a significant effort. Phase 1 research efforts of this project identified candidate
field data acquisition software tools to further evaluate and exercise during the AnCOR DATA
Demo. In addition to evaluating the candidate software, EPA also maximized this opportunity to
evaluate several other tools and technologies that can support implementing sample designs and
data management needs during an event. An overview of other tools and technologies that can
support this phase of the response is provided below.

4.1	MicroSAP

EPA's Sampling and Analysis Plan Template Tool (MicroSAP) is available to assist planners in
developing a sampling and analysis plan (SAP) to collect data suitable for decision-making
and/or for determining existing conditions across all phases of a contamination incident involving
pathogens in which EPA would be responsible for conducting sampling and analysis [4]. While
the project team did not use MicroSAP for the demonstration, the framework/guidance is a
resource that can be used to support documentation of the sample collection and analysis
procedures or methods to be used, sampling design, quality control procedures, and data
reduction and visualization planned. The tool supports documenting user inputs related to an
incident while associating those inputs with their respective data quality objectives. EPA is also
developing an online tool to support plan creation, and in the meantime a template and user's
guide are available that provide the framework for capturing information needed to complete the
SAP.

4.2	Trade-Off Tool for Sampling (TOTS)

The Trade-Off Tool for Sampling (TOTS) allows users to create sampling designs and estimate
the associated resource demand through interactive point-and-click tools to visually develop
sampling plans for biological contamination incidents [5]. Users can plot sample locations in
conjunction with externally developed indoor or outdoor imagery that can be imported into the
tool. Users can configure and include custom sampling methods, visualize sampling plans, and
share sampling plans. Based on the plans designed, TOTS estimates the total time and cost
necessary for implementation, which includes preparing sampling kits, conducting the sampling
campaign, and analyzing the samples in the laboratory. The resulting sampling plan can be used
to consider trade-offs in the sampling design (i.e., cost-benefit analysis), alternate sampling
approaches (i.e., traditional versus innovative sampling methods), and sampling coverage. TOTS
also operationalizes sampling plan maps by enabling response field personnel to leverage a web-
based map, real-time navigation, and field data capture while sampling in the field. TOTS
outputs include geospatial assets that can be saved and shared for reuse.
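TOTS ships with its own reference values for each sampling method; purely to illustrate the trade-off arithmetic described above, the sketch below tallies total cost and the limiting-factor time for a hypothetical plan. All unit costs, durations, and team counts here are invented for the example and are not the tool's actual figures.

```python
# Illustrative resource tally in the spirit of TOTS. Unit values are
# invented for this sketch only; TOTS uses method-specific references.

def tally(num_samples, prep_min=5, collect_min=15, lab_min=30,
          cost_per_sample=150.0, teams=2, work_min_per_day=480):
    """Estimate total cost and elapsed days for a sampling plan."""
    field_min = num_samples * (prep_min + collect_min)
    lab_min_total = num_samples * lab_min
    # Field work is parallelized across teams; lab analysis is a separate
    # pipeline, so the slower of the two is the limiting factor.
    field_days = field_min / teams / work_min_per_day
    lab_days = lab_min_total / work_min_per_day
    return {
        "total_cost": num_samples * cost_per_sample,
        "max_time_days": round(max(field_days, lab_days), 1),
        "limiting_factor": "Analysis" if lab_days > field_days else "Field",
    }

print(tally(96))
```

With 96 samples and these example values, laboratory analysis (not field collection) becomes the limiting factor, which is the kind of insight the tool's trade-off view is meant to surface.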

TOTS was used by the AnCOR DATA Demo project team to create sampling designs for the
AnCOR DATA Demo and to determine whether the proposed designs could be successfully

executed within the allotted timeline of the demonstration. Figure 4 shows a screenshot of TOTS
with an active random sampling design. Figure 5 illustrates a 20x20 foot grid-based approach
that could be implemented by the AnCOR planning team. Both the random and grid-based
designs were exercised as part of the AnCOR DATA Demo. The sample designs were exported
and used with field data capture software evaluated for the demo.

(The TOTS interface displays a resource tally for the active plan, reporting the total cost,
maximum time in days, and the limiting factor, alongside plan creation and active sampling
layer controls.)
Figure 4. Example TOTS screen illustrating a random sample design.

Figure 5. Example TOTS screen illustrating a 20x20 foot grid-based sample design.

4.3 Simple QUIck REad Label (SQUIREL)

Field data capture applications can leverage QR code technology to minimize data entry
requirements while in the field. Operational personnel consistently emphasize the desire to
record data with as few clicks as possible and leverage technological advances that streamline
data entry and minimize data transposition errors. In support of this need, EPA developed the
Simple QUIck REad Label (SQUIREL) tool [6].

SQUIREL is a lightweight tool that can be used to create sample label designs that are rendered
in a QR code format for use with field data capture applications and/or laboratory systems,
thereby reducing human error when documenting sample identifiers and other
relevant information. Users can generate labels by entering study-specific nomenclature into text
fields. Additional columns can be added to expand the format of each label. Rows can be added
to include multiple label designs within a single instance. Alternatively, QR codes can be
generated by directly importing a comma-separated value (CSV) file. Once the label
nomenclature is established, SQUIREL generates a portable document format (PDF) document
specific to the chosen label sizes. Labels can then be attached to sampling bags and other
containment apparatuses for quick scanning in the field or in the lab. Figure 6 displays the
SQUIREL interface.

The label template supports placeholders at any location in any column, or in its own column:

1.	#num_seq - Adds a sequentially incrementing number starting from 1 (ex. 1, 2, 3...)

2.	#num_rand - Adds a randomly generated number between 1 and 1000, no repeats

3.	#alpha_seq - Adds a letter sequentially increasing from A-Z (ex. A, B, C...)

4.	#alpha_rand - Adds a random letter combination between A and ZZ, no repeats

Credits: Taha Karimi, Timothy Boe, Worth Calfee

Figure 6. Simple QUIck REad Label (SQUIREL) interface.
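The placeholder expansion behavior listed above can be sketched in a few lines. This is an illustrative reimplementation of the described rules, not SQUIREL's actual code; the template string and prefix are made up for the example.

```python
import random
import string

# Sketch of SQUIREL-style placeholder expansion (#num_seq, #num_rand,
# #alpha_seq, #alpha_rand). The real tool's behavior may differ in detail.

def expand(template, n, rng=None):
    """Render n label strings from a template containing placeholders."""
    rng = rng or random.Random()
    rand_nums = rng.sample(range(1, 1001), n)          # 1-1000, no repeats
    # A-Z, then AA..ZZ combinations for the sequential alphabet
    alphabet = list(string.ascii_uppercase)
    alphabet += [a + b for a in string.ascii_uppercase
                 for b in string.ascii_uppercase]
    rand_alpha = rng.sample(alphabet, n)               # no repeats
    labels = []
    for i in range(n):
        labels.append(template
                      .replace("#num_seq", str(i + 1))
                      .replace("#num_rand", str(rand_nums[i]))
                      .replace("#alpha_seq", alphabet[i])
                      .replace("#alpha_rand", rand_alpha[i]))
    return labels

print(expand("AnCOR-#num_seq-#alpha_seq", 3))
# ['AnCOR-1-A', 'AnCOR-2-B', 'AnCOR-3-C']
```

Each resulting string would then be rendered as a QR code and laid out on the chosen label size in the generated PDF.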

5 FIELD DATA ACQUISITION SOFTWARE EVALUATION

Phase 1 research efforts of this project identified software tools and technologies to further
evaluate and exercise during the AnCOR DATA Demo. Emphasis was placed on tools that are
currently available commercial off-the-shelf (COTS) or government off-the-shelf (GOTS) and
appeared to have features that would meet the largest number of users' expressed needs. Tools
recommended in Phase 1 for further evaluation included:

•	Esri's ArcGIS Field Apps Suite - field data capture and mapping capabilities,

•	RadResponder2 (CBRNResponder) - field data capture capabilities and aggregated
reporting,

•	Android Team Awareness Kit (ATAK) - increased situational awareness and offline
communication needs, and

•	EPA Scribe - storing field and laboratory data.

An overview of how the tools were configured and insights gained from exercising them during
the AnCOR DATA Demo is discussed in the sections that follow. The project team utilized the
outputs from both TOTS and SQUIREL (discussed in Chapter 4) in conjunction with the field
data acquisition software.

2 While RadResponder was specifically identified during Phase 1, the Federal Emergency Management Agency
(FEMA) sponsors a suite of "responder apps" collectively referred to as CBRNResponder that includes access to
both RadResponder and ChemResponder (and soon, BioResponder). The demonstration evaluated
CBRNResponder.

5.1 Esri's ArcGIS Field Apps Suite

Esri provides a large suite of tools to support geospatially-driven tasks and analyses. EPA
provides an enterprise-level offering for Esri's suite of tools. AnCOR DATA Demo participants
only evaluated Esri's ArcGIS Field Maps tool during the exercise, whereas the project team
utilized Esri routing tools for planning and the ArcGIS Dashboard and Tracking tools during the
demonstration.

5.1.1 ArcGIS Pro Linear Referencing (Create Routes)

The routing of teams for sampling or decontamination purposes is a significant challenge. Teams
could encounter hazardous environments and/or spread contamination to otherwise clean areas.
Because many of the tools evaluated in support of the AnCOR DATA Demo are GIS-based, the
project used advanced geospatial capabilities [7] to determine optimal paths according to time
and distance. Sample designs were transitioned directly from TOTS into ArcGIS Field Maps
(TOTS automatically publishes sample designs to an ArcGIS Field Map web map with attributes
to customize a data capture template).

Using ArcGIS Pro's Linear Referencing (Create Routes) feature, samples were automatically
grouped according to proximity, and paths were drawn specific to start and end locations. The
results of this analysis were used to determine sample sequence (i.e., the sequence in which
samples are collected). The resulting pathways could then be made available to other geospatial
tools to support sampling team navigation. Figure 7 shows the resulting pathways teams (point
color indicates team association) used to navigate to samples.
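The Create Routes analysis itself runs inside ArcGIS Pro; the sketch below only illustrates the underlying idea of determining a collection sequence by ordering a team's samples with a greedy nearest-neighbor pass. The coordinates, the start point, and the flat-earth distance approximation are all illustrative.

```python
import math

# Greedy nearest-neighbor sequencing of sample points, a simplified stand-in
# for the proximity grouping performed with ArcGIS Pro's routing tools.

def dist_m(a, b):
    """Approximate ground distance in meters between (lat, lon) points."""
    lat = math.radians((a[0] + b[0]) / 2)
    dx = (b[1] - a[1]) * 111_320 * math.cos(lat)
    dy = (b[0] - a[0]) * 110_540
    return math.hypot(dx, dy)

def sequence(start, samples):
    """Return sample IDs in greedy nearest-neighbor collection order."""
    remaining = dict(samples)          # id -> (lat, lon)
    here, order = start, []
    while remaining:
        sid = min(remaining, key=lambda s: dist_m(here, remaining[s]))
        order.append(sid)
        here = remaining.pop(sid)
    return order

samples = {"S1": (35.8834, -78.8706), "S2": (35.8822, -78.8702),
           "S3": (35.8828, -78.8703)}
print(sequence((35.8836, -78.8707), samples))  # ['S1', 'S3', 'S2']
```

A production routing tool also accounts for obstacles and pathways; this sketch shows only the sequencing concept.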


Figure 7. Assigned team sampling routes.

5.1.2 ArcGIS Field Maps

To support teams navigating to sampling locations, recording and documenting field data, and
managing team assignments, a customized template for ArcGIS Field Maps was developed.
ArcGIS Field Maps is an all-in-one application that uses data-driven maps to help mobile
workers perform data collection and editing, find assets and information, and report their real-
time locations. Field Maps supports both iOS and Android mobile operating systems and can
operate in offline mode (for saving data locally) [8].

Since Field Maps is based on a full-scale GIS platform, custom aerial imagery and feature layers
(e.g., sampling points, paths, and boundaries) can be used to support situational awareness. Field
Maps includes an integrated navigation capability that indicates the distance and bearing to the
assigned sample. The tool visually prompts the user once they have reached the designated
sampling location and documentation can begin. The tool further includes image and video
capturing capabilities, a built-in QR code scanner, and conditional inputs for documenting
sample types, sampling start/finish times, and observations.

The AnCOR DATA Demo project team created tailored data capture templates for use in the
demonstration. Figure 8 illustrates the steps that are required to document a sample using the
AnCOR DATA Field Maps data capture form:

1)	Select Sample (far left): Samples are shown in sequential order according to the most
optimal path. The user selects the appropriate sample to initiate navigation.

2)	Navigate (left center): The user's current location is displayed with reference to the
sample location. The distance and bearing to the assigned sample are shown at the bottom
of the screen.

3)	Scan Sample Bag (right center): The user scans the QR code attached to the equipped
sample bag. The sample ID is then appended to the sample location ID. The tool then
provides special instructions regarding the location or sampling procedures.

4)	Document/Record (far right): Once the user scans the sampling bag, the remaining input
fields are unlocked. The form automatically collects start/end times, information on the
surface sampled, and observations.

The form was designed to be completed within 60 seconds and is stylus friendly. Once
completed, the form is either saved to the device locally when operating in offline mode or saved
to ArcGIS Online (the Esri Cloud) when operating in online mode.
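A record assembled by steps 3 and 4 might look like the following sketch. The field names and record structure are hypothetical illustrations of the workflow, not the actual Esri form schema used in the demonstration.

```python
from datetime import datetime, timezone

# Hypothetical sketch of the record the Field Maps template assembles once
# a sample bag's QR code is scanned at a sampling location.

def build_record(location_id, scanned_qr, surface, observations=""):
    """Append the scanned bag ID to the location ID and stamp the time."""
    return {
        "sample_id": f"{location_id}-{scanned_qr}",  # location + bag ID
        "surface": surface,
        "observations": observations,
        "start_time": datetime.now(timezone.utc).isoformat(),
    }

rec = build_record("Team1-Sample1", "SPONGE-186", "concrete")
print(rec["sample_id"])  # Team1-Sample1-SPONGE-186
```

Appending the scanned bag ID to the location ID in a single step is what removes the transcription that field crews would otherwise do by hand.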

Figure 8. AnCOR DATA Demo Field Maps field data capture form.

5.1.3 ArcGIS Dashboards

A customized ArcGIS dashboard was developed to show the status of sample collection
activities in real-time. The dashboard is composed of three components:

1)	Map (center): displays the location and status of each sample (according to team and
sequence). The color of the sample changes according to the surface type sampled. The
status of samples is updated instantaneously (when operating in online mode).

2)	Status (top right): the status counter displays the total number of samples plotted versus
the total samples remaining to be sampled.

3)	Bag Scans (bottom right): displays the sampling bag ID and sample location ID as they
are scanned. This view can be used to detect erroneously scanned sample bags or
locations.

A screenshot showing the AnCOR DATA Demo dashboard is shown in Figure 9.

Figure 9. AnCOR DATA Demo ArcGIS dashboard.

The dashboard can also be used to indicate the location where the sample was taken compared to
its intended location. Figure 10 shows a few instances in which the sample diverged from its
intended location (the white dot indicates the intended sample location, and the colored dots
indicate where the sampling team was located when they recorded the sample). This information
is also shown in real-time and could be used to identify samples taken in the wrong location.
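The divergence check that the dashboard supports visually can be sketched as a simple distance test: flag any recorded sample farther than a tolerance from its planned point. The 5-meter tolerance and the coordinates below are arbitrary example values, not demo parameters.

```python
import math

# Flag samples recorded farther than tol_m from their planned locations.

def haversine_m(a, b):
    """Great-circle distance in meters between (lat, lon) pairs."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2)
         * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * math.asin(math.sqrt(h))

def misplaced(planned, recorded, tol_m=5.0):
    """Return IDs of samples recorded more than tol_m from their plan."""
    return [sid for sid, p in planned.items()
            if haversine_m(p, recorded[sid]) > tol_m]

planned = {"S1": (35.88338, -78.87062), "S2": (35.88360, -78.87083)}
recorded = {"S1": (35.88338, -78.87062), "S2": (35.88375, -78.87083)}
print(misplaced(planned, recorded))  # ['S2']
```

Automating this check would let a data manager triage mislocated samples without visually comparing dots on the map.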

Figure 10. Example of misplaced samples.

5.1.4 ArcGIS Tracker

In addition to optimizing navigation, the AnCOR DATA Demo also evaluated Esri's ArcGIS
Tracker solution for tracking teams in real-time. Teams that were assigned cellular-activated
tablets were tracked throughout the duration of their sampling activities. The functionality can be

combined with the dashboard to improve situational awareness, health and safety, and the potential
for plotting new samples in-situ depending on a given team's location. Figure 11 shows a
screenshot of a team's completed route (the top of the figure indicates true north). The teal-
colored line indicates the precise path they traveled. The circle containing the letters "TB"
indicates their current location. This information can be relayed back to the command post to
provide situational awareness and mission status updates.


Figure 11. Example tracking route.

5.2 CBRNResponder

While the AnCOR program was designed to address a biological agent contamination incident,
Phase 1 research identified RadResponder (a collaborative tool for responding to radiological or
nuclear emergencies) as a candidate tool to evaluate, along with a new tool, BioResponder, to
assess potential applicability/expansion to support a biological event. The BioResponder tool
was under development during the project study period but is expected to aid with the collection
of biological samples and laboratory analyses [9].

Following the completion of Phase 1 research, a single platform for accessing all CBRN event
types was launched—CBRNResponder. CBRNResponder is a free application for emergency
response organizations that is sponsored by the Federal Emergency Management Agency
(FEMA) and other federal partners [10]. As of September 2021, the full scope of data fields was
unknown, and the applicability to the biological sampling events in this context (versus an
epidemiological context) could not be assessed at that time. Nonetheless, the project evaluated
the overall CBRNResponder framework that was available to determine potential applicability to
support the AnCOR program. Similar to the Esri Field Apps suite, AnCOR DATA Demo
participants only evaluated the CBRNResponder Field Data Collection application during the
exercise, whereas the project team utilized the CBRNResponder website to access Event
Management and Dashboards for planning and oversight during the demonstration.

5.2.1 Event Management and Configuration

The AnCOR DATA Demo project team established an account with the CBRNResponder
support team. A test event, DATA Day Test, was created to facilitate exercising the application.
The project team evaluated the different data types that are included to support field data
collection. For the demonstration, the team focused on the "Sample" and "Observation" data
types. Figure 12 below illustrates many of the data types available for selection.

Prescribed data fields that are tied to specific data structures present some challenges given the
need for flexibility to nimbly respond to changing data needs. Depending on the phase of an
event, event type, and primary sampling objectives, what needs to be collected/characterized
could change, and there does not appear to be a way to easily tailor sampling methods (only an
"other" sample type can be created for an organization). Therefore, for the AnCOR DATA
Demo the project team selected representative data types that most closely represented what data
needed to be collected. Participants were instructed to ignore other irrelevant data fields and to
only focus on a subset of fields to collect, including:

•	Sample Type (Swipe),

•	ID/Barcode (Scan sample bag QR Code),

•	Surface Area/Units,

•	Comment, and

•	Photograph (via Attachment).

A sample type of "swipe" (e.g., wipe sample) was selected for demonstration purposes. The
types of sampling methods that will likely be needed for the AnCOR program (i.e., swab,
microvac, aggressive air, or other innovative methods) do not currently align with the methods
that are available for selection within the CBRNResponder application. The ability for users to
add user-defined sample types or other "flexible" user-defined fields on an ad hoc basis is
needed to meet EPA's needs.

Once the field survey form to support data collection was defined, the project team defined the
sampling locations for the event. A "Facility" was established to permit an association with
sampling locations.

Figure 12. CBRNResponder data fields.

The EPA RTP campus was used as a surrogate facility. Sampling locations
can be entered one at a time (search/click on a map, enter latitude/longitude, or enter address) or
pre-existing sample locations can be bulk uploaded via an Excel-based format with predefined
coordinates using a provided template. While CBRNResponder allows shapefiles to be uploaded
for reference during an event, there is not currently a feature that would allow a user to
import/integrate geospatially-referenced sample locations from a map (e.g., using TOTS output
with sample locations mapped). An additional data transformation would be required to generate
and convert a shapefile to a CSV file to facilitate upload within CBRNResponder.

For the demonstration exercise, going through this process (or manually transposing the
coordinates) was not an issue. However, for a wide-area event that might require taking hundreds
or thousands of sample points, the ability to rely on a single data source of geospatially-
referenced sampling locations on a map is important—both to minimize extra processing steps
and to avoid data transformation errors. For the AnCOR DATA Demo, sample coordinates from
TOTS output, along with sampling instructions, were manually entered into the provided
template, and the template entries were bulk uploaded to the event. Figure 13 presents an
example of a completed template.
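The extra transformation step described above can be sketched as follows, assuming the sample points have already been read out of the TOTS export with a GIS library. The tuple structure and column handling here are illustrative; only the column names come from the upload template.

```python
import csv
import io

# Flatten geospatially referenced sample points into the CBRNResponder
# bulk-upload template columns. Address columns are left blank because a
# latitude/longitude is supplied for each location.

TEMPLATE_COLS = ["Name", "Latitude", "Longitude",
                 "Street Address", "City", "State", "Zip", "Description"]

def to_template_csv(points):
    """points: list of (name, lat, lon, instructions) tuples."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=TEMPLATE_COLS)
    writer.writeheader()
    for name, lat, lon, note in points:
        writer.writerow({"Name": name, "Latitude": lat,
                         "Longitude": lon, "Description": note})
    return buf.getvalue()

csv_text = to_template_csv([
    ("Sample 1", 35.88338, -78.870621,
     "Sample surface using the provided sampling bag"),
])
print(csv_text)
```

Scripting the conversion once, rather than transposing coordinates by hand, is exactly the safeguard against transformation errors that a wide-area event would require.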

Sampling Location Import
This upload is for adding sampling locations to selected facility. **A location is required, enter a latitude/longitude OR address**

Name       Latitude   Longitude   Description
Sample 1   35.88338   -78.870621  Sample surface using the provided sampling bag; data person should video the process
Sample 2   35.883598  -78.870829  Locate and take picture of a spray painted rock
Sample 3   35.883251  -78.870548  Switch places: hand the device over to your teammate. You may switch back after completing the next point
Sample 4   35.883062  -78.870487  Sample surface using the provided sampling bag; data person should video the process
Sample 5   35.882878  -78.870344  Locate and take picture of a spray painted rock
Sample 6   35.882193  -78.870162  Sample surface using the provided sampling bag; data person should video the process
Sample 7   35.882373  -78.870157  Sample surface using the provided sampling bag; data person should video the process
Sample 8   35.882527  -78.870198  Locate and take picture of a spray painted rock
Sample 9   35.882702  -78.870273  Locate/scan laminated QR code
Sample 10  35.882009  -78.869945  Locate and take picture of a spray painted rock

(The Street Address, City, State, and Zip columns of the template were left blank.)

Figure 13. CBRNResponder sampling location import template.

Once sample locations were established, the project team then created sampling team
assignments to facilitate completing sampling and data collection tasks. Using the
CBRNResponder website, assignments could be established one at a time or bulk uploaded by
completing a provided template where both individual and team assignments could be made. For
the AnCOR DATA Demo a single team was established, and assignments were associated with
specific teams and automatically associated with predefined sample locations (latitude/longitude)
and instructions. Figure 14 shows a completed sample bulk assignment template.

Import Allowed Values

Field Team (on event)   Facility   Sampling Location
Team Shed Row           EPA        Sample 1
                                   Sample 2
                                   Sample 3
                                   Sample 4
                                   Sample 5
                                   Sample 6
                                   Sample 7
                                   Sample 8
                                   Sample 9
                                   Sample 10

Figure 14. CBRNResponder assignment import template.

5.2.2 Field Data Collection

Several participants used the CBRNResponder application to assess its potential for wide-area
incident response. Figure 15 and Figure 16 present the series of steps participants completed to
simulate data collection activities (orange boxes illustrate user interface click-events):

1)	Select Event (Figure 15, top middle): Events listed for selection.

2)	Submit Data (Figure 15, top right): Initiate sample data collection.

3)	Choose Assignment (Figure 15, bottom middle): Review instructions and select
assignment.

4)	Review Assignment and Navigate to Sample Location (Figure 15, bottom right):
Review details and select one of several actions.

Figure 15. CBRNResponder App screens for field data capture - Part 1.

5)	Navigate: Opens the user's device mapping program to provide navigational support.
This requires leaving the CBRNResponder application to toggle between the mapping
application and the field data collection application. CBRNResponder's navigation
capabilities are limited to vehicle navigation using standard mobile routing platforms
(Google/Apple Maps).

6)	Initiate Data Collection (Figure 16, middle): Users click to record data and specify the
data type to record. As previously noted, sample and observation data types were used for
the demonstration. The sample detail screen is made available for data entry.

7)	Record Data (Figure 16, right): The user can select a sample type and scan the QR code
attached to the equipped sample bag to populate the ID/Barcode. Required fields are
denoted by an asterisk and photographs can be added and linked to the sample using the

Attachments feature. The tool automatically captures basic metadata (user, date/time, and
location details). Once data entry is complete, the user can click the save button (shown
as a disc icon) in the top right.

Figure 16. CBRNResponder App screens for field data capture - Part 2.

Once data collection is complete, users submit data to an online data repository. Depending on
whether the device is operating in on- or offline mode, data will synchronize once a connection
is available. Data can then be viewed by event managers, as well as users through the application
(Figure 15, top right - Collected Data). Data can be exported from the platform; however,
individual data types must be separately downloaded.

5.2.3 Event Dashboard and Map

A preconfigured event dashboard is available to support event management. During the AnCOR
DATA Demo, data managers could track field personnel using the map, and predefined metrics
could be arranged on an event dashboard to monitor a variety of measures (e.g., data/type
collected, field teams, assignments, organizational partners). Figure 17 presents an example
event map. Available features facilitate interacting with the map and tailoring views depending
on the types of questions needing answers. For example, specific responders could be isolated to
track status, sample status could be accessed, assignment information could be viewed, and
additional GIS files could be displayed. Note, however, that GIS files must be uploaded to the
platform and are not directly integrated with any ArcGIS online capabilities. Data captured

within the platform can be exported into different formats, including keyhole markup language
(KML), shapefile, and CSV.

Figure 17. CBRNResponder App event map for monitoring activities.

5.2.4 Reports

CBRNResponder offers a variety of preconfigured reports that can be generated [11]. The
AnCOR DATA Demo did not fully evaluate the range of offerings; however, two features that
might be of interest include the ability to create barcodes and to generate chain of custody forms.
Barcodes can be generated for sample labels in advance of a sampling event and scanned using a
device's camera. Figure 18 below illustrates a screen for supporting the creation of sample
labels.

(Labels are generated as a PDF formatted for an Avery 5160 template, 30 labels per sheet, with
each label rendered as a PDF417 barcode.)

Figure 18. CBRNResponder App supports generating sample labels.

EPA also expressed the need for a feature to create chain of custody forms and electronic data
deliverables to convey the information required for laboratory analyses. CBRNResponder can

support creating Sample Control Forms (SCF) (chain of custody) and Analytical Request Forms;
however, the ability to customize and/or tailor elements that are included on the form is currently
unavailable. For SCFs, users can select existing samples and general information will be
prepopulated, including the barcode. Figure 19 illustrates an excerpt of a system-generated form.
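The prepopulation step can be sketched as follows. The field names mirror the general information section of the generated form, but the record structure and the input dictionary are hypothetical illustrations, not the platform's actual data model.

```python
from dataclasses import dataclass

# Hypothetical sketch of prepopulating a Sample Control Form (chain of
# custody) record from an existing sample, including its barcode.

@dataclass
class SampleControlForm:
    barcode: str
    sample_type: str
    collected_by: str
    collected_date: str
    field_team: str
    sample_status: str = "Collected"

def prepopulate(sample):
    """Build an SCF record from an existing sample's general information."""
    return SampleControlForm(
        barcode=sample["barcode"],
        sample_type=sample["type"],
        collected_by=sample["collector"],
        collected_date=sample["date"],
        field_team=sample["team"],
    )

scf = prepopulate({"barcode": "SCN-04D-000001", "type": "Swipe",
                   "collector": "Boe, Timothy", "date": "09/16/2021 09:05",
                   "team": "Team Shed Row"})
print(scf.sample_status)  # Collected
```

Customizing which elements appear on the rendered form, the capability EPA identified as missing, would amount to making this field list user-configurable.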

Figure 19. CBRNResponder chain of custody form example.

5.3 Android Team Awareness Kit (ATAK)

Android Team Awareness Kit (ATAK) is a tool that was developed by the DHS Science and
Technology Directorate (S&T) and has been adopted by multiple agencies, including the
Department of Defense (DoD). ATAK allows the user to submit and receive real-time spatial
awareness information and communicate between responders across multiple agencies [12]. The
project team had initially planned to evaluate ATAK as part of the AnCOR DATA Demo;
however, the team experienced several impediments that prevented its full implementation,
including lack of access to a detailed user guide describing how to use the tool.

ATAK lacked a centralized dashboard for designing and implementing activities. Points of
interest (i.e., samples) were shared directly with test devices using email to send .shp/.kmz files
to each individual device. A basic viewer within ATAK was established for locating and tracking
samples (Figure 20). Additional limitations involved viewing and documenting data associated
with a given point (i.e., participants could not easily document information as demonstrated in
the Field Maps and CBRNResponder tools). Lastly, at the time of publishing this report, ATAK
was limited to Android devices (an iOS application was under development and available for
testing, but lacked critical features supported by the Android version). ATAK did feature
functionality that was not available in Field Maps and CBRNResponder, including direct
messaging, team proximity information, line of sight, and mobile ad-hoc networking.

Figure 20. Screenshot of ATAK viewer showing samples and route with aerial image overlay.

Because essential features necessary to investigate this tool could not be fully implemented
within a period of time comparable to that of Field Maps and CBRNResponder, the AnCOR
DATA Demo project team decided to forgo testing ATAK at the AnCOR DATA Demo. This
tool should be fully investigated in the future once additional functionality and guidance are
made publicly available for supporting biological sampling.

5.4 EPA Scribe

Scribe is a desktop-based software tool developed by EPA's Environmental Response Team
(ERT) that is available to EPA personnel. The tool supports storing and managing sampling,
observational, and monitoring field data. Scribe allows users to produce outputs for collected
samples for analytical data reports. Scribe can import a variety of data, and scripts can be saved
to manage import mappings [13]. The Scribe tool is routinely used by ERT; however, the
AnCOR DATA Demo did not include a full data workflow that integrated analytical data
communication to/from laboratories. Therefore, Scribe was not exercised during the
demonstration. Data collected as part of exercising field data collection tools and software were
instead stored on EPA's GeoPlatform for the AnCOR DATA Demo.

6 TECHNOLOGY EVALUATION

The AnCOR DATA Demo also evaluated GPS and mobile devices to document any notable
differences among devices that were used to support field sampling data collection activities.

6.1 Global Positioning System (GPS) Devices

Sub-meter GPS receivers provide positional accuracy within 1 meter (hence the name) and are
used to determine the user's position, locate objects/areas of interest, and support navigation.
The accuracy of these units is largely affected by end-user experience, atmospheric effects, and
multipath effects (i.e., the GPS signal is reflected or diffracted by
local objects). For a biological incident, sub-meter GPS receivers may be used in combination
with data acquisition systems to aid navigation when challenged with sampling designs that
reference predetermined sample locations.

Several commercial sub-meter GPS receivers are available. A select group of participants in the
AnCOR DATA Demo evaluated the following sub-meter GPS receivers: 1) Arrow Series GPS,
2) SXblue, and 3) Geode. The demo also evaluated the built-in GPS chip found in Apple's Wi-Fi
+ Cellular iPad models (iPad Air 2). A survey-grade receiver was used to capture control points
in five separate locations (Table 1).

Table 1. GPS Test Locations

Location ID   Description                                      Estimated Signal Blockage
A             Adjacent to forest                               50%
B             Parking lot                                      0%
C             Close proximity (10 feet) to 50-foot structure   50%
D             Field with trees                                 25%
E             Dense forest                                     100%

A multi-station equipped with a laser-based precise long-range scanning capability (Leica Nova
MS50) was used to capture precise locations in areas that might introduce multipath interference
(e.g., tree canopy coverage, buildings). A picture of the multi-station and the associated prism is
shown in Figure 21. Using the parking lot location (i.e., open sky) as a reference point, the multi-
station was used to shoot the four other locations. This approach provided a horizontal accuracy
of < ¼ in.

Figure 21. Leica Nova MS50 multi-station and associated prism marker.

The accuracy of these measurements was compared against a known National Geodetic Survey
benchmark. Figure 22 shows a picture of the horizontal control used to support this evaluation.

Figure 22. National Geodetic Survey benchmark.

Once the control points and horizontal accuracy were determined, personnel equipped with the
prescribed GPS devices (Arrow Series GPS, SXblue, Geode, iPad Air 2) navigated to each test
location (control points were marked on the ground using orange tape). At each test location,
participants stood directly over the orange marker, waited 15 seconds, and captured five
successive points (using a custom ArcGIS Field Maps template) every 10-15 seconds. The
resulting points were then plotted in a GIS application to determine relative distance from the
control point for each test location. Figures 23-27 show a scatter plot for each test location
(A-E), respectively. The control (red point) is shown in the center of the circle, and the
surrounding buffers represent the distance from the centroid of the circle (i.e., 2D horizontal
error). Overall, the Geode consistently outperformed the SXblue, Arrow, and iPad Air 2 (<3 ft of
accuracy for this test condition). Table 2 shows the average horizontal error for each device and
location.
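The relative-distance computation described above (each captured point compared against its
surveyed control point) can be sketched in a few lines. The coordinates and the equirectangular
approximation below are illustrative, not the demo's actual workflow, but the approximation is
more than adequate over offsets of a few feet.

```python
import math

EARTH_RADIUS_FT = 20_902_231  # mean Earth radius (6,371 km) in feet

def horizontal_error_ft(control, point):
    """Approximate 2D horizontal distance (ft) between two WGS84
    lat/lon pairs using an equirectangular projection; sufficient
    over the few-foot offsets measured here."""
    lat1, lon1 = map(math.radians, control)
    lat2, lon2 = map(math.radians, point)
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return math.hypot(x, y) * EARTH_RADIUS_FT

# Illustrative control point and one captured fix (not actual demo data).
control = (35.882124, -78.870180)
fix = (35.882128, -78.870176)
print(round(horizontal_error_ft(control, fix), 2))  # about 1.88 ft
```

Plotting each returned distance against its control point reproduces the buffer-ring view used in
Figures 23-27.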

Figure 23. Horizontal error for test location A.

Figure 24. Horizontal error for test location B.


Figure 25. Horizontal error for test location C.


Figure 26. Horizontal error for test location D.


Figure 27. Horizontal error for test location E.

Table 2. Average Horizontal Error (ft) by Location and Unit

GPS Unit     Adjacent to   Open Sky   Near Tall       Field with   Dense        Average   STD
             Forest (A)    (B)        Structure (C)   Trees (D)    Forest (E)
Geode        1.35          0.34       1.56            1.60         1.59         1.30      0.57
Arrow        1.07          0.91       10.27           5.43         2.05         3.95      4.07
iPad Air 2   11.78         5.29       8.26            4.14         8.41         7.58      2.74
SXblue       1.84          2.43       15.43           2.82         6.95         5.93      5.33

STD - standard deviation

Overall, the Geode outperformed the other devices at all five locations, with the lowest average
horizontal error (1.30 ft) and STD (0.57 ft). It should be noted that horizontal error is also a
function of satellite geometry, signal blockage, and atmospheric conditions, which can vary by
location and time of day/year. These conditions were not evaluated as part of this study. A
control point (horizontal control) should be taken to assess conditions and accuracy prior to
using a sub-meter GPS.
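As a cross-check, the summary statistics in the last two columns of Table 2 can be recomputed
from the per-location values using Python's statistics module. The averages reproduce the table
to two decimals; the STD values differ slightly, presumably because the report's STDs were
computed from the underlying per-point measurements rather than the five location averages.

```python
from statistics import mean, stdev

# Per-location horizontal error (ft) for each device, from Table 2.
errors = {
    "Geode":      [1.35, 0.34, 1.56, 1.60, 1.59],
    "Arrow":      [1.07, 0.91, 10.27, 5.43, 2.05],
    "iPad Air 2": [11.78, 5.29, 8.26, 4.14, 8.41],
    "SXblue":     [1.84, 2.43, 15.43, 2.82, 6.95],
}

for unit, vals in errors.items():
    # stdev() is the sample standard deviation (n - 1 denominator)
    print(f"{unit}: avg = {mean(vals):.2f} ft, std = {stdev(vals):.2f} ft")
```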

6.2 Mobile Devices

The AnCOR DATA Demo evaluated multiple electronic devices capable of documenting
samples, uploading data, and connecting to sub-meter GPS units. The devices evaluated as part
of this study included:

1) Samsung Galaxy Tab S7 tablet (Android 11),

2) Apple 7.9-Inch iPad mini (5th Generation) with Wi-Fi (iOS 10.3.4),

3) Apple 10.2-Inch iPad Air 2 with Wi-Fi + Cell (iOS 10.3.4), and

4) Apple iPhone XR with Wi-Fi + Cell (iOS 10.3.4).

These electronic devices were evaluated based on general performance and user feedback.
Participant feedback is discussed in Chapter 8. Generally, key issues centered on:

•	Battery life,

•	Screen brightness,

•	Bluetooth connectivity, and

•	Proper configuration (e.g., cellular data access, camera access, timeout settings).

7 DATA MANAGEMENT AND REAL-TIME QUALITY
CONTROL MEASURES

Phase 1 research identified important data management needs, including:

•	Addressing secure, bulk data uploads (e.g., once an internet connection is re-established),

•	Storing and processing large quantities of data, and

•	Analyzing results in a collaborative platform.

The Phase 2 AnCOR DATA Demo exercised field data capture software capable of capturing
data while in the field and submitting and synchronizing data to a consolidated online platform.

7.1 ArcGIS Field Maps Data Storage

Esri's ArcGIS Online (EPA's GeoPlatform) was used to store data collected using the Field
Maps application. Data that were uploaded through Field Maps were automatically formatted
into the appropriate schema and text entries were limited. This approach reduces the risk of
generating inoperable data that break the schema specified by the data manager. Furthermore,
ArcGIS Online supports adding imagery and video and associates supporting media files with
sample data linked to specific geographic locations. An example data view is shown in Figure
28.
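The schema-enforcement approach described above can be sketched as a simple record validator.
The field names and allowed values below are hypothetical, not the actual GeoPlatform schema;
the point is that constraining each field to a controlled vocabulary or a length-limited text
entry keeps inoperable data from reaching the shared layer.

```python
# Hypothetical schema: each field maps to either a set of allowed values
# (drop-down style) or a maximum free-text length. These names are
# illustrative only.
SCHEMA = {
    "SampleMethod": {"Sponge", "Swab", "Microvac"},
    "Matrix": {"Vegetation", "Soil", "Concrete"},
    "Notes": 255,  # length-limited free text
}

def validate(record):
    """Return a list of schema violations for one uploaded record."""
    problems = []
    for field, rule in SCHEMA.items():
        value = record.get(field)
        if value is None:
            problems.append(f"{field}: missing")
        elif isinstance(rule, set) and value not in rule:
            problems.append(f"{field}: '{value}' not an allowed value")
        elif isinstance(rule, int) and len(value) > rule:
            problems.append(f"{field}: free text exceeds {rule} characters")
    return problems

# One violation: the Matrix value is not in the controlled list.
print(validate({"SampleMethod": "Sponge", "Matrix": "Asphalt", "Notes": "ok"}))
```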


Figure 28. Table showing a sample of ArcGIS Field Maps data collection results.

7.2 CBRNResponder Data Storage

CBRNResponder data are stored in a Microsoft Azure Cloud environment managed by vetted
site administrators from the vendor organization that manages CBRNResponder for FEMA. Data
uploaded are owned solely by the collecting organization and are only visible to other
organizations if an organizational administrator provides explicit access to the data [14].

Uploaded data are formatted according to specific schema defined for various data types. Use of
drop-down lists and limited free-form text entry fields is common among the different data types.
Image files and documents can be uploaded and associated with a data point using the
Attachments feature. Figure 29 illustrates a sample table of CBRNResponder data collection
results. Columns to display can be tailored and each data record can be expanded to view the full
record, including any attachments associated with the data point.


Figure 29. Table showing a sample of CBRNResponder data collection results.

7.3 Real-Time Quality Control Measures

Quality control (QC) is an integral part of data collection for any event. For emergency response
and recovery activities, it is essential that field data capture forms are designed to minimize data
entry errors and that sampling activities are monitored in real time so that erroneous entries can
be identified and corrected during a sampling episode, if possible. Examples of real-time
corrections could include:

•	Reporting an incorrect sample method based on sample bag QR code (swab versus
microvac),

•	Reporting an incorrect sample matrix, or

•	Capturing the sample at a distance outside of an established threshold designated by the
sample design.
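The real-time corrections listed above can be automated as simple checks run as records arrive in
the cloud. The sketch below assumes hypothetical field names and a QR-code convention in which
the bag ID prefix encodes the sampling method; none of these reflect the demo's actual data model.

```python
import math

EARTH_RADIUS_FT = 20_902_231  # mean Earth radius in feet
MAX_OFFSET_FT = 3.0           # hypothetical threshold from the sampling design

# Assumed convention: the bag ID prefix encodes the sampling method.
METHOD_BY_PREFIX = {"Sponge": "sponge-stick", "Swab": "swab", "Vac": "microvac"}

def distance_ft(a, b):
    """Equirectangular approximation of the distance (ft) between two
    lat/lon pairs; sufficient over a few feet."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    return math.hypot(x, lat2 - lat1) * EARTH_RADIUS_FT

def qc_flags(record, planned):
    """Return real-time QC flags for one record (field names illustrative)."""
    flags = []
    prefix = record["bag_id"].split("-")[0]
    if METHOD_BY_PREFIX.get(prefix) != record["method"]:
        flags.append("method does not match bag QR code")
    if record["matrix"] != planned["matrix"]:
        flags.append("unexpected sample matrix")
    if distance_ft(record["location"], planned["location"]) > MAX_OFFSET_FT:
        flags.append("capture point beyond design threshold")
    return flags

record = {"bag_id": "Sponge-106", "method": "sponge-stick",
          "matrix": "Vegetation", "location": (35.882124, -78.870180)}
planned = {"matrix": "Vegetation", "location": (35.882124, -78.870180)}
print(qc_flags(record, planned))  # clean record: no flags
```

A reviewer monitoring the dashboard could then contact the field team about any flagged record
while sampling is still underway.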

7.3.1 ArcGIS Field Maps QC

The AnCOR DATA Demo evaluated data in a two-step process: 1) Error Checking: the Field
Maps form was designed to prevent users from entering erroneous data by confirming that input
fields contained expected characters or limited free-text entries. If an input field contained an
error or was left blank, the tool would notify the user; and 2) Remote Review: because the
AnCOR DATA Demo featured real-time collection of data, the dashboard and associated data
layers presented a unique opportunity to review data while sampling was underway. During the
demo, two remotely located EPA personnel were asked to review data as they were uploaded to
the cloud. Reviewers could monitor either the dashboard or the raw data (shown in Figure 9 and
Figure 28, respectively).

7.3.2 CBRNResponder QC

While data captured using CBRNResponder were not explicitly reviewed by a dedicated team
during the collection event, the application's dashboard and data collection views would enable
real-time data review (see Figure 17 and Figure 29). CBRNResponder also makes use of
controlled data entry elements; however, there appears to be less flexibility to customize data
validation rules, as users are bound by what the application offers out of the box.

A Chat function, not exercised during the demo but advertised as forthcoming, would
presumably be very useful for quickly communicating with a responder about issues that need
correction or resolution while in the field.

8 DATA DAY DEMONSTRATION PARTICIPANT
OBSERVATIONS AND FINDINGS

This chapter summarizes observations and feedback provided by the AnCOR DATA Demo
participants, organized first by general observations and then by specific topics.

8.1 General Observations and Participant Feedback

Participant feedback was collected through several different avenues including:

•	Debrief sessions (after Day 1),

•	Field observations (observer who accompanied different teams),

•	Verbal communication with participants, and

•	Populated via an online feedback form.

The observations and feedback findings are organized by the following topics: 1) Demo
implementation; 2) Operational feasibility; and 3) Technology and software evaluation (GPS,
mobile devices, and software).

8.1.1 Demo Implementation

Overall, participants felt that the demonstration was well organized and implemented. The
following are the major findings:

•	Optimize routes to ensure that distances between samples are reasonable (i.e., distance
and time necessary to travel between the sampling locations) and make sure routes avoid
non-access areas or work zones;

•	Ensure the most current aerial imagery is available to reference;

•	Configure technology pairings in advance (e.g., assign GPS units to tablets and note
identifying IDs);

•	Note team associations/assignments to expedite teams moving through initial stations;
and

•	Document what team is associated with a specific device ID.

8.1.2	Operational Feasibility

The AnCOR DATA Demo project team evaluated several important feasibility considerations
related to technologies and data management operations in the field.

8.1.2.1	Just-In-Time Training

Just-in-time training was provided to participants prior to entering the field. Participants with
limited experience using the prescribed technologies received the same level of training as those
who were considered experts (through routine use). Participant feedback indicated that just-in-
time training was inadequate for personnel who had never interacted with the prescribed
technologies. The following findings could help prepare responders for a sampling event:

•	Provide routine training to both emergency response and surge capacity personnel
(including researchers) for both data capture software and technologies that would be
used during a response.

•	Offer interactive, step-by-step training (either in-person dry runs or via PowerPoint or
MS Teams) prior to sampling in the field.

•	Distribute laminated instruction cards that provide quick tips and troubleshooting
solutions to sample teams.

8.1.2.2	PPE Limitations

Teams were randomly asked to use thick garden gloves (to mimic nitrile work gloves and/or
task-specific gloves representing PPE Levels B, C, or D; Level A or encapsulated Level B
ensembles with thick butyl rubber gloves are not well approximated by thick gardening gloves)
and were equipped with a stylus. Teams were asked to provide feedback on limitations in
dexterity while operating the tablets. Overall, teams found it easy to navigate the devices and
enter data on the forms using a stylus while wearing PPE. However, it should be noted that
teams did not fully mimic a response requiring full Level C, including respirators, where a
reduction in mental and possibly physical acuity might be experienced.

8.1.3	Technology Evaluation

The AnCOR DATA Demo evaluated GPS and mobile devices to document any notable
differences among devices that were used to support field sampling data collection activities.

8.1.3.1 GPS Evaluation

In the AnCOR DATA Demo, built-in GPS on the tablets/phones or sub-meter GPS receivers
(Arrow Series GPS, SXblue, Geode) used in combination with tablets were used to locate and
navigate to predetermined sampling locations. The following key findings are based on user
feedback and observations:

•	Sub-meter GPS units worked better than cellular built-in GPS on all the devices
evaluated, and the sub-meter units exhibited a better battery life.

o Participants noted frustration with fluctuations of the location point on Samsung
devices.

•	Overall, with all receivers, but more so with the built-in GPS, the signal appeared to be
diffracted by local objects and would not stabilize, or showed a decrease in accuracy,
when near solid structures or tree canopies.

•	The Geode sub-meter GPS receiver mounted on a pole was reported as very accurate (<5
ft).

•	The Arrow Series and SXblue sub-meter GPS receivers positioned on hats were more
convenient than the Geode GPS on a pole.

•	Sub-meter units need to be stabilized on the hat and face straight upward for best
performance and accuracy.

•	Avoid interference from multiple devices in close proximity where Bluetooth connections
can sync with a nearby device.

8.1.4 Mobile Devices

Multiple electronic devices capable of documenting samples, uploading data, and connecting to
sub-meter GPS units were evaluated. Evaluated devices include: 1) Samsung - Galaxy Tab S7
tablet, 2) Apple - 7.9-Inch iPad mini (5th Generation) with Wi-Fi, 3) Apple 10.2-Inch iPad Air 2
with Wi-Fi + Cell, and 4) Apple iPhone XR with Wi-Fi + Cell. The following are general key
findings based on user feedback and observations:

•	Sampling events might require a longer battery life and use of an external battery,
particularly when maintaining an active Wi-Fi/cellular link to online data repositories
and relying on GPS signals.

•	Carrying an extra, fully charged external battery would also reduce sampling time
requirements by avoiding a return to the support zone (where personnel would need to
decontaminate and doff/don PPE to acquire additional supplies or charged equipment).

•	Screen visibility was limited due to sun and accumulation of fingerprints (expected to be
less of an issue when wearing protective gloves).

•	Participants surmised that it would likely be difficult to see the screen with a full-face
respirator (FFR). Additionally, increasing the screen brightness for better visibility
rapidly drained the battery.

•	Furnishing the tablets with a hand grip would make for easier operations in the field.

•	Training on how to use the tablet's video feature is necessary.

•	Overheating of devices could present issues such as lagging and decreased battery time.

Additionally, usability feedback for specific devices was captured and is summarized below.

8.1.4.1 Apple iPad

Feedback for the Apple iPad (iPad Air 2 and Mini) was provided by 12 participants. The ease of
using garden gloves (i.e., PPE surrogate) and entering notes with a stylus was rated very easy to
average. The ease of capturing a video varied widely and was reported as very easy to extremely
difficult. Many participants reported that they did not know to capture video using the data
capture form's "attach" feature; this was more likely a function of inadequate software training
than of the device's camera function.

Mixed responses were received on the acceptability of the device's battery performance.
Participants noted that when the screen brightness was adjusted to the highest level (to increase
the visibility of the screen), the battery drained rapidly. One instance was noted where the device
went directly from "low battery, 10%" to a black screen and the device became unresponsive.
One participant noted that extended use of Bluetooth to interact with the Geode GPS unit on the
pole might have drained the battery more quickly than expected.

Most participants agreed that the device withstood harsh field conditions. Several participants
noted that the devices felt hot. One participant reported lagging issues (where up to ten seconds
passed before the screen would scroll or a submission would complete), which they believed to
have been caused by the heat. Several participants noted that the screen was difficult to see,
especially with fingerprints (which might not be an issue with gloves/PPE), and that it would be
difficult to see the screen with a FFR.

8.1.4.2	Samsung Tablet

Surveys for the Samsung tablet were completed by three participants. The ease of using garden
gloves and entering notes with a stylus was rated easy to average. The ease of capturing a video
was reported as very easy to extremely difficult, and it was noted that capturing a video was not
possible while using the data capture application. Mixed responses were received on the
acceptability of the device's battery performance. Participants agreed that the device withstood
harsh field conditions, and one participant indicated that the Samsung tablet had better optics
than the iPad but would still be difficult to see with a FFR.

8.1.4.3	Apple iPhone

One participant provided feedback related to the use of an Apple iPhone (gloves or a stylus were
not used). The ease of capturing a video was reported as average and the device maintained
acceptable battery performance.

8.1.5 Software

As previously described, the AnCOR DATA Demo exercised two mobile field data acquisition
software tools: 1) ArcGIS Field Maps and 2) CBRNResponder. User experience feedback
provided by participants is summarized in the sections below.

8.1.5.1 ArcGIS Field Maps

The following are key findings for ArcGIS Field Maps based on user feedback and observations:

•	Overall, Field Maps worked well, was easy to use, and was a useful data acquisition tool
that participants are very likely to use again.

•	Use of "edit" and "copy/copy all" in the software was confusing to participants.

•	Logging information for each point was not intuitive. Users requested a more
explicit/obvious user interface control to enable/start data collection for a sample point.

•	Participants suggested incorporating a "done" button when selecting the collection
time/date to denote completion of sampling (tracking sampling times was an ancillary
data point and is expected to be automated in the future).

•	Sample finish time/date might not be necessary to explicitly collect.

•	Participants suggested that it would be beneficial for a different color point or icon to
display on the map when a sampling location status is updated to confirm data were
successfully recorded.

•	Older iPad devices (iPad Air 2) appeared to record data more slowly, and submission
failed on several occasions.

•	Entering notes on the form with both gloved and ungloved fingers was easy on all tablets
evaluated.

•	Issues regarding scanning QR codes were more likely associated with the label itself
(printer/pixilation issue or misprints where the label was partially cut off) rather than an
issue with the software feature and/or device camera.

•	Participants suggested adding an option to enter sampler and/or team in the form (or
using a QR code to denote the team collecting the sample).

•	Field Maps was fully compatible with offline collection. Data captured were
automatically uploaded to ArcGIS Online as soon as the tablets established an internet
connection. All samples collected in offline mode were successfully captured and
uploaded.
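The offline collection behavior described above follows a common store-and-forward pattern:
records captured without connectivity are queued locally and flushed once a link is
re-established. The minimal sketch below illustrates the pattern only; the class and method
names are invented and are not Field Maps internals.

```python
from collections import deque

class OfflineQueue:
    """Store-and-forward sketch: records captured offline are queued
    locally and flushed once connectivity is re-established."""

    def __init__(self, upload):
        self.upload = upload   # callable that sends one record upstream
        self.pending = deque()

    def capture(self, record, online):
        if online:
            self.upload(record)
        else:
            self.pending.append(record)  # held locally while offline

    def sync(self):
        """Flush queued records in capture order after reconnecting."""
        while self.pending:
            self.upload(self.pending.popleft())

sent = []
q = OfflineQueue(upload=sent.append)
q.capture({"id": "Sponge-60"}, online=False)  # captured in the field
q.capture({"id": "Sponge-75"}, online=False)
q.sync()                                      # connection restored
print([r["id"] for r in sent])
```

Preserving capture order during the flush keeps the uploaded record sequence consistent with
the field sampling sequence.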

Additionally, user experience feedback focused on the software is summarized below by the
type of mobile device and field data collection application used.

8.1.5.1.1 ArcGIS Field Maps + Apple iPad Air 2

Ten participants provided feedback using ArcGIS Field Maps with the Apple iPad. Five
participants reported ease of use with gloves as extremely easy to average (feedback was not
provided by five participants). The ease of entering notes with fingers was noted as extremely
easy to easy. It was noted that the iPad keyboard is large enough to enter text without difficulty.
The ease of scanning the QR code was rated extremely easy to difficult. Participants mentioned
some minor glare issues in cloudy conditions where some codes did not scan; however, as
previously noted, the issues likely stem from the integrity of the printed QR code label itself.
Toggling between online or offline mode was rated extremely easy to easy, and participants
noted that having a preexisting familiarity with iOS is beneficial. The ease of capturing an image
using the camera feature was noted as extremely easy to average. One participant found it
unintuitive to click the arrow icon (in the top right) after clicking edit. It was also noted that
taking a video requires using a hidden feature under "attachments."

The ease of entering data was rated as extremely easy to average. One participant noted a 10-15
second latency when moving through the screen. A participant also noted that the calendar
stayed open after entering the date, rather than automatically closing. Additionally, activating
the data entry dialog was sometimes difficult (e.g., users had to click on the number of the
samples and not on the blue GPS dots).

The ease of using navigational and GIS features was rated as extremely easy to average, with
participants noting that the iPad appeared to be more accurate than the Samsung tablet but was
still up to 15 feet off. They noted the drift was sometimes more than expected, making it hard to
pinpoint the sampling spot. The ease of synchronizing data from mobile app to centralized data
storage was noted as extremely easy to easy (not applicable in six responses).

Overall, participants responded that it was very likely or likely that they would use this
application in the future and recommended having a dedicated scribe to operate the tool,
providing an external battery, and furnishing the iPad with a hand grip.

8.1.5.1.2	ArcGIS Field Maps + Apple iPad Mini

Five participants provided input regarding the use of ArcGIS Field Maps and the Apple iPad
Mini. Ease of use with gloves was noted as average by two participants (not rated by three
participants). The ease of entering notes with fingers was noted as easy to average. It was noted
that entering notes was slightly more difficult than on a standard iPad due to the size of the
screen, and that the screen was sometimes hard to see due to glare. Consistent with previous
tablet observations, adding a hand grip to the device was noted to be useful for the future. The ease of
scanning the QR code was rated extremely easy to difficult, with issues occurring in both
sunshine and shade. Again, this was more likely due to the printed QR code labels themselves.
One participant noted that more effort was required to activate the camera than when using the
standard iPad. The ease of toggling online or offline mode was rated extremely easy to average,
and it was noted that existing knowledge of iOS helps.

Participants noted that switching between edit and the copy/copy all feature for test points is a bit
cumbersome and should be avoided. The ease of entering data and capturing an image using the
camera feature was noted as extremely easy to easy, with a participant noting that the keyboard
is still large enough for easy typing and reiterating that a hand grip would be useful.

The ease of using navigational and GIS features was rated as extremely easy to average. The
ease of synchronizing data from the mobile application to a centralized data storage location was
noted as extremely easy or not applicable. Participants responded that it was very likely or likely
that they would use this application in the future.

8.1.5.1.3	ArcGIS Field Maps + Samsung Tablet

Four participants provided input regarding the use of ArcGIS Field Maps and the Samsung
tablet. Ease of use with gloves was noted as easy to difficult (not evaluated by two participants);
however, the ease of entering notes with fingers was noted as easy by all participants. The ease
of entering data, scanning QR codes, and capturing an image using the camera feature was
mostly noted as extremely easy to easy, although a user mentioned they could not figure out how
to provide the application access to the device's video camera. The ease of toggling online or
offline mode was rated as easy to difficult.

The ease of using navigational and GIS features was rated as easy to average, but participants
noted the tablet's reported position was not accurate and would continuously move. One
participant noted that when they were offline, they had trouble navigating close to the sampling
point, especially near buildings. The ease of synchronizing data from mobile application to
centralized data storage was noted as easy to average. It is likely that the users would use this
application in the future; however, one user noted they did not like the size of the tablet nor the
platform (Android) overall.

8.1.5.1.4	ArcGIS Field Maps + Apple iPhone

Two participants provided feedback related to their experience using ArcGIS Field Maps with an
Apple iPhone. The ease of use with gloves was not evaluated; however, the ease of entering
notes with fingers was noted as extremely easy to average. The ease of entering data, scanning
QR codes, and capturing an image using the camera feature was also noted as extremely easy to
average, although one user did have to take some time to allow the application to access the
device's camera.

The ease of using navigational and GIS features was rated as easy to difficult, as the iPhone was
not accurate and presented an issue with navigating to the sample point. The ease of
synchronizing data from the mobile application to centralized data storage was noted as
extremely easy with an internet connection (a user did note that the QR code and submittal
stopped working when the phone was in airplane mode). It is very likely that the user would use
this tool in the future.

8.1.5.2 CBRNResponder

Two participants provided feedback on their experience using CBRNResponder during the
AnCOR DATA Demo. The following are key findings based on user feedback and observations:

•	Users noted that reliance on the device's default map application (e.g., Apple Maps or
Google Maps) was not as user-friendly as the integrated map offered by Esri's ArcGIS
Field Maps.

•	Toggling back and forth between the data collection application and a separate mapping
application was frustrating for the users. Additionally, the default "mode of transport"
required additional setting adjustments to ensure "walking" directions were enabled each
time the map loaded.

•	The mapping interface was cumbersome. Users expressed frustration at having to reload
the map to view each sampling location where extra time was required to adjust settings
and reorient the map.

•	A participant suggested incorporating a toggle yes/no button to confirm a status change
rather than having to manually type "Yes" in "Confirm Status Change."

•	Participants noted difficulty rendering QR codes; however, issues were likely a result of
the label resolution.

•	The application did not support videos, only images and documents.

•	In the future, assignment descriptions should be correlated with sample numbers.
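
As an illustration of the kind of correlation suggested above, a QR label payload could encode the team ID, assignment description, and sample number together, so a single scan recovers all three without free-text entry. The payload format below is hypothetical and not part of either evaluated application:

```python
import json

def make_label_payload(team_id, assignment, sample_id):
    """Hypothetical QR label payload correlating a sample number
    with its team and assignment description."""
    return json.dumps({"team": team_id,
                       "assignment": assignment,
                       "sample": sample_id},
                      separators=(",", ":"))

def parse_label_payload(payload):
    """Recover the correlated fields from a scanned label."""
    return json.loads(payload)
```

A compact, structured payload like this also sidesteps duplicate-ID ambiguity, since the sample number is parsed rather than retyped.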

8.1.5.2.1	CBRNResponder + Apple iPad Air 2

•	A participant reported that they could not get walking directions to work when using a
satellite basemap (no sample point was visible).

•	Several sample data collection instances did not correctly render the QR scan; the same
QR code could not be used later because the form would not permit a duplicate ID
(indicating the code was initially read but not displayed). Given the repeated issues
reported with QR code labels, no conclusive issues can be attributed to the software,
and users assume the feature is QR code compatible.

8.1.5.2.2	CBRNResponder + Samsung Tablet

•	A participant noted unexpected closing of the application and loss of data entry.

•	A participant noted loss of text entry prior to submission when adding a picture.

•	A user noted confusion with a pop-up window: when the user selected undo, the
validation language displayed by the application appeared to confirm the opposite of the
action the user was attempting to perform.

8.2 Data Manager Observations

General observations from a data manager's perspective, related to oversight of data management
tasks and configuration of the software that was exercised, are described in the sections below.

8.2.1 Operational Feasibility Considerations

Several important operational feasibility considerations were reinforced through the AnCOR
DATA Demo. Observations and recommendations are listed below:

•	Span of Control: For data managers actively managing personnel who are equipped with
newly introduced hardware and/or software, the span of control is significantly reduced.
Troubleshooting might be required to assist inexperienced personnel and/or resolve
hardware issues. It is recommended that data managers be associated with no more than
four teams during an active response that requires the use of data acquisition tools and
GPS hardware. The span of control could be increased through routine training of
participants.

•	Just-in-time Training: Just-in-time training was provided to AnCOR DATA Demo
participants prior to entering the field. Participants who had never interacted with the
prescribed technologies received the same level of training as those who were considered
experts (through routine use). Following the demo, feedback and observations clearly
indicated that just-in-time training was inadequate for personnel who had never
interacted with the tools and technologies that were exercised. Routine training should be
provided to both emergency response and surge capacity personnel (including
researchers) on a regular basis.

•	Offline Operation: Teams were randomly chosen to operate in full offline mode (both
field data collection applications exercised have offline data collection modes). Once the
tablets resumed internet connectivity, results were automatically uploaded to the
dashboard. All the samples collected in offline mode were successfully captured and
uploaded.

•	PPE Limitations: Overall, teams found devices and data capture forms easy to navigate
using a stylus while wearing limited PPE (garden gloves to simulate nitrile gloves).

•	Protecting Sensitive Equipment: While previous field studies have successfully
demonstrated that electronic tablets (e.g., iPads) can be successfully decontaminated by
encapsulating them in a water-resistant case and dunking them in a bleach solution, there
are currently no water-resistant cases available for GPS systems. Furthermore, the GPS
systems typically consist of multiple parts (e.g., control unit, wires, and antenna). The
current recommendation would be to keep the GPS system in the field (i.e., hot zone) for
the duration of the daily sampling activity (<12 hours). The GPS antenna and control unit
can be contained in Ziplock bags. The bags should be sealed using a durable adhesive
tape. The wire portion can remain exposed. At the conclusion of the sampling day or
every 12 hours, the GPS equipment should be removed, decontaminated, and charged (or
charged in the field if conditions allow). The GPS equipment should be routinely checked
against known control points to ensure operability. The steps to protect GPS equipment
were not evaluated as part of the AnCOR DATA Demo but should be evaluated as part of
the AnCOR wide-area field demonstration.
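
The offline-operation behavior observed above (records persisted locally, then uploaded automatically once connectivity resumes) can be sketched as a simple queue. This is a generic, stdlib-only illustration; the class, file, and field names are hypothetical and do not come from either evaluated application:

```python
import json
from pathlib import Path

class OfflineSampleQueue:
    """Minimal sketch of offline-first field data capture: every
    record is written to local storage immediately, and queued
    records are flushed to the central store once online."""

    def __init__(self, cache_file="pending_samples.json"):
        self.cache = Path(cache_file)
        self.pending = []
        if self.cache.exists():
            # Recover any records queued during a previous offline session.
            self.pending = json.loads(self.cache.read_text())

    def record(self, sample):
        # Persist locally first so no data is lost while offline.
        self.pending.append(sample)
        self.cache.write_text(json.dumps(self.pending))

    def sync(self, upload, online):
        """Upload queued records when connectivity is available.

        Returns the number of records sent; a failed upload leaves
        the record queued for the next attempt.
        """
        if not online:
            return 0
        sent = 0
        while self.pending:
            upload(self.pending[0])  # an exception keeps the record queued
            self.pending.pop(0)
            sent += 1
        self.cache.write_text(json.dumps(self.pending))
        return sent
```

The design choice mirrored here is the one the demo validated: the device never blocks data entry on connectivity, and synchronization is a separate, retryable step.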

8.2.2	ArcGIS Field Apps Suite

EPA maintains an enterprise-level license for Esri products. Gaining access to the GeoPlatform
(EPA's ArcGIS Online instance) for both the desktop-based application and mobile application
was straightforward. Geoservices was responsive to inquiries and requests. Ample guidance and
training materials are available to support learning how to configure various tools for use.

ArcGIS Field Maps directly integrates with EPA's TOTS tool output, as well as other geospatial
assets that can be shared from EPA's GeoPlatform portal (e.g., surface classification/
characterization spatial analyses, building footprints, landcover, and operational zones). Field
data capture forms can be easily tailored to meet stated data collection needs that are informed by
site-specific data quality objectives. While not fully exercised during the AnCOR DATA Demo,
EPA Regions have demonstrated sophisticated data entry validation measures that support
real-time validation, preventing erroneous data entry and enabling resolution of errors while
still in the field [15, 16].
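
As a hedged illustration of this kind of real-time validation (not the actual Regional configuration), a smart-form rule set might check an entry before accepting it. The ID pattern, matrix/method pairs, and proximity threshold below are hypothetical examples:

```python
import math
import re

# Hypothetical rule set; a real form would derive these from
# site-specific data quality objectives.
ID_PATTERN = re.compile(r"^T\d{2}-S\d{3}$")        # e.g., T01-S042
VALID_METHODS = {"surface": {"sponge", "wipe"},     # matrix -> methods
                 "soil": {"scoop"}}
MAX_OFFSET_M = 15.0                                 # allowed GPS drift (m)

def distance_m(lat1, lon1, lat2, lon2):
    """Haversine distance in meters between two GPS fixes."""
    r = 6_371_000
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def validate_entry(sample_id, matrix, method, fix, planned):
    """Return a list of validation errors; an empty list means accept."""
    errors = []
    if not ID_PATTERN.match(sample_id):
        errors.append("sample ID does not match naming convention")
    if method not in VALID_METHODS.get(matrix, set()):
        errors.append(f"method '{method}' invalid for matrix '{matrix}'")
    if distance_m(*fix, *planned) > MAX_OFFSET_M:
        errors.append("entry too far from planned sampling point")
    return errors
```

Rules like these run on the device at entry time, so an error is surfaced while the sampler is still standing at the point and can correct it.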

Several other specific lessons-learned and best practices were noted:

•	Use black text with a white outline when adding labels to features in maps to ensure
visibility on different basemaps.

•	Configure the original sampling map to the desired position for viewing sample points,
enable the appropriate layers, and save the map. Doing so creates a default view for both
the Dashboard map and Field Maps application map.

•	Only create the "Offline View Area" in the offline map after all layers are added and
properly configured.

Overall, the project team found the suite of Esri ArcGIS Field applications easy to configure and
tailor to meet the stated needs. Changes could be easily incorporated as additional needs were
identified. This flexibility is important to EPA because conditions in the field might change, and
the data management team needs to be able to respond quickly with any necessary adjustments.
Using an online platform that can easily distribute and synchronize updates facilitates staying
current with and responding to changing conditions.

8.2.3	CBRNResponder

Gaining access to the platform (both the desktop-based application and mobile application) was
straightforward, and the support team was responsive to inquiries and requests. Ample guidance
and training materials are available to quickly begin using the tool.

The underlying framework for the suite of applications is robust, including user administration
(i.e., pre-establishing sample collection assignments and pushing notification of assignments to
users' devices), team/event management, and a built-in operational dashboard. The
administrative-related features provided are powerful, including push notifications, assignment
status updates, and syncing capabilities. Tremendous benefit can be added by leveraging "routine
management" functions without having to "build" the capacity for each event. The overall user
administration and event tracking tools are very useful.

This platform has potential; however, there are limitations related to the current alignment of
relevant data fields and reporting needs. EPA's field data capture needs related to a biological
contamination sampling event would best be met if the platform added data fields important to
EPA biological sampling operations and/or provided users with the ability to define custom data
fields and lookup values, as well as custom report options.

Another beneficial enhancement would be better integration with ArcGIS Online and/or map
services or layers that could be sourced from other online platforms. The need to separately
upload (potentially large) geospatial data files, rather than incorporate by way of reference to an
online URL, hinders EPA's ability to leverage other important operational geospatial assets.
Additionally, unlike ArcGIS Field Maps, the application cannot directly integrate an
operationalized map generated through EPA's TOTS tool. For a wide-area event that could
require hundreds or thousands of sample points, the ability to rely on an authoritative/single data
source of geospatially-referenced sampling locations on a map is important—both to minimize
extra processing steps and to avoid data transformation errors. Furthermore, the ability to
navigate to geospatially-referenced sampling locations in real-time (e.g., heading, distance) is
essential to implementing probabilistic sampling designs. CBRNResponder's navigation
capabilities are limited to vehicle navigation using standard mobile routing platforms
(Google/Apple maps).
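
For context, the real-time navigation aid described here reduces to simple great-circle math: given the current GPS fix and a target sample point, compute the compass heading toward the target. This is a generic sketch, not code from either platform:

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Compass heading in degrees (0 = north, 90 = east) from the
    current fix toward the target sampling point."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    # atan2 yields -180..180; normalize to a 0..360 compass heading.
    return (math.degrees(math.atan2(x, y)) + 360) % 360
```

A field application that exposes heading and distance directly, rather than handing off to a vehicle-routing app, lets a sampler walk to a probabilistically placed point that has no street address.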

Overall, the platform is easy to use and offers many robust features for the use cases around
which its design decisions were made. Enhancements related to custom data types, reporting, and
better integration with other commercial GIS platforms would expand CBRNResponder's
usefulness in meeting EPA's AnCOR program needs.

8.2.4 Other Noted Observations

Following the AnCOR DATA Demo, the project team held several additional meetings and
discussed other important topics that should be addressed, but for which uncertainty exists
related to implementation plans. Key issues identified include:

•	Define whether and how EPA's Scribe tool will be used in support of the AnCOR
program;

•	Address and document internal QA procedures that will occur prior to transferring
samples for analysis by the laboratory;

•	Define data auditing procedures and document explicit rules that will govern QA
activities;

•	Determine how chain of custody forms will be generated (i.e., what tool will be used to
generate the required documentation); and

•	Address how chain of custody forms will be shared with labs.

9 CONCLUSIONS AND RECOMMENDATIONS

More streamlined applications are needed for collecting, storing, analyzing, and visualizing field
and laboratory data in support of decision-making. This project had four primary objectives to
address this need:

1.	Conduct a literature review and market research to identify relevant articles, reports,
and other information describing research, ongoing initiatives by regional and state
partners, and available commercial-off-the-shelf products that streamline and modernize
field data collection activities;

2.	Solicit subject-matter expert feedback from the response and research community on
important functionality that field data acquisition and/or data management tools and
technologies should have for responding to a wide-area incident;

3.	Identify and evaluate technology to support response personnel based on
recommendations provided by the response community; and

4.	Conduct a field-scale demonstration to further evaluate operational aspects of selected
technologies for the potential to enhance preparedness.

Phase 1 of this project addressed objectives 1 and 2 and part of objective 3, in which candidate
tools were identified for further evaluation. From Phase 1, EPA gained a better
understanding of users' needs, candidate tools, and opportunities to improve wide-area data
management capabilities. Candidate software tools that were recommended for further
evaluation during the AnCOR DATA demo included:

•	Esri's Survey123/Collector/Field Maps Suite,

•	RadResponder (CBRNResponder),

•	Android Team Awareness Kit (ATAK), and

•	EPA Scribe.

Phase 2 of this effort addressed research objectives 3 and 4 in which candidate tools were
exercised and evaluated. Specifically, this study evaluated the current state of technologies
through a demonstration event and documented observations and recommendations to enhance
the USCG and EPA's ability to respond to and recover from a CBRN incident. Through this
project, EPA gained invaluable experience in understanding how to apply advances in
technologies and software to improve field data acquisition tasks. Important technological issues
were identified to inform future planning and training efforts. Based on the expressed needs of
EPA and DHS/USCG and the experiences of participants in the AnCOR DATA Demo, the
project team recommended using Esri's suite of tools and ArcGIS Field Maps to support field
data acquisition efforts for the AnCOR program (and potentially future biological contamination
sampling events). Consistent with the findings from a related effort to assess data visualization
and analysis tools, the Esri suite has the most features that meet the largest number of needs, is
familiar to and accepted by target stakeholders, and is generally viewed as easy to customize and
tailor to meet the specific needs of the operation. Additionally, the Esri product suite is widely
adopted among the response community and has been used by the USCG in support of various
missions, including search and rescue, pollution response, and response to natural disasters [17].

CBRNResponder, and the forthcoming BioResponder, offer many promising features. At present,
however, several key requirements for EPA's AnCOR program cannot be met—namely,
alignment with the required data fields/types that will be collected and integration with real-
time geospatial assets. The project team recommends that EPA continue to engage
with FEMA to convey EPA's needs regarding biological sampling (and other agents), and
closely monitor FEMA's progress and tool enhancements (i.e., CBRNResponder and
BioResponder) to determine whether the tool could better meet EPA's needs in the future.

In pursuit of an additional project goal to document a repeatable, transparent, and stable
workflow to support AnCOR Wide Area Demonstration (WAD) data management needs, the
project team also developed a Data Management Task/Workflow that identifies when and how
various data management tools can be used across the response. Figure 30 illustrates specific
tasks that have a related data management component that require input and support from the
data management team. Tools available to support activities and the established workflow among
the tasks and tools are illustrated, and specific tools recommended to support the AnCOR WAD
are highlighted in blue.

Design Sampling Plan

•	Guidance: Biological Sampling Framework; Data Quality Objectives

•	Methods: Visual Sampling Plan (Probabilistic); Judgmental (Targeted)

•	Tools: MicroSAP; TOTS

Operationalize Sampling Plan

•	Designate Sampling Teams

•	Define Sampling Assignments

•	Prepare Sampling Kits

•	Configure TOTS Export: geospatially-referenced sample locations

•	Define Samples: number of samples; sample ID nomenclature

•	Create QR Code Labels: SQUIREL; correlate with sample IDs and Team ID

Capture Sampling Data

•	Prepare Field Survey: ArcGIS Field Maps

•	Configure Devices: GPS; cellular; Field Maps app; location services on; camera/video access

•	Generate Electronic Data Deliverables

Monitor Sampling Event

•	Assess Sampling Team Status: ArcGIS Operational Dashboard

•	Real-Time Data Acquisition QA: ArcGIS Operational Dashboard

Store and Manage Sampling Data

•	Acquire Database Space: GeoPlatform (likely for natural disaster events); ER Cloud (likely for CBRN events); Scribe

•	Define Data Storage Location

•	Configure Access Privileges

•	Import/Export Data: Scribe; GeoPlatform; ER Cloud

Analyze Laboratory Data

•	Conduct Analyses: GeoPlatform; ArcGIS Insights; ArcGIS Operational Dashboard

Figure 30. AnCOR data management tasks and supporting tools (workflow diagram rendered as a text outline).

Recommendations resulting from Phase 1 of this study emphasized the need to create "a well-
documented workflow, articulating desirable decision-making driven features, and defining
required metadata and features needed to support data workflows." Further, enabling the
response community to quickly adapt to new technology implemented using proven workflows
will advance preparedness levels [2]. Additional considerations resulting from experiences
gained through completing Phase 2 of the project generally centered on the following topics:

•	Quality Control Procedures/Objectives,

•	Field Data Capture Form,

•	Training,

•	Operational Logistics, and

•	Managing Devices.

Table 3 summarizes important observations and feedback and identifies several actionable
outcomes to further advance preparedness. Based on input from the project team, actions that
should be prioritized are designated accordingly.

Table 3. Additional Observations and Feedback

Quality Control Procedures/Objectives

•	Closely integrate sampling and data management plans to ensure the right data are collected to inform decisions.

•	Relate/define appropriate validation and QC checks that would be required based on identified data needs.

•	Compare sample matrix to sample method (via QR code scan); prevent data entry unless there is an appropriate match.

•	Check acceptable proximity to the pre-established sampling point.

•	Automate as many checks as possible through smart forms to prevent the entry of erroneous data from the start.

•	Identify real-time checks (via a checklist) for data management team monitors to assess during the event and define the process for communicating any issues that might arise.

•	Consider creating specialized views/queries that would support quickly identifying questionable entries.

Field Data Capture Form

•	Optimize forms to facilitate completion in less than one minute.

•	Minimize free-form text entries.

•	Maximize the use of "auto-collected" data to require fewer entries by a user (e.g., individual/team performing entry, day/time of entry, location of entry, sample status following entry).

•	Incorporate visual cues on the digital map to illustrate the status of sample points.

Training

•	Clearly define specific objectives related to the sampling event and data collection to ensure participants understand the what, why, and how for each task they are asked to perform.

•	Provide routine training to both emergency response personnel and surge capacity personnel (including researchers) for both data capture software and technologies that would be used during a response.

•	Offer interactive, step-by-step training (either in-person dry runs or via PowerPoint or MS Teams).

•	Train participants in advance on both the equipment (technology) and the software that they will use while in the field.

Table 3. Additional Observations and Feedback (Continued)

Training (Continued)

•	Provide a classroom-based exercise to demonstrate tasks, discuss common "gotchas," and answer questions well in advance of the exercise.

•	Strategically pair teams to ensure whoever is charged with collecting data has the appropriate skills and training to successfully accomplish the tasks.

•	Distribute laminated instruction cards, and electronic versions pre-loaded on devices, that provide quick tips and troubleshooting solutions to sample teams.

•	Identify potential issues and the corresponding course of action (e.g., if a device overheats, if battery capacity dips below 10%).

Operational Logistics

•	Optimize sampling routes to ensure that distances between samples are reasonable and that sampling routes avoid non-access areas or work zones.

•	Plan sampling routes to avoid spreading contamination to otherwise clean areas.

•	Prepare QR code labels (for samples or team personnel identification) using high-quality printers and labels to support optimal recognition by barcode readers/device cameras.

•	To extend battery life, determine whether samplers can operate in an offline mode for data submission, with checkpoints established (e.g., after completing five samples) to synchronize data, or whether they should carry battery backups/chargers.

•	Associate data managers with no more than four teams during an active response that requires the use of data acquisition tools and GPS hardware.

•	Leverage real-time location tracking among teams so that teams near one another can provide troubleshooting support or assist with resolving known collection errors/conflicts/issues from another team (as long as contamination spreading is avoided).

Managing Devices

•	Develop and execute a checklist for configuring and testing all hardware and devices that will be used in advance of the exercise.

•	Implement all operating system upgrades, application updates, and device settings (e.g., cellular, WiFi, Bluetooth, application access to camera, location services) to ensure optimal performance and configurations that support the associated data capture form features.

•	Provide extra battery packs (adequately protected from contamination) to extend the capacity of a device's onboard batteries.

•	Consider distributing WiFi sources/extenders in the field to support device connectivity.

•	Place and retain a GPS system in the field (i.e., hot zone) for the duration of the daily sampling activity (<12 hours).

•	Consider distributing alcohol wipes to clean screens when in the field.

•	Provide device hand grips to improve usability in the field.

•	Consider having replacement devices on hand and ready to activate should devices in the field begin to fail (e.g., battery needs recharging, device needs to cool down).

•	Issue and attach a stylus for data entry.

•	Design and implement measures to protect all sensitive equipment that might require decontamination.
Several additional issues were identified that require more information and/or research:

•	Address and document internal QA procedures that will occur prior to transferring
samples for analysis by the laboratory;

•	Define data auditing procedures and document explicit rules that will govern QA
activities;

•	Determine how chain of custody forms will be generated (i.e., what tool will be used to
generate the required documentation);

•	Address how chain of custody forms will be shared with labs; and

•	Evaluate the impacts of weather (e.g., cold/hot temperatures, precipitation) on the
usability and performance of tablets and GPS units.

Through this effort, candidate tools were exercised and evaluated to assess the current state of
technologies to enhance the USCG and EPA's ability to respond to and recover from a CBRN
incident. The technologies and software recommended will be exercised through a complete data
management workflow during the AnCOR WAD held in May 2022. The operational
considerations illuminated through this study provided invaluable information to ensure
increased preparedness and, ultimately, more efficient and successful field data acquisition and
management activities.

10 REFERENCES

1.	US EPA, ORD. 2019. "Analysis for Coastal Operational Resiliency." Reports and
Assessments. https://www.epa.gov/emergency-response-research/analysis-coastal-
operational-resiliency.

2.	"Data Management for Wide-Area Responses: Literature Review and Operational Expert
Feedback." EPA/600/R-21/095, 2021. U.S. EPA, Washington, D.C.

3.	"Bio-Response Operational Testing and Evaluation (BOTE) Project - Phase 1:
Decontamination Assessment." 2013. EPA/600/R-13/168. U.S. EPA, Washington, D.C.
https://cfpub.epa.gov/si/si_public_record_report.cfm?Lab=NHSRC&subject=Homeland%20
Security%20Research&dirEntryId=263911.

4.	Silvestri, E., J. Cuddeback, K. Hall, T. Haxton, C. Jones, and J. Falik. 2021. "Sampling and
Analysis Plan (SAP) Template Tool for Addressing Environmental Contamination by
Pathogens." EPA/600/R-21/144. U.S. EPA, Washington, D.C.

https://www.epa.gov/esam/sampling-and-analysis-plan-sap-template-tool-addressing-
environmental-contamination-pathogens.

5.	Trade-off Tool for Sampling (TOTS), n.d. U.S. EPA, Washington, D.C. Accessed March 14,
2022. https://tots.epa.gov/.

6.	EPA QR Tool (version 1.3). n.d. U.S. EPA, Washington, D.C. Accessed March 14, 2022.
https://github.com/USEPA/QR_Tool.

7.	"Create Routes (Linear Referencing)—ArcGIS Pro | Documentation." n.d. Accessed March
14, 2022. https://pro.arcgis.com/en/pro-app/latest/tool-reference/linear-referencing/create-
routes.htm.

8.	"What's New—ArcGIS Field Maps | Documentation." n.d. Accessed March 14, 2022.
https://doc.arcgis.com/en/field-maps/faq/whats-new.htm.

9.	"About | BioResponder." n.d. Accessed March 14, 2022.
https://www.bioresponder.net/#about/index.

10.	"About | CBRNResponder." n.d. Accessed March 14, 2022.
https://www.cbrnresponder.net/#about/index.

11.	"CBRNResponder Bonus Webinar: Event Reports." 2020. https://youtu.be/2rf97E2fl14.

12.	"Snapshot: ATAK Increases Situational Awareness, Communication | Homeland Security."
n.d. Accessed March 14, 2022. https://www.dhs.gov/science-and-
technology/news/…/snapshot-atak-increases-situational-awareness-communication.

13.	US EPA, OLEM. 2015. "SCRIBE Environmental Data Management." Overviews and
Factsheets. https://www.epa.gov/ert/environmental-response-team-information-management.

14.	"CBRNResponder | Terms of Use." n.d. Accessed March 14, 2022.
https://www.cbrnresponder.net/#account/request.

15.	McLaughlin, Casey. 2021. "Making Sampling Data Accessible." U.S. EPA, Washington,
DC.

16.	McComb, Martin. 2020. "Full Data Management Lifecycle - 2019, Leveraging GIS Enterprise
& EPA Platforms for Emergency Response." U.S. EPA, Washington, D.C.

17.	Rodgers, M., A. Speciale, T. Boe, J. Falik, and E. Silvestri. 2021. "Tools Used for
Visualizing Sampling and Analysis Data During Response to a Contamination Incident."
EPA/600/R-21/150. U.S. EPA, Washington, D.C.

https://cfpub.epa.gov/si/si_public_record_report.cfm?dirEntryId=353479&Lab=CESER.
