DRAFT

INTRODUCTION

The New Chemicals Collaborative Research Program (NCCRP) is a joint activity of EPA's
Office of Research and Development (ORD) and the Office of Pollution Prevention and Toxics
(OPPT)1 to develop and apply innovative approaches to address the requirements of the Toxic
Substances Control Act (TSCA) for the review of new chemicals. TSCA requires EPA to review
all new chemical substances (i.e., those not yet in commerce) to make determinations regarding
potential risks to human health and the environment before manufacturing can commence. With
hundreds of new chemical notices submitted to OPPT per year and limited hazard and exposure
information, addressing these statutory requirements with sound science, transparency, and
consistency, while meeting tight statutory deadlines for decisions, requires continued evolution
of scientific methods, approaches, and tools. Bringing innovative science to modernize the new
chemicals evaluation procedures will help overcome information gaps and help OPPT meet
TSCA statutory requirements in a timely, effective, and efficient manner.

The NCCRP was announced in February 2022 followed by a public meeting in April 2022. The
NCCRP has been designed by ORD and OPPT to be an integrative research plan within the
Agency's 2023-2026 Chemical Safety for Sustainability Strategic Research Action Plan.2 The
NCCRP is described in detail in the October 2022 report from EPA to the BOSC entitled The
New Chemicals Collaborative Research Program: Modernizing the Process and Bringing
Innovative Science to Evaluate New Chemicals Under TSCA.3 The research program described
in this EPA Report is the focus of this review by the BOSC.

The NCCRP is a focused research program which represents translation and extension of many
aspects of the computational toxicology research that has been in development in ORD for the
past 15 years. In many ways, the NCCRP actualizes the vision and objectives of the CompTox
Blueprint4 and EPA's NAM Work Plan5 by developing NAMs to address data and information
needs for OPPT's new chemicals program. Importantly, the research conducted by the NCCRP
will also contribute to establishing the requisite degree of scientific confidence needed for these
methods to be used in regulatory decision making in OPPT. Through the NCCRP, ORD is
working with the OPPT to advance five key Research Areas:

1. Updating and refining chemical category formation approaches and improving read-
across inference methods;

1	OPPT is a division of the Office of Chemical Safety and Pollution Prevention (OCSPP). OPPT is the program
office that administers TSCA.

2	https://www.epa.gov/system/files/documents/2022-10/CSS%20FY23-26%20StRAP_EPA-ORD_October%202022_508.pdf

3	BOSC Review Draft, October 2022. https://www.epa.gov/system/files/documents/2022-
10/White_Paper_New%20Chemicals%20Collaborative%20Research%20Program_BOSC_Final_24Oct2022.pdf.

4	Thomas et al., 2019. The Next Generation Blueprint of Computational Toxicology at the U.S. Environmental
Protection Agency. Toxicological Sciences, Volume 169, Issue 2, June 2019, Pages 317-332,
https://academic.oup.com/toxsci/article/169/2/317/5369737.

5	EPA New Approach Methods Work Plan, December 2021. https://www.epa.gov/system/files/documents/2021-
11/nams-work-plan_11_15_21_508-tagged.pdf.


2.	Developing and expanding databases containing TSCA chemical information;

3.	Developing and refining predictive models for physicochemical properties,
environmental fate/transport, hazard, exposure, and toxicokinetics;

4.	Integrating and applying in vitro new approach methodologies (NAMs) to biologically
profile substances; and

5.	Developing a TSCA new chemicals decision support tool that utilizes curated data and
integrates lines of evidence across many chemical, computational, and biological
profiling platforms.

The NCCRP is unusual for ORD in that it has been designed, in collaboration with
OPPT, to explicitly focus on research and development of specific scientific tools and methods
needed to modernize the approaches for evaluating chemicals in EPA's New Chemicals Program
under TSCA. It is vital, therefore, that the NCCRP include Research Area Coordination Teams
(RACTs)6 composed of ORD and OPPT scientists. Such RACTs will ensure this
applied research program is designed and conducted in a manner that will deliver the specific
scientific work products needed by OPPT. In this same vein, from the outset, the NCCRP would
benefit from incorporating technology transfer activities as an integral component of each
research project. As noted in the NCCRP report to the BOSC, this focused research program has
been "specifically designed to address OPPT's regulatory needs and bolster ORD's efforts to
develop NAMs."7 Therefore, it is critical that the NCCRP research activities include actions to
help integrate these modernized approaches into the toolbox of methods used by OPPT and
other end users for the evaluation of new chemicals. Accordingly, to meet this shared
responsibility of ORD and OPPT, activities should be built into the NCCRP, such as education,
training and outreach to end users for each research tool or methodology, as appropriate.

The identified strengths, suggestions, and recommendations herein are informed by a review of
the EPA's draft The New Chemicals Collaborative Research Program: Modernizing the Process
and Bringing Innovative Science to Evaluate New Chemicals Under TSCA ("White Paper"),
EPA's presentations to the Committee, available scientific literature, and Committee members'
experiences using a variety of NAM tools including those of the EPA.

6	The RACT "...develops goals and objectives for the Output and establishes criteria for the work needed to
accomplish it. ORD researchers propose research Products, which the RACT reviews and refines to ensure Products
will meet the goals and objectives of the Output and reflect the timing and specific needs of [the] EPA program
[OPPT's New Chemicals Program]..." Strategic Research Action Plan, Fiscal Years 2023-2026, Chemical Safety
for Sustainability Research Program, EPA/600/R-22/238 | October 2022,
https://www.epa.gov/system/files/documents/2022-10/CSS%20FY23-26%20StRAP_EPA-ORD_October%202022_508.pdf.

7	The New Chemicals Collaborative Research Program: Modernizing the Process and Bringing Innovative Science
to Evaluate New Chemicals Under TSCA; page 5. https://www.epa.gov/system/files/documents/2022-
10/White_Paper_New%20Chemicals%20Collaborative%20Research%20Program_BOSC_Final_24Oct2022.pdf.


Charge Question 1

Question 1

As described in Research Area 1 of the accompanying White Paper (pages 16-20), planned
research activities are focused on updating and refining the chemical categories and read across
methods used by OPPT. Please comment on whether there are other approaches or chemical
characteristics that could be considered when developing the categories and analog identification
methodologies.

Narrative (Provides background and context for strengths, suggestions, and
recommendations)

EPA ORD and OPPT are to be commended for including, as a critical pillar of the New
Chemicals Collaborative Research Program, research focused on modernizing the methods used
by OPPT to group chemicals into categories and the procedures to conduct read-across.
The OPPT new chemicals program currently relies heavily upon grouping of chemicals into
categories and read-across (i.e., inference prediction modeling to extrapolate data / information
from a similar substance to the substance undergoing review) to fill needs for data/information to
evaluate potential hazards, exposures and risks of new chemical submissions. As we understand
it, OPPT currently relies upon expert judgement procedures for grouping similar chemicals into a
category (or sub-category) by applying an OPPT chemical similarity guidance
document that was last updated in 2010. While expert scientific judgment has, in the past, often
played a large role in many scientific interpretation processes, such practices can be problematic
due to lack of transparency, difficulties in reproducibility, and concerns over subjectivity and
bias. In addition, it is our understanding that the current toxicity inference approaches used by
OPPT rely almost exclusively on extrapolating traditional toxicity testing data obtained from
laboratory animal studies.

The diverse data streams proposed to support chemical clustering and rapid hazard assessment
have tremendous promise to improve the ability to estimate toxicity over traditional methods;
however, the use of these new technologies should be fit for the purpose of the assessment.

While similarity in structure is one important attribute to evaluate when grouping chemicals into
a category, structural alerts alone are likely not sufficient. Data from new lines of evidence, in
particular approaches that use mechanistic NAM assays to explore similarities in biological
response pathways (i.e., biological activity profiling), can provide critical information for
grouping. Over the past 15 years, there have been considerable advances made in scientific
understanding of biological pathways and how chemicals interact with biological systems. This
knowledge has been instrumental in enabling the development of advanced mechanistic assays
(NAMs) and improved computational profiling methods. New methods for dosimetry, such as
IVIVE, and improvements in exposure science and exposure modeling have also been brought to
the forefront during this time period. By working together on the New Chemicals Collaborative
Research Program, ORD and OPPT can bring this knowledge and these methods forward to
design and conduct the research needed to develop, evaluate, and establish scientific confidence
in more objective, advanced, and transparent approaches for grouping similar chemicals and
inference modeling to address data / information needs.


Strengths (Bulleted list of program strengths)

•	By using many different attributes and methods, the breadth of this research, coupled with
the systematic approach, will improve the objectivity and transparency of the data and
procedures used to inform chemical category formation and the basis for similarities for
read-across.

•	This research project explicitly includes approaches to evaluate and integrate
computational and biological activity profiles, toxicokinetics, metabolite
formation, persistence, etc. This is expected to create a richer understanding of
similarities and differences.

•	The GenRA method is an easy-to-use tool that is expected to 1) improve
transparency and reproducibility in category formation and read across, and 2)
increase understanding and communication of uncertainties.

•	The explicit procedures envisioned to be actualized in GenRA are expected to
reduce subjective expert judgment and unconscious or conscious bias.

•	Converting the structural information that underlies the existing new chemical
categories (NCC) into a machine-readable form (e.g., SMARTS patterns) will
help to make the process of reviewing whether a new chemical fits into an NCC
more systematic, transparent, and reproducible (see the illustrative sketch
following this list).

•	Understanding how well the chemicals in the TSCA non-confidential list fit within the
domain of applicability for the different (Q)SAR models ORD uses is important to help
make determinations as to the suitability of the predictions. This could also help to guide
NAM-based testing to expand the applicability domain of the (Q)SAR models and
improve confidence in the predictions.
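
As a minimal illustration of what a machine-readable category definition could look like, the
following Python sketch screens a structure against SMARTS patterns using the open-source
RDKit toolkit. The category names and patterns are hypothetical placeholders, not actual OPPT
NCC definitions; the sketch is offered only to make the suggestion concrete.

    # Minimal sketch: screening a new chemical against machine-readable category
    # definitions expressed as SMARTS patterns (hypothetical examples only).
    from rdkit import Chem

    # Hypothetical category definitions; real NCC definitions would be curated by OPPT.
    CATEGORY_SMARTS = {
        "Acrylates/methacrylates (example)": "[CH2]=[CH,C]C(=O)O",
        "Aliphatic amines (example)": "[NX3;H2,H1;!$(NC=O)][CX4]",
    }

    def assign_categories(smiles):
        """Return the hypothetical categories whose SMARTS pattern matches the structure."""
        mol = Chem.MolFromSmiles(smiles)
        if mol is None:
            return []
        hits = []
        for name, smarts in CATEGORY_SMARTS.items():
            pattern = Chem.MolFromSmarts(smarts)
            if pattern is not None and mol.HasSubstructMatch(pattern):
                hits.append(name)
        return hits

    print(assign_categories("C=CC(=O)OCC"))  # ethyl acrylate -> matches the acrylate example
    print(assign_categories("CCCCN"))        # n-butylamine -> matches the amine example

Encoding each category in this form would allow the same definition to be applied consistently
across the TSCA inventory and to be versioned, reviewed, and tested like any other software artifact.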

Suggestions (Bulleted items that are important but don't rise to the level of
recommendation)

•	The title of this research area should be changed from "Update and Refine Chemical
Categories" to "Modernizing Chemical Categories and Improving Inference Modeling to
Fill Data / Information Needs." (high priority / low effort)

•	To build understanding and confidence in the new chemical grouping methods and
modernized read-across methods, from the outset, these research activities should be
designed with end users in mind, with ORD and OPPT collaborating on education, training,
and outreach to EPA staff and external stakeholders. (high priority / medium effort)

•	EPA ORD and OPPT should consider exploring approaches for data integration and
visualization to help document and communicate similarities and differences across
compounds for all of the attributes evaluated in GenRA. For example, spider/radar plots
or 3-D techniques could facilitate side-by-side, or overlay, comparisons (see the
illustrative sketch at the end of this list). (high priority / low effort)

•	EPA should explore the potential to use Quantitative Structure Use Relationships
(QSURs) and advanced high-throughput exposure models to inform category formation,
read across, and screening level risk evaluations. (medium priority / medium effort)
•	Taking into consideration the New Chemicals Program approaches, including procedures
for requiring new data and information, consideration should be given to designing the
modernized New Chemicals evaluation procedures in a tiered manner, in which the first
tiers utilize predictive in silico tools to quickly identify potential toxicity, group
chemicals for read-across, and predict potential exposures, and then additional information
can be incorporated as necessary to build the weight of evidence to support read-across.
This could include incorporating approaches to efficiently predict approximate
metabolite abundance, activation or breakdown to a reactive species, or detoxification,
which may be informative to chemical grouping. Overall, this tiered approach should be
designed to be adaptable to different exposure and use scenarios. For example,
modernized clustering algorithms can be used to quickly identify analogues and support
read-across using tools such as GenRA when the chemical of interest is within the
domain of applicability of the models. However, for chemicals that do not lie well within
the domain of applicability, addition of bioactivity data from the rapid screening assays
and mechanistic biological pathway knowledge (e.g., AOPs) could improve hazard
estimation. Structuring this process as a flexible, tiered approach should encourage
assessors to focus on the best tools for the particular risk decision at hand. (high priority /
low effort)

•	To facilitate transparency and reproducibility, clear decision criteria need to be defined
for each grouping or read-across tool. Explicit data interpretation procedures for model
results and a structured decision analysis framework for determining when additional
analysis or specific additional testing should be considered will be important for ensuring
these new methods are used to their best effect. (high priority / high effort)

•	The clusters for the TSCA active inventory should be periodically (e.g., every 4 years)
updated based upon the availability of new or updated data/knowledge on the chemicals
in the inventory or new methods for clustering. For example, the clusters should be
regenerated if the model(s) that calculates physicochemical properties contained within
the fingerprint used to cluster the chemicals is updated and can make predictions for
more chemicals. (high priority / high effort)

•	It will be important to clearly define the domain of applicability, as well as the areas of
uncertainty, to ensure appropriate use of the new tools. Chemicals that are not likely to
be well-addressed by a particular model should be clearly flagged, and explicit data
interpretation procedures provided for alternative assessment approaches. It will be
particularly important to address difficult-to-test substances and complex mixtures (e.g.,
UVCBs). These issues are larger than a single agency or research program. Leveraging
the broader regulatory science community through communities of practice and crowd-
sourcing solutions may help facilitate improvement in these areas. (high priority / high
effort)


•	While publication in peer reviewed journals can be a critical step to broader scientific
acceptance of new and improved methods, the publication process can often delay public
dissemination of EPA work products, which can slow down and unnecessarily impede
uptake and use. These delays need to be avoided. This can be accomplished by
incorporating into the project design alternative methods for independent scientific
engagement and/or peer review (see EPA's Peer Review Handbook and, e.g., SciPinion)
that can be combined with stakeholder engagement (e.g., ad hoc presentations of
interim work products and updated plans, periodic focused webinars, planned peer
engagement on specific activities, or a dedicated section of the annual EPA
NAMs workshop). (high priority / medium effort)

•	The importance of metabolism and degradation materials/pathways in chemical toxicity
must continue to be considered within prediction-based risk assessment approaches. We
are aware of efforts within the EPA as well as the broader research community to begin
to address this challenge. As the science progresses, opportunities should be explored to
incorporate prediction of metabolites and degradation products. Given the nascent state of
the science, significant resources are currently required for metabolite identification,
abundance and bioactivity determinations. Therefore, EPA should continue to monitor
developments in this space and incorporate newer methods into read-across approaches
when these applications are determined to be fit for purpose for OPPT's new chemicals
program. (low priority / high effort)
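
As a minimal, illustrative sketch of the side-by-side visualization suggested above, the following
Python code draws a radar (spider) plot comparing two candidate analogues to a target chemical
across several similarity attributes using matplotlib. The attribute names and similarity scores are
hypothetical placeholders, not GenRA outputs.

    # Minimal sketch: radar/spider plot comparing two candidate analogues to a target
    # across several similarity attributes (values are illustrative placeholders).
    import numpy as np
    import matplotlib.pyplot as plt

    attributes = ["Structural", "Physchem", "Bioactivity", "Toxicokinetic", "Metabolic"]
    analogues = {
        "Analogue A": [0.91, 0.80, 0.65, 0.72, 0.58],  # hypothetical similarity scores (0-1)
        "Analogue B": [0.76, 0.88, 0.81, 0.60, 0.70],
    }

    # Close each polygon by repeating the first angle/value at the end.
    angles = np.linspace(0, 2 * np.pi, len(attributes), endpoint=False).tolist()
    angles += angles[:1]

    fig, ax = plt.subplots(subplot_kw={"polar": True})
    for name, scores in analogues.items():
        values = scores + scores[:1]
        ax.plot(angles, values, label=name)
        ax.fill(angles, values, alpha=0.15)

    ax.set_xticks(angles[:-1])
    ax.set_xticklabels(attributes)
    ax.set_ylim(0, 1)
    ax.legend(loc="lower right")
    ax.set_title("Similarity of candidate analogues to target (illustrative)")
    plt.savefig("analogue_radar.png", dpi=150)

Overlaying the polygons makes it immediately apparent where an analogue resembles the target
and where the similarity breaks down, which supports transparent documentation of read-across decisions.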

Recommendations (Priority action identified by the panel that is actionable by the ORD
program)

The panel offers the following recommendations:

1.	The Committee recommends EPA ORD, in conjunction with OPPT, design, conduct and
publicly disseminate case studies evaluating the performance of the current OPPT
categories compared to the new approaches, such as GenRA, to support a read across
assessment where analog toxicity data are compared to target chemical toxicity data that
are initially blinded to the assessor. Case studies should include several situations (e.g.,
where an understanding of metabolism is critical for establishing suitable analogs, where
bioactivity data are limited, where small changes in chemistry have the potential to have
significant impact on toxicity, etc.). These case study activities will help document
scientific confidence in the newer approaches, and support transitions from the existing
OPPT approaches to the newer read-across approaches (e.g., GenRA).

2.	The Committee recommends EPA ORD and OPPT explore the potential to use CBI data
within the GenRA and other inference models for grouping and read-across. One option
to explore would be using federated learning with differential privacy data methods, or
similar technologies, that allow the private data to be retained and protected locally while
still enabling the data to be used in model development. Another option to consider
would be developing a protected in-house user downloadable app (e.g., like the OECD
tool box download) to enable data use while protecting CBI. We also recommend
discussing with FDA their approaches to using confidential data for inference model
development, such as FDA's Critical Path Initiative. This is a particularly important
research activity that may improve approaches for new chemicals that fall outside the
current domains of non-CBI databases (a conceptual sketch of one privacy-preserving
ingredient follows these recommendations).

3. The Committee recommends that, in addition to having a Research Area Coordination
Team (RACT), ORD and OPPT should establish a process and schedule for jointly
evaluating the scientific confidence and readiness of these NAMs for updating the new
chemical grouping and read-across methods that are intended to be used by OPPT's new
chemicals program. A set schedule is needed to ensure the review process is keeping pace
with advances in science and knowledge, to focus the next round of research, and to
provide the certainty needed for the Agency and stakeholders to efficiently and
confidently implement these methodologies. This would also ensure predictability in the
application of program guidance for a set time period. One schedule to consider is
alignment with the StRAP cycle. For example, the schedule for this scientific confidence
and readiness review could be sequenced to finish at a point in time where the results of
the review and recommendations for additional research serve as input into development
of the next StRAP.
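
As a purely conceptual sketch of one ingredient that the privacy-preserving options noted in
Recommendation 2 could draw on, the following Python code applies the Laplace mechanism of
differential privacy to an aggregate statistic computed over confidential records. The values,
bounds, and privacy parameter are illustrative placeholders only; an actual CBI workflow would
require a full privacy and security design, careful privacy-budget accounting, and legal review.

    # Conceptual sketch: releasing a differentially private aggregate (here, a mean
    # value) computed over confidential records, via the Laplace mechanism.
    import numpy as np

    rng = np.random.default_rng(0)

    def dp_mean(values, lower, upper, epsilon):
        """Differentially private mean of values known to lie in [lower, upper]."""
        clipped = np.clip(values, lower, upper)
        n = len(clipped)
        # Sensitivity of the mean when one record changes is (upper - lower) / n.
        sensitivity = (upper - lower) / n
        noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
        return clipped.mean() + noise

    # Hypothetical confidential study values (e.g., log10 PODs) held locally.
    confidential_values = np.array([1.2, 0.8, 1.9, 1.4, 2.1, 1.0, 1.6, 1.3])
    print(dp_mean(confidential_values, lower=0.0, upper=3.0, epsilon=1.0))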

Charge Question 2

Question 2

As described in Research Area 2 of the accompanying White Paper (pages 20-28), planned
research activities are focused on expansion and further development of existing public databases
in ORD containing chemistry, hazard, exposure, and toxicokinetic information relevant to TSCA
chemicals. Please comment on this effort, including in your feedback useful sources of chemical
information that could be incorporated into the curation efforts.

Narrative (Provides background and context for strengths, suggestions and
recommendations)

Data relevant to TSCA chemicals are available in a wide range of public sources along with
legacy OPPT TSCA files. Many of these legacy TSCA data are not in a digital form that can be
currently accessed. Moreover, data that exist in publicly available databases may not exist in a
form where they are easily and reproducibly queried and integrated. There is also a vast
amount of existing chemical information in peer reviewed and "gray" literature that is currently
not easily accessible. To address these complex challenges, ORD and OPPT seek to develop and
expand databases containing TSCA relevant information. Plans described in Research Area 2
include continued extraction and curation of existing data on physical-chemical properties,
environmental fate, hazard, and exposure. Plans are also outlined to map information in existing
ORD databases to standardized reporting templates, storing the linked information in an
International Uniform Chemical Information Database (IUCLID). Developing robust and
comprehensive databases that digitize and merge this existing information will be essential for
rigorous predictive evaluation of new chemicals under TSCA. If successful, the proposed plan
will enable the reproducible development and refining of (Q)SAR models, inform the
development of new chemical categories, and provide readily accessible data for analogs in the
read-across evaluation of new chemicals.

In general, the strategies laid out by NCCRP are robust and well thought out. Digitization of
legacy OPPT TSCA data in a machine searchable format will enable these data (potentially
including CBI information) to be incorporated in new chemical characterization in a transparent
manner. By integrating existing databases on physicochemical properties and environmental fate
properties, household product chemical composition and function, multimedia monitoring data,
ecological hazard, human health hazards, and toxicokinetic data, OPPT will be able to leverage
vast amounts of existing data, assisting EPA in their legislative mandate for timely new chemical
evaluation. The development and integration of literature mining techniques will potentially
allow for the incorporation of relevant chemical information from the published and gray
literature. We commend OPPT and ORD for their commitment to open-source reproducible
science. We suggest an additional set of databases which may provide additional information
relevant to toxicological evaluation. We additionally make suggestions towards best practices for
data submission, curation, and harmonization. Finally, we make recommendations towards
replication, quality control, and validation to ensure that the plans result in reproducible and
transferable evaluation methods.

Strengths (Bulleted list of program strengths)

•	Single source of truth: Standardization of database vocabularies to an internationalizable
format will ease the use of data in more applications and create more transparency in the
evidence used for downstream applications.

•	Data source versioning: Versioning and storing of source databases will help to maintain
their data as part of a larger data store, and help guarantee the longevity of that data as
well as the reproducibility of analyses of the data.

•	Comprehensive set of databases identified: Proposed databases will capture relevant
information on chemical identity and structure, physicochemical and fate properties, health
hazard data, human exposure data, and toxicokinetics.

Suggestions (presented in order of priority)

Note: High priority and low effort suggestions may be considered for action; high priority and
high effort suggestions may be integral to advancing the science but beyond the current scope of
the NCCRP; and low priority suggestions are for consideration purposes.


•	Data Life Cycle (high priority / low effort)

Source databases will be deprecated, lose support over time, or, possibly, be identified as having
quality control issues. A protocol should exist to handle source data deprecation / removal.

•	Data Quality Control - Studies (high priority / medium effort)

Care should be taken to create a tracking system to unambiguously associate source studies
with aggregated report data to prevent data duplication and avoid impact on Weight of
Evidence analyses.

•	Capacity for data provenance (high priority / medium effort)

When models are created from the constructed data store, it should be possible to reference
which source data was used to construct the model.

•	A list of recommended databases (high priority / high effort):

Chemical Identity & Properties

PFAS-Tox Database (https://pfastoxdatabase.org/): Collaborative group of university and
non-profit based scientists to support comparators.

ITRC (https://pfas-1.itrcweb.org/): Technical resources for addressing environmental
releases of PFAS; small database of structure/physical/chemical/toxicology data.

ChemIDplus (https://chem.nlm.nih.gov/chemidplus): Contains chemical, physical, and some
hazard/toxicology information.

ZINC20 (https://pubs.acs.org/doi/10.1021/acs.jcim.0c00675): Billions of small molecule
specifications.

Human Metabolome Database (https://hmdb.ca/): Small molecule metabolites; includes
DrugBank (drugs/metabolites relevant to some PFAS like fluoxetine and
detergents/antimicrobials).

In Vitro Hazard Data

LINCS L1000 (https://lincsproject.org/LINCS/data/overview): Compilation of gene
expression profiles.

The Cell Image Library (https://doi.org/10.1093/gigascience/giw014 and
http://www.cellimagelibrary.org/home): Morphological profiles of 30,000 small molecules
via Cell Painting.

In Vivo Hazard Data

FAERS (http://open.fda.gov/data/faers/): FDA Adverse Event Reporting System.

ChEMBL (http://www.ebi.ac.uk/chembl/): Manually curated database of bioactive
molecules; combines chemical/bioactivity/genomic data.

ClinVar (http://ncbi.nlm.nih.gov/clinvar/): Aggregated information on genomic variation /
human health relationships.

PharmGKB* (https://www.pharmgkb.org)

ICE (https://ice.ntp.niehs.nih.gov/): Data sets curated for targeted toxicity endpoints by
NICEATM and others.

Comparative Toxicogenomics Database* (http://ctdbase.org): Curated associations between
chemicals, pathways, diseases, exposures, organisms, genes, and anatomy.

eChemPortal (https://www.echemportal.org): Chemical hazard classifications from 30+ data
participants.

* Proprietary databases

A large list of life science databases with open-source scripts to extract and build versioned
parquet tables is available at https://github.com/orgs/biobricks-ai/repositories.

•	Open source for literature review tools (high priority / unknown effort)

When possible, open-source tools should be used for the referenced document review
workflows. Open-source tools enable greater transparency and replicability.

•	Harmonization of all entity types (high priority / high effort)

In addition to harmonization of chemical identifiers, there is a need to harmonize any entities
that associate chemicals with values. Understanding which tests are indicated for different
regulatory needs, and designing models that merge the outputs of different assays, is
challenging when there are ambiguous relationships between test protocols, assays, and
chemical properties. There are ontologies that attempt to hierarchically name assays
(bioassayontology.org). Adoption of an existing method, or creation of a new method, to
both unambiguously identify tests and identify relationships between tests is suggested. For
example, knowing which assays are referenced by which OECD guidelines and where those
guidelines are referenced in hazard classifications requires controlled vocabularies for assays,
guidelines, classifications, and their relationships.

•	Peer-reviewed literature mining, focus on human studies (high priority / high effort)

While the health outcome databases appropriately focus on experimentally derived
toxicology data, a focus on mining the existing literature for epidemiological data linking
exposures and health outcomes could be considered. This is particularly relevant in the case
of some PFAS, where toxicokinetics and toxicodynamics are very different in humans than in
commonly used rodent models.

•	Validation Sets (high priority / high effort)

There is a need in the modeling ecosystem for comparative validation. When new
computational models are constructed to estimate NCCRP endpoints, their use should be
justified via comparison to existing tools. A large, hidden validation set, that is not publicly
shared, could be used periodically as a fair method of comparison for new models.

•	Data Imputation (medium priority / medium effort)

If there is a plan to impute or fill in missing chemical property gaps, the method of
imputation should be clear and the use of estimates to build new estimates should be limited
to reduce error propagation (see the minimal sketch at the end of this list).

•	Guidance for data submission (low priority / low effort)

Several of the suggested source databases allow for public depositing of new data. GEO, for
example, allows researchers to deposit raw and processed high throughput sequencing and
array-based data identifying molecular signatures of chemical exposures. It would be useful
to know how to add new data to the constructed system and whether there are tools to deposit
directly, or what the recommendation is for submitting to source databases.

•	Data Quality Control - Data Depositors (low priority / high effort)

When source data are used as evidence in regulatory decisions, care should be taken related
to the identity of a data depositor. There are potential conflicts of interest and sources of error
associated with the identity of a data depositor.

•	Expanding Exposure Scenarios (low priority / high effort)

In addition to CPDat and existing databases on consumer, occupational, and industrial
exposure pathways, and given the low safe use levels for some chemicals, and the
stakeholder concerns (NGOs, public) regarding exposure, it may be useful to expand
exposure scenarios to include dermal exposure scenarios in clothing and occupational
personal protective equipment. This is particularly relevant given the EPA's emphasis on
equity, environmental justice, and cumulative impacts. (See Washburn et al., 2005:
https://pubmed.ncbi.nlm.nih.gov/15984763/). As the EPA is compiling existing exposure
data through the Multimedia Monitoring Database, they could consider making these data
public and easily accessible, which could help to build trust with environmental justice and
fenceline communities.

• Data Automated Curation (low priority / medium effort)

Some data sources are beginning to adopt semi-automated curation methods that use AI tools
to automate the extraction of structured data from unstructured sources. Automated methods
can introduce unknown biases and sources of error. When possible, this data should be
flagged and be separable from non-automated approaches.
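
As a minimal sketch of the flagged imputation described in the Data Imputation item above, the
following Python (pandas) example fills a missing measured property with a model prediction
while recording the provenance of each value. The column names and values are hypothetical
placeholders; a production pipeline would follow a documented imputation protocol.

    # Minimal sketch: imputing a missing physicochemical property while flagging
    # imputed values so they remain distinguishable from measured data downstream.
    import pandas as pd

    # Hypothetical property table; None marks a missing measured value.
    df = pd.DataFrame({
        "dtxsid": ["DTXSID_A", "DTXSID_B", "DTXSID_C", "DTXSID_D"],
        "logkow_measured": [2.1, None, 3.4, None],
        "logkow_predicted": [2.3, 1.8, 3.1, 4.0],  # e.g., from a QSAR model
    })

    # Prefer the measured value; fall back to the prediction and record the provenance.
    df["logkow"] = df["logkow_measured"].fillna(df["logkow_predicted"])
    df["logkow_source"] = df["logkow_measured"].notna().map(
        {True: "measured", False: "imputed_from_qsar"}
    )

    print(df[["dtxsid", "logkow", "logkow_source"]])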

Recommendations (Priority action identified by the panel that is actionable by the ORD
program)

The panel offers the following recommendations in order of priority:

1.	Ease of replication.

Implementing a system for easy replication has high value and relatively low added effort.
Accordingly, the Committee recommends EPA should include a programmatic method to easily
download a versioned copy of all of the open access data. This will allow stakeholders to better
align their analyses with best practices created in NCCRP. A single bulk download is a less
costly and more maintainable way to distribute the created data than APIs, which create uptime
and versioning issues and create additional work for developers. A bulk download that can be
accessed via tools like ftp, rclone, wget, curl, will make it easier for developers to use the created
data. When data is very large, serving data in a method that allows efficient mirroring (and
reduces redundant downloading) is recommended.

2.	Defined Procedures for Quality Control

The Committee recommends that documented standard operating procedures for
quality control be developed and implemented rather than relying on ad hoc methods. Development of
automated processes to identify outliers, data conflicts, and/or likely sources of error should be
considered to reduce the cost of these procedures (a minimal sketch of one such check follows
these recommendations). If missing data will be imputed, the methods
of imputation should follow a defined protocol and imputed values flagged. Automated quality
control tests are high value but also significant effort. Thus, the design and implementation of
such activities will need to be carefully thought through.

3.	Validation Sets

The Committee recommends EPA undertake the creation of standard validation sets for the
evaluation of NAMs. These validation sets could be used periodically to fairly, and
quantitatively, evaluate NAMs. If these validation sets are kept confidential (which is not
necessary or required), their value as a fair comparator increases and the capacity for NAM
developers to overfit AI models or construct in vitro models specifically to perform well on
validation decreases. Managing validation sets could create significant value for the NAM
ecosystem, but presents a high-effort, high-maintenance, and high-responsibility deliverable.
Accordingly, the design and implementation of such activities will need to be carefully thought
through.
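
As a minimal sketch of one automated quality-control check of the kind described in
Recommendation 2, the following Python (pandas) example flags potential outliers among
property values aggregated from multiple sources using a robust, median/MAD-based z-score.
The data, threshold, and column names are hypothetical placeholders; operational criteria would
be set in the documented standard operating procedures.

    # Minimal sketch: flagging potential outliers in a curated property table using a
    # robust (median/MAD-based) z-score, so records can be queued for manual review.
    import pandas as pd

    def flag_outliers(values, threshold=3.5):
        """Return a boolean Series marking values far from the median in MAD units."""
        median = values.median()
        mad = (values - median).abs().median()
        if mad == 0:
            return pd.Series(False, index=values.index)
        robust_z = 0.6745 * (values - median) / mad
        return robust_z.abs() > threshold

    # Hypothetical curated water-solubility values (log10 mol/L) for one chemical
    # aggregated from several sources; the last entry is suspicious.
    records = pd.DataFrame({
        "source": ["src1", "src2", "src3", "src4", "src5"],
        "log_ws": [-3.1, -3.3, -2.9, -3.2, -7.8],
    })
    records["qc_outlier_flag"] = flag_outliers(records["log_ws"])
    print(records)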


Charge Question 3

Question 3

As described in Research Area 3 of the accompanying White Paper (pages 28-33), planned
research activities are focused on developing, refining, and evaluating (Q)SAR and other
predictive models for physical-chemical properties, environmental fate/transport, hazard,
exposure, and toxicokinetics.

a.	Please comment on the (Q)SAR and predictive modeling proposed, as well as the proposed
informatics platform for management of input data and development and management of
(Q)SAR and other predictive models. In your comments, please address whether there are
additional (Q)SAR models, approaches, or other informatics platform features that could be
considered.

b.	Please comment on any additional features that could be considered in the evaluation of these
models, applicability domain(s), and association documentation.

Narrative (Provides background and context for strengths, suggestions, and
recommendations)

In its review and response to charge question three, the committee considered the strengths and
possible weaknesses of the (Q)SAR and QSUR approaches presented, alternative approaches, and
additional approaches and activities with the potential to augment these QSAR/QSUR methods.
The committee also considered various forms of uncertainty in QSAR/QSUR approaches and
how to characterize and report on them.

The committee commends ORD and OPPT on an ambitious and groundbreaking approach to
advance chemical assessments within the USEPA and perhaps more broadly. Goals presented in
the "white paper" are clearly stated and if properly funded have significant potential to achieve
the desired effect of streamlining and improving chemical hazard assessments. Improvements
that are planned in QSARs for physical chemical processes, fate and transport, and toxicological
mechanism are well described and reasonable. The use of QSURs was considered innovative and
reasonable. The committee identified a need for confirmatory empirical (not-in silico) data to
ground truth model output for a subset of existing compounds.

The committee is impressed with the plans for the (Q)SAR and predictive modeling proposed, as
well as the proposed informatics platform, and found the WebTEST tool for (Q)SARs to be a
significant strength for the USEPA, primarily as an organizing platform to integrate data and
modeling efforts. The modeling directions (QSUR, HTTK, fate and transport) are all
appropriately aligned to stakeholder needs and will be useful tools that are publicly available to
assist data-poor decisions. The committee also lauds the proposed expansion of the framework
(to OPERA) and the incorporation of QSUR as a novel tool that could greatly improve exposure
assessments.

The committee structured its suggestions and recommendations so that tasks that could have
significant impacts in the near term and that do not require substantial investment are listed first.
Suggestions and recommendations that are more visionary and challenging are listed at the end.
These tasks could require several iterations, review and engagement with the scientific
community and stakeholders before reaching final form, but the committee believes that these
are appropriate directions for the agency to follow.

Strengths

•	Developing a data and computing infrastructure that integrates machine readable data and
modeling platforms will have a large long-term impact by enabling efficient use of
expanding/evolving models and growing data sets. This activity strongly complements
other activities such as the WebTEST tool and the generation of toxicity data itself.

•	The WebTEST platform is a significant strength and should remain a priority because it
improves access and usability and enables community QSAR model building.

•	The selection of QSAR model targets (QSUR, HTTK, fate and transport, toxicity, etc.) is
clearly aligned with and supports stakeholder needs for decision making/risk assessment.

•	The addition of QSUR is innovative and has the potential to have high impact on other
activities like use cases for exposure assessment.

•	Requiring that QSAR/QSUR models are publicly available, including the associated
training sets, algorithms, and validation work, assures transparency, improves confidence,
and allows all such models to be properly tested and benchmarked.

•	Clearly articulating the expectation that QSAR approaches are developed for application
in data-poor environments will assure appropriate methods are developed and appropriate
testing/verification/assessment approaches are created.

•	Expanding past EPISuite to OPERA will be a strength, given the added functionality of
the OPERA platform. Specifically linking structural characteristics to important
mechanisms of toxicity will facilitate the direction of ORD's activities in the QSAR
space.

Suggestions (Bulleted items that are important but don't rise to the level of
recommendation)

•	Develop and/or articulate EPA's plan for horizon scanning to assure that emerging
published QSAR models are added to the EPA model suite over time.

•	Explore the appropriateness of new machine learning approaches designed for sparse data
sets (few-shot methods) for QSAR modeling. Traditionally developed for image analysis,
they may or may not be of value here. This should be a small effort: literature review or
ask an expert.

•	Given that chemical purity data is already collected by GC or LC MS analysis, evaluate
the value of implementing a method for measurement of partition coefficient during these
same GC and LC MS runs and implement if the EPA judges the value justifies the
investment. See: OECD. 2022. Test No. 117: Partition Coefficient (n-octanol/water),
HPLC Method. Organisation for Economic Co-operation and Development. 11 pp.
https://doi.org/10.1787/9789264069824-en

•	As the EPA moves from development of open source QSAR models using open-source
data to use of data and models protected as CBI, develop appropriate standards and
criteria for utilization of those data and models.


•	Consider tracking the opportunity that molecular dynamics simulation models (quantum
chemistry models from institutions like DOE and NSF) might offer for improving the
accuracy of prediction of chemical properties or calculation of additional chemical
properties useful for QSAR, categorization, or QSUR. Molecular dynamics models might
also be able to adjust ligand-binding models developed for one species (estrogen, human)
to another species where the receptor exists in a different internal environment (pH,
temperature) etc.

•	Consider requiring that computational models be open source.

Recommendations (Priority action identified by the panel that is actionable by the ORD
program)

The panel offers the following recommendations:

1.	The Committee recommends EPA expand tools/approaches for reporting on confidence
in QSAR model predictions, including measures of variance and uncertainty (e.g.,
domain of applicability, strength of training data), and provide documentation of how those
measures of variability and uncertainty are calculated, including the actual code (an
illustrative sketch of one such applicability-domain signal follows these recommendations).
The Committee recommends this activity be implemented straightaway.

2.	The Committee recommends EPA establish and implement methods, if feasible, for
including a "flag" in toxicity databases for compounds that cause non-specific effects
(e.g. surfactants and facile reactants), or other flags, for example related to overfitted
dose-response curves in some in vitro data sets, to assure that these problems do not
adversely and unknowingly affect QSAR modeling. The Committee recommends this
activity be implemented straightaway.

3.	To support the value and impact of the WebTEST resource, the Committee recommends
EPA a) engage the regulatory science community in one or more workshops to provide
feedback on performance and usability, and solicit suggestions for further development
and b) develop and deploy a semi-automated (easy to access and utilize by the
community) workflow for model evaluation that is quantitative, transparent, consistent
and offers comparative benchmarking. The Committee recommends this activity be
implemented in the near-term.

4.	As efforts to develop databases of known metabolites mature, the Committee
recommends EPA develop a framework or method for incorporating assessment of
known metabolites as classes of compounds for QSAR modeling and incorporate QSAR
or other models that predict metabolites/breakdown products/transformation products for
later exposure and toxicity QSAR modeling. Transformation products are not currently
treated in the toxicity assessment. This recommendation should be considered for
implementation in the longer-term.

5.	As efforts to expand toxicity databases to address gaps in domains of applicability come
to a conclusion, the Committee recommends EPA identify the next priority areas where
toxicity data needs to be expanded to improve the ability to develop QSAR and related
models that support ecotoxicity assessments (e.g. terrestrial toxicity, others). This
recommendation should be considered for implementation in the longer-term.
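
As a minimal, hypothetical sketch of one common way to report an applicability-domain signal
alongside a QSAR prediction (see Recommendation 1), the following Python code computes the
mean Tanimoto similarity of a query structure to its nearest training-set neighbors in
Morgan-fingerprint space using RDKit. The training set, neighbor count, and cut-off are
illustrative placeholders; operational criteria would be derived from model validation.

    # Minimal sketch: a simple applicability-domain signal for a QSAR prediction,
    # based on similarity of the query to its nearest neighbors in the training set.
    from rdkit import Chem
    from rdkit.Chem import AllChem
    from rdkit.DataStructs import TanimotoSimilarity

    def morgan_fp(smiles):
        mol = Chem.MolFromSmiles(smiles)
        return None if mol is None else AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)

    # Hypothetical training set; a real model would have thousands of structures.
    training_smiles = ["CCO", "CCCO", "CCCCO", "c1ccccc1O", "CC(=O)O"]
    training_fps = [fp for fp in (morgan_fp(s) for s in training_smiles) if fp is not None]

    def domain_signal(query_smiles, k=3):
        """Mean Tanimoto similarity of the query to its k nearest training neighbors."""
        qfp = morgan_fp(query_smiles)
        if qfp is None:
            return 0.0
        sims = sorted((TanimotoSimilarity(qfp, fp) for fp in training_fps), reverse=True)
        return sum(sims[:k]) / min(k, len(sims))

    for smi in ["CCCCCO", "ClC(Cl)(Cl)C(Cl)(Cl)Cl"]:
        score = domain_signal(smi)
        # Illustrative cut-off only; real criteria would come from model validation.
        print(smi, round(score, 2), "inside domain" if score >= 0.3 else "flag: outside domain")

Reporting such a signal, and the code that computes it, with every prediction would give
assessors a transparent basis for deciding when a prediction should be flagged for alternative
assessment approaches.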


Charge Question 4

Question 4

As described in Research Area 4 of the accompanying White Paper (pages 33-40), planned
research activities are focused on developing and evaluating a suite of in vitro NAMs that could
be used by external stakeholders for testing and data submissions under TSCA, as well as
potentially informing and/or expanding new chemical categories. Please comment on the initial
screening strategy proposed. Please include in your comments, other assays and/or endpoints to
consider for the research plan.

Narrative (Provides background and context for strengths, suggestions, and
recommendations)

EPA's proposal outlines a fairly comprehensive NAM-based program to screen new chemicals
for safety in accordance with the Lautenberg Chemical Safety for the 21st Century Act. The
proposed approach follows the path identified in the EPA CompTox Blueprint, including: 1)
broad-based analyses for chemical interactions with numerous molecular/protein targets (discrete
target or generalized/multi-target effects) to cover a wide breadth of potential chemical-
biological target interactions; and 2) targeted analyses to predict potential adverse outcomes.

As our knowledge of biological pathways underpinning human and ecological health improves,
so should the appropriateness and availability of NAM-based approaches. Meanwhile, IATA
approaches as well as disorder and disease models that are fit for purpose can be used to enhance
current NAM predictions of chemical toxicity to support consistent evaluation of data within a
weight-of-evidence approach. For purposes of risk-based screening assessment of new chemical
submissions, EPA has focused on in silico and in vitro tools, primarily used in high-throughput
modes; this approach is appropriate to support EPA's requirement under TSCA to review new
chemical submissions with limited available toxicity and exposure data. These screening
methodologies can be incorporated into IATA approaches that include exposure information and
IVIVE to provide contextual dosimetry.

Strengths (Bulleted list of program strengths)

EPA's goal to identify a suite of fit-for-purpose NAMs to support new chemical review under
TSCA is helpful, and if developed and applied effectively, could improve TSCA reviews for
EPA, the regulated community, the public, and other stakeholders. Highlighted strengths of the
outlined NCCRP include:

• Both human health and ecotoxicological assessments are included in the defined NAMs
approach.


•	The integration of data streams in an Integrated Approaches to Testing and Assessment
(IATA)-based approach strengthens subsequent conclusions, particularly when
cheminformatic fingerprints/QSARs are combined with broad and targeted NAM
assessments to evaluate data consistency.

•	The broad coverage of potential toxicity pathways allows greater confidence in NAM-
based assessments. For example, the proposed human health assessment uses:

o Broad-based high content screening approaches to examine numerous chemical-
biological target interactions [i.e., using HTPP and HTTr for respiratory toxicity
to identify whether a chemical may act at a discrete molecular target (specific
MIE) or produce generalized stress responses due to multiple molecular targets
(non-specific response)]
o Targeted screening approaches to provide information on specific MIEs, key
events or hazard-related processes [i.e., SafetyPharm, DevTox Germ Layer
Reporter assay, genotoxicity assays - micronucleus test and Ames] coupled with
in vitro assays to refine HTTK for improved in vitro-to-in vivo (IVIVE)
extrapolation.

These multiple data streams will help identify molecular/protein targets, inform potential
hazard identification, and improve dosimetry estimates (a conceptual IVIVE sketch follows
this list), which can be evaluated for consistency and biological plausibility and, thus, can
improve confidence.

•	Incorporation of developmental toxicity potential (DevTox assay) is advantageous to
provide data on a 'high concern hazard' that is typically not available for new chemicals.

•	Expanding available data using human ALI respiratory cell and/or precision-cut lung
slice cultures and HTTr data will provide valuable information on the performance of
these tools to predict potential inhalation hazards.

•	The proposal to screen 200-300 candidate chemicals is important for gaining scientific
confidence around application of EPA-identified suites of NAMs, particularly if the
candidate chemicals are selected to fill in vitro and in silico gaps and improve
applicability domains. Additionally, this exercise will help identify opportunities to
continuously improve NAM specificity and sensitivity (e.g., DevToxGLR currently at
58% specificity).

•	Applying Eco-HTTr for a subset of chemical structures for which ecotoxicity data and
QSAR applicability is limited will help delineate the domain of applicability of the
method while generating data that improves mechanistic understanding of ecotoxicity.

•	EPA's focus on analytical quality control of identity and purity of candidate chemicals is
critical.
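
As a conceptual sketch of the IVIVE step referenced in the strengths above, the following Python
example applies the standard reverse-dosimetry calculation, converting an in vitro point of
departure to an administered equivalent dose using a steady-state concentration from a
toxicokinetic model. All numerical values are illustrative placeholders; real applications would
use chemical-specific HTTK parameters and appropriate population variability assumptions.

    # Conceptual sketch: reverse-dosimetry IVIVE. An in vitro bioactive concentration
    # (uM) is converted to an administered equivalent dose (mg/kg/day) by dividing by
    # the steady-state plasma concentration predicted for a 1 mg/kg/day intake.
    # All values below are illustrative placeholders, not measured chemical data.

    def administered_equivalent_dose(pod_invitro_um, css_um_per_mgkgday):
        """AED (mg/kg/day) = in vitro POD (uM) / Css at 1 mg/kg/day (uM per mg/kg/day)."""
        return pod_invitro_um / css_um_per_mgkgday

    # Hypothetical inputs: a lower-percentile bioactive concentration from a NAM battery
    # and an upper-percentile Css from a high-throughput toxicokinetic model.
    pod_invitro_um = 3.0          # uM
    css_um_per_mgkgday = 1.5      # uM steady-state concentration per 1 mg/kg/day dose

    aed = administered_equivalent_dose(pod_invitro_um, css_um_per_mgkgday)
    print(f"Administered equivalent dose: {aed:.2f} mg/kg/day")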

Suggestions (Bulleted items that are important but don't rise to the level of
recommendation)

The proposal is to use NAMs in an IATA-based approach for chemical assessments;
consequently, suggestions address NAMs at all levels.


-------
High priority/Low effort: EPA should review and suggest plausible methods for poorly
soluble or non-aerosolizable chemicals (e.g., microvolume dosing in DMSO using
applicable instrumentation and NAMs).

High priority/Low effort: TSCA requires that risk evaluations of new and existing
chemicals consider potentially exposed and susceptible subpopulations such as infants,
children, pregnant women, workers, and the elderly ("vulnerable subpopulations"). The
Committee suggests that the NCCRP explicitly describe how a suite of selected in vitro
NAMs considers (or does not) these vulnerable subpopulations and continuously work
toward better accounting for such subpopulations as is also in line with the agency's
increasing emphasis on equity, environmental justice, and cumulative impacts.

High priority/Medium effort: The application of biological test systems to obtain
endpoint-specific data should be conducted using standardized approaches that have been
optimized as part of the fit-for-purpose determination. While the culture and assay
methodology for more conventional in vitro test systems may have been well-defined,
this may not be the case for the newer, more complex systems such as ALI and
microphysiological systems (and their exposure and dosimetry methods). Where possible,
EPA could partner with various stakeholder organizations to facilitate method
development/standardization, which would allow additional expert input, other funding
sources and accelerated timelines.

High priority/Medium effort: Before proceeding to invest significantly more resources
into the DevTox GLR-Endo assay development and standardization research, ORD and
OPPT should work together to develop (and collect feedback from the regulatory science
community on) a detailed review paper (DRP) on the state of the science of
developmental toxicity NAMs, including comparisons of the predictive performance,
strengths and limitations of the DevTox GLR-Endo assay compared to, for example, the
Murine Embryonic Stem Cell Assay and the Zebrafish embryo developmental
toxicology assay. This DRP would be expected to provide scientific justification to
support investing in research in the most promising fit for purpose assays suited for
OPPT's new chemicals program decision context.

High priority/High effort: We suggest EPA articulate how considerations like route of
exposure (inhalation, dermal vs. oral), bioavailability, metabolism requirements (e.g.,
formation of active metabolites), etc. will affect NAM requirements. For example, if the
relevant route of exposure is dermal and the compound is poorly absorbed, are NAM data
requirements the same?

High priority/High effort: The Committee suggests that EPA expand the chemical
domain for cell painting (HTPP) to better represent the TSCA universe applying a
cheminformatics approach to ensure appropriate chemical diversity. It would be useful
for EPA to compare data generated from HTTr vs HTPP in terms of
identification/correlation of bioactivity profiles (cell phenotypic changes vs expression
changes) and bioactive PoD concentrations. As HTPP matures, EPA should develop a
detailed review paper (DRP) that includes endpoints examined, translation to in vivo
adverse effects, grouping endpoint data to identify positive responses, sensitivity of
HTPP vs. other broad-based NAM screening approaches, impact of cell type, availability
of orthogonal assays, and domain of chemicals tested. In addition, there is a new
HESI/Broad Institute Emerging Systems Toxicology for Assessment of Risk (eSTAR)
project to use cell painting and transcriptomics to evaluate liver toxicity. EPA could join
this group to provide their experience and gather stakeholder input on the use of these
technologies.

High priority/High effort: The EPA should consider newer approaches to assess
genotoxicity to ensure that the selected methodologies are the most appropriate. While
conventional assays may have provided substantial guidance on the evaluation for the
genotoxic potential of materials, these methods are often cumbersome and time
consuming. How are newer (and potentially more predictive) assays being incorporated
into the overall testing scheme to replace the older assays?

Medium priority/Medium effort: Consider including human precision-cut lung slice
cultures (where airway contractility may be evaluated as a phenotypic endpoint), along
with the other identified complex, heterocellular 3D experimental models that offer high
content phenotypic responses. Recent advances in preservation and increased throughput
have made these more accessible and allow for larger scale and repeat donor-based
studies.

Medium priority/High effort: The Committee suggests EPA continue to evolve the
BioTransformer (OECD Toolbox) program to address the likelihood of metabolite
formation. BioTransformer (OECD Toolbox), which is used to predict liver-generated
metabolites, can predict metabolites that have not been observed in guideline studies. For
example, HTTK and HTTP data generation on metabolism and comparison with other in
silico tools to predict metabolism will be valuable to better understand the relevance of
these predictions.

Medium priority/High effort: If selected, DevTox GLR assay endpoints (biomarkers for
differentiation of the endoderm, mesoderm and ectoderm germ layers) represent a very
limited portion of development. The Kapraun/Wambaugh HTTK computational model
for pregnancy (Kapraun et al., 2022) simulates gestational week 13 until parturition,
whereas the gastrulation step measured in the DevTox GLR assay occurs at weeks 3-4 of
pregnancy. The Committee suggests that the NCCRP include developmental toxicity
assays that span longer, equally relevant periods of gestation. EPA may consider NAMs
such as ReproTracker® and devTOXquickPredict™ in this effort. Furthermore, the
committee recommends that each of these applicable gestational stages be incorporated
into HTTK models to allow IVIVE.

Low priority/Medium effort: As the technology regarding all of the more complex
biological test systems (e.g., 3D reconstructed tissues, organ on a chip, precision-cut lung
slices) is rapidly evolving, the review of these test systems should be ongoing by experts
in the field, and EPA should routinely assess and, where necessary, modify NAMs in the
NCCRP initiative accordingly.

Medium priority/Low effort: As a longer-term suggestion, the Committee suggests that the
NCCRP consider how and when higher-order NAMs (e.g., zebrafish embryos, planaria, C.
elegans) would support effective assessment of integrated endpoints including
neurobehavior.


Recommendations (Priority action identified by the panel that is actionable by the ORD
program)

The panel offers the following recommendations:

•	The Committee recommends that EPA's NCCRP institute dedicated reviews (perhaps
aligned with the StRAP cycle) of the program to assess progress, opportunities, and
challenges with implementation, including an opportunity for stakeholders and the public
to provide input and feedback. This will be especially valuable for further refinement and
application of more innovative NAMs like HTPP.

•	The Committee recommends that the agency optimize and standardize NAM
development using Good In Vitro Method Practices (GIVIMP) which would aid in their
acceptance and transferability.

•	The Committee recommends that research aimed at defining a suite of in vitro NAMs to
inform new chemical reviews account for potentially exposed or susceptible
subpopulations specifically as it relates to relevant, differential biological considerations
across the population (e.g., variance in toxicokinetics, disease states, age).

Charge Question 5

Question 5

In the Background of the accompanying White Paper (pages 5-16), information on challenges in
new chemical assessment, and the vision statement for the NCCRP, are presented. The primary
vision of the NCCRP is to modernize the process for evaluating new chemicals under TSCA by
supporting the evolution of OPPT's use of new and existing methods, approaches, and tools
using innovative science. Please comment on the extent to which the Research Areas may
address the issues identified in the Background and vision statement. Please also include
potential additional research areas for EPA to consider.

Narrative (Provides background and context for strengths, suggestions, and
recommendations)

EPA is required to make timely evaluations of data-poor chemicals that are newly entering the
market, using the best available science and methods, in a way that is as transparent as possible.
To address these challenges, EPA has developed a collaborative research program between
OPPT and ORD. This collaborative research program proposes four research areas: chemical
categories and read-across, database development and growth, predictive models for hazard,
kinetics, and exposure, and in vitro NAMs. A fifth research area focuses on decision support tools
and is minimally included in this review because the methods are still under development.

This charge question asks the BOSC to comment broadly on the extent to which the challenges
EPA identifies in the Background and vision statement are addressed by the research proposal.
These challenges include:

• High volume of submissions (average of 500 new chemical submissions per year)


•	Need for rapid decision making (general requirement for EPA to make a determination in
90 days)

•	Lack of information available on chemicals including human and environmental hazard
data as well as use and exposure data

•	Requirement to make (and justify) a formal decision for all new chemicals

•	Substantial informatic needs for making and documenting decisions

•	Promoting transparency when possible while maintaining CBI protections for a large
percentage of the new chemicals.

The research proposal leverages efforts in ORD to help fill data gaps and manage information
and builds on years of extensive work to develop predictive toxicology and exposure science
tools. This collaborative approach has many strengths that meet the challenges discussed above.
Updating, expanding, and developing new chemical categories and furthering the development
and refinement of QSAR and predictive models will help fill data gaps. Using predictive models
when other data are not available will help EPA make science-based decisions efficiently, which
is critical to addressing the time, information, and resource limitations described above. The BOSC is
pleased to see that the long-term investment within ORD in predictive tools is finding application
within the agency.

The NAS Tox21 report and prior BOSC reviews have recommended efforts to incorporate
human and epidemiologic data into the development and refinement of predictive toxicology
tools. Consistent with these recommendations, the BOSC is pleased to note the ORD case study with PFAS
and electronic health records. Additional work on incorporating clinical data, occupational health
data, and molecular epidemiologic data into some of the decision-support tools, as feasible,
would increase confidence in the tools and more clearly link AOP networks with potential
human relevance. The BOSC recognizes that this is a long-term objective, and encourages
further work toward this goal.

The BOSC is concerned that the significant percentage of new chemicals with CBI claims may
pose a challenge for evaluating the effectiveness of the tools that ORD is developing. If the CBI
chemicals differ systematically from the non-CBI chemicals, then the plan to validate the tools
using only non-CBI information could result in failures to recognize issues with the tools. For
example, a large percentage of CBI claims at the state level are for polymers, so ORD should be
sure to test the tools on a wide array of polymer structures. For both polymers and UVCBs, it is
also critical to model how they change over time and to ensure that the entire mixture is evaluated.
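
One straightforward way to test whether CBI and non-CBI chemicals differ systematically is to compare the distributions of key properties or descriptors between the two groups. The Python sketch below is a minimal illustration, assuming NumPy and SciPy are available; the molecular-weight values are simulated placeholders rather than actual submission data.

```python
# Illustrative sketch only: check whether a property (here, molecular weight) is
# distributed differently among CBI vs. non-CBI chemicals, which would signal that
# validating tools on non-CBI chemicals alone may not generalize.
import numpy as np
from scipy import stats  # assumes SciPy is available

rng = np.random.default_rng(0)

# Hypothetical placeholder data; in practice these would be descriptor values
# computed for the CBI and non-CBI portions of actual submissions.
non_cbi_mw = rng.normal(loc=300, scale=80, size=500)   # e.g., discrete organics
cbi_mw = rng.normal(loc=2000, scale=600, size=500)     # e.g., a polymer-heavy CBI set

stat, p_value = stats.ks_2samp(non_cbi_mw, cbi_mw)
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.2e}")
if p_value < 0.05:
    print("Distributions differ: validation on non-CBI chemicals alone may not generalize.")
```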

Strengths (Bulleted list of program strengths)

•	The proposed research program is well-tailored to rapidly evaluating chemicals that have
little or no data.

•	The program leverages resources and skills that the ORD team has already developed, and it
prioritizes and operationalizes cross-agency connections and collaboration.

•	The modeling of potential use and exposure is an important component of this effort.

•	Another strength is that the proposal envisions generating data, in particular that:

o the approaches will be assessed with about 200-300 chemicals
o data will be generated to inform IVIVE and kinetics modeling, and
o the effects of chemicals that may be inhaled will be explored.


•	The vision to put multiple data streams together into a unified, usable decision support
tool is ambitious and clearly needed.

Suggestions (Bulleted items that are important but don't rise to the level of
recommendation)

•	Consider including in this proposal ground-truthing of the ORD exposure models against
existing biomonitoring datasets (e.g., from CDC, NIH), including, where feasible,
biomonitoring using non-targeted assessment. This could be similar to the multimedia
monitoring database for environmental chemical data. A minimal illustrative sketch of
such a model-to-biomonitoring comparison follows this list of suggestions.

•	Having analytical methods for environmental monitoring available early for newly
introduced chemicals is also important for continued evaluation of exposure once a
chemical is on the market.

•	ORD should consider longer-term goals of developing tools to predict how the toxicity of
UVCBs changes throughout their lifecycle, from manufacture through disposal, due to
shifts in the composition of the mixtures.

•	Longer-term work should include expanding exposure models to predict unintended
exposures from activities such as recycling consumer products into new products, potable
water reuse, and composting, when feasible.
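
As noted in the first suggestion above, the minimal Python sketch below illustrates one way such ground-truthing could be summarized, by computing fold differences between model-predicted exposures and exposures inferred from biomonitoring. It is illustrative only: the numerical values are hypothetical placeholders, not ORD model output or actual biomonitoring results.

```python
# Illustrative sketch only: compare model-predicted exposure rates with exposure
# rates inferred from biomonitoring data, summarizing agreement as fold differences.
import numpy as np

# Hypothetical placeholder values (mg/kg-bw/day) for a handful of chemicals;
# real inputs would be exposure model output and biomonitoring-derived intake
# estimates (e.g., inferred from NHANES urinary concentrations).
predicted = np.array([1e-4, 5e-6, 2e-3, 8e-5])
inferred = np.array([3e-4, 1e-6, 1e-3, 9e-5])

fold_diff = np.maximum(predicted / inferred, inferred / predicted)
print("Fold differences:", np.round(fold_diff, 1))
print(f"Median fold difference: {np.median(fold_diff):.1f}")
print(f"Fraction within 10-fold: {np.mean(fold_diff <= 10):.2f}")
```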

Recommendations (Priority action identified by the panel that is actionable by the ORD
program)

The panel offers the following recommendations:

•	The Committee recommends EPA consider ways to integrate human data into databases
and tools when possible, including clinical, occupational, and other epidemiological study
data, especially in the context of AOP networks. This information could provide a link
between mechanistic results and human outcomes, or help to benchmark the tools EPA is
developing.

•	The Committee recommends EPA assemble (or develop) the training or reference
chemical sets used for developing and evaluating methods and models such that they
mirror the characteristics of the CBI and non-CBI chemicals that OPPT typically receives,
including polymers and UVCBs (one way to do this is illustrated in the sketch below). EPA
should also try to identify and address relevant impurities and byproducts, including
residual monomers and oligomers. This will ensure that the methods are applicable to the
chemistries that OPPT typically addresses.
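
One way to act on this recommendation, as referenced above, is to construct reference sets by stratified sampling so that their class composition mirrors the mix of submissions. The Python sketch below is illustrative only and assumes pandas is available; the candidate pool, class labels, and target proportions are hypothetical placeholders rather than OPPT statistics.

```python
# Illustrative sketch only: draw a reference chemical set whose class composition
# mirrors the composition of typical submissions (e.g., discrete organics, polymers, UVCBs).
import pandas as pd

# Hypothetical pool of candidate reference chemicals with an assigned class.
pool = pd.DataFrame({
    "chem_id": range(1000),
    "chem_class": (["discrete organic"] * 500 + ["polymer"] * 350 + ["UVCB"] * 150),
})

# Hypothetical target proportions, standing in for the observed mix of submissions.
target = {"discrete organic": 0.45, "polymer": 0.40, "UVCB": 0.15}
set_size = 200

frames = []
for chem_class, frac in target.items():
    n = round(frac * set_size)
    frames.append(pool[pool["chem_class"] == chem_class].sample(n, random_state=0))

reference_set = pd.concat(frames, ignore_index=True)
print(reference_set["chem_class"].value_counts(normalize=True))
```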

